Hacker News

What is your suggested solution?

Your options are:

1. Leave up child porn, revenge porn, hate speech, gore, fake and misleading content, etc. for a long time, possibly forever, since a machine can no longer automatically take down mass-reported content

OR

2. Hire so many human moderators that Facebook goes bankrupt

OR

3. Use untrained and unpaid/underpaid human moderators, such as strangers (the Reddit model); go back to square one, where volunteer moderators take down stuff they don't like with minimal oversight

OR

4. Have so few users that a small number of human moderators can actually review every report, i.e. the Hacker News model. Actually, no: Hacker News "dead"s posts based purely on the number of reports, without a human seeing them, so I take that back. I guess this is the approach used by forums, most smaller blog comment sections, and other quite small websites.

Do you have another solution that isn't one of those? Do any of those solutions sound good for Facebook? Better than what they have now?



Here's a possible system: replace "random people" with "people with a solid track record of previous correct reports". When something gets reported, human moderators categorize it as either "correctly reported, and should be taken down", "incorrectly reported, but possibly a misunderstanding or a borderline case", or "a clearly bad-faith report against content that no reasonable person would actually believe breaks the rules". Keep track of how many reports from each user end up in each category.

If almost all of your reports are in the first category, then you're considered trusted, and if enough trusted users report a post, then it can be automatically removed before a human moderator sees it. If you haven't reported anything before, or too many of your reports are in the second category, then your report only helps to get the submission in front of a human moderator and doesn't directly contribute to it being removed. If more than a handful of your reports ever end up in the third category, you get banned for abusing the report system.
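The scheme described above can be sketched in a few lines. This is a minimal illustration, not anyone's actual implementation; all the names and thresholds (`TRUST_MIN_REPORTS`, `TRUST_MIN_VALID_RATIO`, `BAN_BAD_FAITH_LIMIT`, the quorum of trusted reporters) are hypothetical values chosen for the example.

```python
from collections import Counter
from enum import Enum

class Verdict(Enum):
    """Moderator's categorization of a report."""
    VALID = "valid"            # correctly reported; content taken down
    BORDERLINE = "borderline"  # misunderstanding or borderline case
    BAD_FAITH = "bad_faith"    # clearly abusive report

class ReporterTrust:
    """Tracks each user's report history and derives a trust status.
    All thresholds below are hypothetical, for illustration only."""

    TRUST_MIN_REPORTS = 10       # need a track record before being trusted
    TRUST_MIN_VALID_RATIO = 0.9  # "almost all" reports must be valid
    BAN_BAD_FAITH_LIMIT = 3      # "more than a handful" of bad-faith reports

    def __init__(self):
        self.history = {}  # user_id -> Counter of Verdict outcomes

    def record(self, user_id, verdict):
        """Record a moderator's verdict on one of this user's past reports."""
        self.history.setdefault(user_id, Counter())[verdict] += 1

    def is_banned(self, user_id):
        counts = self.history.get(user_id, Counter())
        return counts[Verdict.BAD_FAITH] >= self.BAN_BAD_FAITH_LIMIT

    def is_trusted(self, user_id):
        counts = self.history.get(user_id, Counter())
        total = sum(counts.values())
        if total < self.TRUST_MIN_REPORTS or self.is_banned(user_id):
            return False
        return counts[Verdict.VALID] / total >= self.TRUST_MIN_VALID_RATIO

def should_auto_remove(trust, reporter_ids, quorum=3):
    """Auto-remove only when enough *trusted* users report a post;
    reports from new or unreliable users merely queue it for review."""
    trusted = sum(1 for uid in reporter_ids if trust.is_trusted(uid))
    return trusted >= quorum
```

With this split, a report from an unknown user never removes anything by itself; it only feeds the human review queue, while a quorum of reporters with strong track records can trigger removal before a moderator looks.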


Is that not already the case? Do you think Facebook doesn't already weight reports by the historical quality of the reporter?

That still seems like a violation of "No content should ever be taken down automatically just because a bunch of random people report it", as you wrote above.




