> The issue is whether it's a good idea to design computers to allow certain actions in the first place. Since fundamentally we can't design computers to detect 'bad' search queries

Where does the design of computers come into this? If you host a forum and someone posts something you don't like, you have the option to delete that content.

Google hosts millions of online videos and has decided to delete child pornography when it finds it (whether via an automatic filter or a system of moderation is irrelevant).

It has also decided not to delete ISIS videos when it discovers them. That decision is not a technical one but a social one. It is worth talking about, even if, as the article says:

> Certainly the fact that there are 3000 ISIS videos on YouTube and 10,000 ISIS accounts on Twitter should give you pause. Clearly this is a tricky area, and I don’t believe this is necessarily a matter for government regulation. I do, however, think that Google might alter its “Don’t be evil” motto to “Don’t enable evil.”
