
My big issue is what it opens up. As the EFF points out, it's really not a big leap for oppressive governments to ask Apple to use the same tech (as demoed by using MS's tech to scan for "terrorist" content) to remove content they don't like from their citizens' devices.


That's my concern: what happens the first time a government insists that they flag a political dissident or symbol? The entire system is opaque by necessity for its original purpose, but that opacity seems to make it easy to do things like serve custom fingerprints to particular users without anyone being any the wiser.


My heart goes out to the queer community of Russia, whose government will pounce on this technology in a heartbeat and force Apple to scan for queer content.


They’d have many other countries keeping them company, too.

One big worry: how many places would care about false positives if flagging gave them a pretext to arrest people? I do not want to see what would have happened if this infrastructure had been available to the Bush administration after 9/11 and all of the usual ML failure modes played out in an environment where everyone was primed to assume the worst.


First, the standard disclaimer on this topic: there were multiple independent technologies announced. I assume you are referring to content hash comparisons on photos uploaded to Apple's photo service, which they are doing on-device rather than in-cloud.

How is this situation different from an oppressive government "asking" (a weird way we now describe compliance with laws/regulations) for this sort of scanning in the future?

Apple's legal liability and social concerns would remain the same. So would the concerns of people under the regime. Presumably the same level of notification and ability of people to fight this new regulation would also be present in both cases.

Also, how is this feature worse than other providers that already do this sort of scanning on the other side of the client/server divide? Presumably Apple does it this way so that the photos remain encrypted on the server, and release of data encryption keys is a controlled/auditable event.
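For anyone unfamiliar with how this class of on-device fingerprint matching works, here is a toy sketch. This is a deliberate simplification and an assumption on my part: Apple's actual system uses a neural perceptual hash (NeuralHash) plus a private set intersection protocol, not the simple average hash shown here. The names (`average_hash`, `matches_blocklist`) and the threshold value are all hypothetical.

```python
def average_hash(pixels):
    """Compute a 64-bit perceptual hash from an 8x8 grayscale grid
    (values 0-255): each bit is 1 if that pixel is brighter than the
    mean. Similar images produce hashes with small Hamming distance."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(image_hash, blocklist, threshold=4):
    """Flag the image if its hash is within `threshold` bits of any
    fingerprint in the on-device database. Crucially, the client only
    sees opaque hashes -- it cannot tell what content they target."""
    return any(hamming(image_hash, h) <= threshold for h in blocklist)
```

That last point is exactly the concern upthread: because the fingerprint database is opaque by design, the device owner has no way to audit what it actually matches.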

You would think the EFF would understand that you can't use technical measures to either fully enforce or successfully defeat regulatory measures.



