
They could have done all that without telling you. And as long as the traffic was combined with normal traffic, no one would ever notice (and in this case it would end up mixed with normal traffic, since it only applies to images being uploaded to iCloud, so communication with Apple's servers would be expected).

What it looks like to me is that Apple is planning on releasing end-to-end encryption for iCloud. But they know that whenever E2EE comes up, people get mad that terrorists, child molesters, and mass shooters can hide their data and communications. Hell, they've been painted as the villain when they say they can't unlock iPhones for the FBI. This heads off those concerns for the most common of those crimes.



Sticking to the apt analogy from the article:

> To reiterate: scanning your device is not a privacy risk, but copying files from your device without any notice is definitely a privacy issue.

> Think of it this way: Your landlord owns your property, but in the United States, he cannot enter any time he wants. In order to enter, the landlord must have permission, give prior notice, or have cause. Any other reason is trespassing. Moreover, if the landlord takes anything, then it's theft. Apple's license agreement says that they own the operating system, but that doesn't give them permission to search whenever they want or to take content.

This viewpoint is like thanking your landlord for warning you that they are going to enter your home and root through your private items, all in the name of some greater good. Let's not spin it as if the landlord is doing us a favor in this scenario.


That first line of the quote misrepresents what Apple is doing. They are not copying files from your device; you are sending them the files. As it stands, the only images that will be scanned are the ones you are uploading to iCloud. And I'd be shocked if they weren't already analyzing those images on the server side.

When it comes to governments being able to pressure them into being more invasive, nothing has changed with this update. If a government wanted to poison the CSAM database, they could have already. You'd end up reported when the server does the scanning. If the government wanted to expand scanning to include things that you're not uploading, they already could have asked Apple to do that. It would have been possible to silently add a much simpler scanning mechanism or data exfiltration into an update.
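The claim that database poisoning works equally well on the server or on the device can be made concrete with a toy sketch. This is a deliberately simplified, hypothetical illustration using exact SHA-256 fingerprints and a plain set lookup; Apple's actual system uses NeuralHash perceptual hashes (which also match near-duplicate images) and a private set intersection protocol, neither of which is modeled here.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Exact cryptographic fingerprint of an image; a stand-in for a
    perceptual hash, which would also match visually similar images."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_uploads(images, known_bad_hashes):
    """Return indices of uploads whose fingerprint is in the database.

    Note that whoever controls known_bad_hashes controls what gets
    flagged -- the scanning code never sees why a hash is listed.
    That is why poisoning the database is the same risk whether this
    loop runs on the device or on the server.
    """
    return [i for i, img in enumerate(images)
            if fingerprint(img) in known_bad_hashes]

uploads = [b"vacation photo", b"flagged content", b"cat picture"]
database = {fingerprint(b"flagged content")}
print(scan_uploads(uploads, database))  # -> [1]
```

The point of the sketch is that the matching logic is opaque to the user either way: moving it on-device changes where the loop runs, not who decides what goes into the database.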

This isn't a spin to say anyone is doing us a favor. iCloud should be end-to-end encrypted and there shouldn't be any scanning at all. But why should that opinion on how we should treat privacy be taken as the only valid one? The people who do want the scanning are not asking for it because they are stupid or uninformed; they simply weigh the competing values differently.


> What it looks like to me is that Apple is planning on releasing end-to-end encryption for iCloud

This is Gruber's optimistic take on it as well. If so, why not make both changes at once? Given that they've walked back E2EE on iCloud before, I'm not holding my breath.


> They could have done all that without telling you.

But in that case it would much more likely be a crime, and it would certainly cost them a tremendous amount of goodwill.

Your personal computing device is a trusted agent. You cannot use the internet without it, and especially in lockdown you likely can't realistically live your life without use of the internet. You share with it your most private information, more so even than you do with your other trusted agents like your doctor or lawyer (whom you likely communicate with using the device). Its operation is opaque to you: you're simply forced to trust it. As such, your device ethically owes you a duty to act in your best interest, to the greatest extent allowed by the law -- not unlike your lawyer's obligation to act in your interest.

Apple is reprogramming customer devices, against the will of many users (presumably at the cost of receiving necessary fixes and security updates if you decline) to make it betray that trust and compromise the confidentiality of the device's user/owner.

The fact that Apple is doing it openly makes it worse, in the sense that it undermines your legal recourse for the betrayal. The only recourse people have is the one you see them exercising in this thread: complaining about it in public and encouraging people to abandon Apple products.

E2EE should have been standard a decade ago, certainly since the Snowden revelations. No doubt Apple seeks to gain a commercial advantage by simultaneously improving their service while providing some pretextual dismissal of child abuse concerns. But this gain comes at the cost of deploying and normalizing an automated surveillance infrastructure, one which undermines their product's ethical duty to their customers, and one that could be undetectably retasked to enable genocide by being switched to match on images associated with various religions, ethnicities, or political ideologies.


Eh, I think it is simply the fact that Apple doesn’t want to be associated with individuals violating their terms of service in an unlawful way.

This is a way to root them out and report them to law enforcement.


If this is a prelude to E2E encryption for iCloud they are going to be under TREMENDOUS pressure from law enforcement to expand the list of bad material way beyond just CSAM.


Not if it's only E2EE for photos


They might be able to do it without anyone ever knowing, but at a business level, can't they do it better with everyone knowing and child safety as the excuse?



