When you choose to upload your images to iCloud (which currently happens without end-to-end encryption), your phone generates some form of encrypted ticket. In the future, the images will be encrypted, with a backdoor key encoded in the tickets.
If Apple receives enough images that were considered a match, the tickets become decryptable (I think I saw Shamir's Secret Sharing mentioned for this step). Right now Apple doesn't need that step because they hold the unencrypted images; in a future scheme, decrypting these tickets would allow them to decrypt your images.
(I've simplified a bit; I believe there's a second layer that they claim gives them access only to the offending images. I have not studied their approach deeply.)
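The threshold step described above can be sketched with Shamir's t-of-n secret sharing: each matching voucher carries one share of a key, and the key becomes recoverable only once t shares accumulate. This is a toy sketch over a small prime field, not Apple's actual construction; all names and parameters are illustrative.

```python
import secrets

P = 2**127 - 1  # toy prime field; real deployments use larger parameters

def split(secret: int, t: int, n: int):
    """Split `secret` into n shares; any t of them reconstruct it."""
    # Random polynomial of degree t-1 with constant term = secret.
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):   # Horner evaluation
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = secrets.randbelow(P)
shares = split(key, t=3, n=10)          # one share per matching voucher
assert reconstruct(shares[:3]) == key   # threshold reached: key recovered
assert reconstruct(shares[:2]) != key   # below threshold: wrong value (almost surely)
```

Fewer than t shares reveal no information about the key, which is what lets the server hold all the vouchers without being able to open any of them until the match count crosses the threshold.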
These are not “claims.” The process by which they get access to only the safety vouchers for images matching CSAM is private set intersection and comes with a cryptographic proof.
In no step of the proposal does Apple access the images you store in iCloud. All access is through the associated data in the safety voucher. This design allows Apple to switch iCloud storage to end-to-end encrypted with no protocol changes.
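A minimal way to see what private set intersection buys here is a toy Diffie–Hellman-style PSI: both sides blind their items with secret exponents, and only doubly-blinded values are compared, so neither side sees the other's raw set. This is a sketch under simplified assumptions (toy prime, client-learns-matches variant); Apple's actual protocol is different and, in particular, is designed so the server learns matches only past a threshold.

```python
import hashlib
import math
import secrets

P = 2**127 - 1          # toy Mersenne prime; NOT a secure parameter choice
ORDER = (P - 1) // 2    # order of the quadratic-residue subgroup

def to_group(item: bytes) -> int:
    """Hash an item into the subgroup (hash, then square)."""
    h = int.from_bytes(hashlib.sha256(item).digest(), "big") % P
    return pow(h, 2, P)

def keygen() -> int:
    # Ensure the exponent is invertible so blinding is a bijection.
    while True:
        k = secrets.randbelow(ORDER - 2) + 2
        if math.gcd(k, ORDER) == 1:
            return k

# Illustrative sets: server holds a hash database, client holds photo hashes.
server_set = [b"csam-1", b"csam-2", b"csam-3"]
client_set = [b"cat.jpg", b"csam-2", b"dog.jpg"]

a, b = keygen(), keygen()   # client's and server's secret exponents

# Client blinds its items and sends them over.
client_masked = [pow(to_group(x), a, P) for x in client_set]
# Server re-blinds the client's items and blinds its own set.
double_masked = [pow(y, b, P) for y in client_masked]   # H(x)^(2ab)
server_masked = {pow(to_group(x), b, P) for x in server_set}
# Client applies its exponent to the server's blinded set and intersects.
server_ab = {pow(v, a, P) for v in server_masked}       # H(y)^(2ba)
matches = [x for x, m in zip(client_set, double_masked) if m in server_ab]
assert matches == [b"csam-2"]
```

Neither side ever sees the other's unblinded hashes; only equality of doubly-blinded values is revealed, which is the property the safety-voucher design relies on.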
The private set intersection is part of the protocol to shield Apple (and their database providers) from accountability, not to protect the users' privacy.
They could instead send the list of hashes to the device (which they already must trust is faithfully computing the local hash) and just let the device report when there are hits. It would be much more CPU and bandwidth efficient, too.
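The simpler alternative this comment describes is just an on-device membership check against a shipped hash list. A minimal sketch, with `sha256` standing in for a perceptual hash like NeuralHash and all names illustrative:

```python
import hashlib

# Hash list shipped to the device (illustrative contents).
known_hashes = {hashlib.sha256(b"known-image").hexdigest()}

def device_scan(photos: list[bytes]) -> list[int]:
    """Return indices of photos whose hash appears on the shipped list."""
    return [i for i, p in enumerate(photos)
            if hashlib.sha256(p).hexdigest() in known_hashes]

assert device_scan([b"holiday.jpg", b"known-image"]) == [1]
```

The trade-off the thread is arguing over: this version is cheap and lets the client (in principle) inspect the list, whereas the PSI version keeps the list confidential from the device.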
The PSI serves the purpose that, if Apple starts sending out hashes for popular lawful images connected to particular religions, ethnicities, or political ideologies, it is information-theoretically impossible for anyone to detect the abuse. It also makes it impossible to tell whether different users are being tested against different lists, e.g. whether Thai users are being tested against political cartoons that insult the king.
The list of hashes is confidential. Good luck getting NCMEC to sign off on an implementation which lets clients infer which photos are matching their database.
The database is embedded into iOS. There are at least three primary sources which say that users will not receive different databases, and this should be easy to confirm.
I am well aware, but that is exactly the point. If Apple can't provide an accountable implementation, they should not implement this at all. Accountability should be table stakes that all users demand, at a minimum.
Otherwise there is no way to detect if the system is abused to target lawful activities.
The fancy crypto in the system isn't there to protect the user, it's to guard the system's implementer(s) against accountability. It protects Apple's privacy, not yours.
What good is end-to-end encryption if the OS is prebuilt with a method of breaking that encryption? This is definitional backdooring, and you're back to trusting Apple's goodwill (and/or willingness to resist governments) to keep your data safe (i.e., not to add new decryptable triggers).
Not having backdoors is a hard requirement for end-to-end encryption to offer any privacy guarantees.
This is taking the discussion into the realm of the hypothetical. If we end up in a world where there are reliable public cloud providers that offer end-to-end encryption with no content scanning whatsoever, I'll be glad to give them my money.