Unfortunately, we're in a bit of an echo chamber here. The average person is either unaware of these types of issues, doesn't care, or can be easily swayed with "think of the children"-type arguments.
There's going to be absolutely no oversight or transparency into occasions when an image is removed from a device. Nobody will ever know when a non-CSAM image is mistakenly pulled back from a device. All the public will ever see are headlines to the tune of "CSAM scanning system catches bad person once again".
This is a really awful path that we're going down, and this is absolutely going to get abused by regimes around the world.