A child safety group pushed Apple on why the announced CSAM detection feature was abandoned, and the company has given its most detailed response yet as to why it backed off its plans.
I think Apple’s ultimate decision on this is the correct one. The world is an ugly place and there’s no silver bullet that solves a problem like CSAM and ensures it can’t be abused.
I wish it weren’t so, as this likely would have made a huge impact against child abusers, but thankfully degrading every Apple user’s privacy isn’t the only effective way to fight it.
Exactly. As horrific as CSAM is, adding the ability for on-device scanning is exactly what untrustworthy governments would sell their souls for.
The potential to scan everyone’s devices for any content a government deems problematic could shift the balance of power in the world permanently. You can see why they want it so much.
Not just governments either.