
My problem with all of this is not some abstract slippery slope of "what happens if China starts insisting that Tank Man be added to the hash list" or whatever.

It's the practical scenarios where people in the United States have previously been indicted and had their lives ruined over "child pornography" or "statutory rape" charges that stretch the definition to the breaking point. Young men just barely 18 years old have gone to jail for taking pictures of their technically underage girlfriends. Often their real crime is that they upset someone rich and powerful, or that they were guilty of being black and dating a blonde white girl. I wish I were being facetious, but this happens all the time.

More importantly, in this digital age, many stupid teenagers upload nude pictures of their underage girlfriends to the Internet. Sometimes it's revenge for the girl breaking up with them, sometimes they "hacked" a girl's account and downloaded her selfies, and sometimes they legitimately have the photos and just wanted those sweet internet karma points.

Imageboards like 4chan are full of what is technically CP, even though nobody was directly harmed in its creation. Other stupid teenagers download these photos, as do older paedophiles who trawl these websites looking for CP. Some of them will share the images with other paedophiles, internationally even.

So what happens when one of these paedophiles gets arrested for a serious child sex crime? Their PCs and phones will be searched, and the pictures will end up in the CP hash database, of course. Even with this "multiple nations" requirement, it only takes two arrests in two countries for the hash to be added. Nothing really changes; there's just a delay.
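To make the mechanics concrete: that "multiple nations" threshold is presumably just a counting rule over independent databases, not a review of how an image got into each one. Here's a minimal sketch of that logic in Python, assuming the system only acts on hashes present in the lists of two or more jurisdictions. The hash values, jurisdiction names, and threshold are all made up for illustration; none of this is Apple's actual design.

    from collections import Counter

    # Hypothetical sketch of the "multiple nations" rule: a hash becomes
    # actionable once it appears in the databases of N independent
    # jurisdictions. All values below are invented for illustration.
    databases = {
        "US": {"a1b2c3", "d4e5f6"},   # hashes sourced from one jurisdiction
        "UK": {"d4e5f6", "778899"},   # a second jurisdiction's list
    }

    def actionable_hashes(dbs, threshold=2):
        """Return hashes present in at least `threshold` separate databases."""
        counts = Counter(h for hashes in dbs.values() for h in hashes)
        return {h for h, n in counts.items() if n >= threshold}

    print(actionable_hashes(databases))
    # {'d4e5f6'} -- two arrests in two countries, and the same image
    # is now flagged on every scanned device

Once two agencies independently ingest the same photo, every copy of it everywhere becomes reportable.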

Now what will happen to the stupid teenagers who downloaded the same photo? Teenagers who aren't child molesters! Teenagers who aren't paedophiles!

Apple will probably say: No worries, we filter out the phones belonging to underage children from the results.

... until they turn 18.

Get it? This is a legal landmine in the pocket of every stupid kid. YOUR stupid kid, who looks at porn on the internet just like every other stupid kid.

This is just one scenario in which this system could produce a false positive that totally and irrevocably destroys someone's life. I can think of several more, and dozens once you start adding political pressure from various other nations.

There is just no way to make this kind of dragnet surveillance safe. Even a tiny false positive rate is unacceptable when there are hundreds of millions of people playing this dystopian lottery.
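To put a rough number on that lottery: the expected count of innocent people flagged is simply the number of accounts times the per-account false-positive probability. Both inputs below are illustrative assumptions, not Apple's published figures.

    # Back-of-the-envelope: expected innocent accounts flagged per year.
    # Both numbers are assumptions, chosen only to show the scale involved.
    accounts = 1_000_000_000     # rough order of magnitude of cloud accounts
    p_false_flag = 1e-6          # assumed yearly false-positive rate per account

    print(f"{accounts * p_false_flag:,.0f} innocent people flagged per year")
    # -> 1,000 innocent people flagged per year

Even at one-in-a-million odds, the sheer size of the pool guarantees a steady stream of ruined lives.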



This is not a risk unique to Apple’s program to scan for known CSAM. Every cloud provider is susceptible.

Furthermore, I am fairly sure NCMEC’s due diligence process involves confirming the depicted individual was abused.


NCMEC and similar groups call all child pornography CSAM.[1] That includes 17-year-olds sexting each other. And several people have said NCMEC's database includes legal images.[2][3] What made you fairly sure NCMEC's process was more rigorous?

[1] https://www.missingkids.org/theissues/csam

[2] https://www.hackerfactor.com/blog/index.php?/archives/929-On...

[3] https://news.ycombinator.com/item?id=28069844


This is a highly informative comment. Thanks!



