Hacker News

Hashes only detect known, unmodified content; they can't catch new material, let alone AI-generated material.


Wasn't the hashing mechanism Apple was proposing capable of dealing with some attempts to obfuscate the CSAM image with edits?


It's also super easy to change the video slightly to alter the hash.


Obviously, but the point of a useful hash function in this case is to make it invariant under the sorts of transformations that are trivial to a human, while avoiding collisions between genuinely different content. Microsoft created something called PhotoDNA that tries to do this.
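The idea can be sketched with a toy "difference hash" (dHash): instead of hashing raw bytes, you hash coarse brightness gradients, so small edits flip few bits and matching uses Hamming distance rather than exact equality. This is an illustrative stand-in, not PhotoDNA's actual algorithm (which is proprietary); the 9x8 grid and the synthetic images are my own choices.

```python
import hashlib

def downscale(img, w, h):
    """Block-average a grayscale image (list of rows of 0-255 ints) to w x h."""
    H, W = len(img), len(img[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            ys = range(y * H // h, (y + 1) * H // h)
            xs = range(x * W // w, (x + 1) * W // w)
            vals = [img[j][i] for j in ys for i in xs]
            row.append(sum(vals) / len(vals))
        out.append(row)
    return out

def dhash(img):
    """64-bit hash: one bit per 9x8 grid cell, set if it's brighter than its right neighbor."""
    g = downscale(img, 9, 8)
    bits = 0
    for y in range(8):
        for x in range(8):
            bits = (bits << 1) | (1 if g[y][x] > g[y][x + 1] else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A synthetic 64x64 gradient image and a uniformly brightened copy of it.
orig = [[(x + y) % 256 for x in range(64)] for y in range(64)]
edited = [[min(255, p + 10) for p in row] for row in orig]

# The edit preserves relative gradients, so the perceptual hashes are identical...
print(hamming(dhash(orig), dhash(edited)))  # 0 differing bits

# ...while a cryptographic hash of the pixels differs completely after any edit.
sha = lambda im: hashlib.sha256(bytes(p for row in im for p in row)).hexdigest()
print(sha(orig) == sha(edited))  # False
```

This also shows why the earlier objection only applies to exact hashing: changing the video slightly defeats a cryptographic hash, but a perceptual hash tolerates it, at the cost of a harder collision-resistance problem.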



