Submitted by goki7 t3_11dj1ey in technology
HanaBothWays t1_ja916ts wrote
I suspected that this would basically work like the tools used to recognize and spike Child Sexual Abuse Material (CSAM) images and it actually is - it’s the same tools and the same database! This is basically expanding the eligibility criteria for what can go into the database.
Previously, if you sent your high school sweetheart a nude selfie and that person shared it without your consent, you didn't have many options. Now you can upload a hash of the picture (not the actual picture) to the database and it will get taken down.
Also, if you are a legal adult now but have nude photos of yourself from when you were a minor floating around, you can upload hashes to the database and have them taken down.
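For intuition, here's a minimal sketch of how hash-based matching like this can work. It uses the open-source imagehash library purely as a stand-in for the proprietary perceptual hashes (e.g. PhotoDNA/PDQ) these services actually use; the database contents, function names, and distance threshold are all illustrative assumptions, not the real system.

```python
from PIL import Image
import imagehash

# Hypothetical database of hashes submitted by users (hex strings).
# Only hashes are stored; the original images never leave the user's device.
SUBMITTED_HASHES = {
    "d1c4b2a0f0e1d2c3",  # placeholder value, not a real hash
}

def hash_image(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash locally from an image file."""
    return imagehash.phash(Image.open(path))

def matches_database(path: str, max_distance: int = 5) -> bool:
    """Flag an image if its hash is within a small Hamming distance
    of any submitted hash (perceptual hashes tolerate minor edits)."""
    candidate = hash_image(path)
    for stored in SUBMITTED_HASHES:
        if candidate - imagehash.hex_to_hash(stored) <= max_distance:
            return True
    return False
```

The point of the perceptual (rather than cryptographic) hash is that slightly altered copies - resized, recompressed, lightly cropped - still land within a small Hamming distance of the submitted hash and can be caught.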