pickles55 t1_jacidkh wrote
I really hope this tool doesn't get expanded for use against all sexually explicit images. Not to be that guy, but invasions of individual privacy and freedom of speech are typically deployed first against people most would have little sympathy for, like child abusers and violent criminals, before being used on everyone else. The first criminals Obama deported under his new rules were pedophiles, but it wasn't long before they were deporting grandmothers who had worked as receptionists for white-collar criminals. This tool in its present form seems great, but it is very powerful and could do a lot of damage in the wrong hands.
HanaBothWays t1_jactbkc wrote
This tool is an expansion of the existing tool used to detect and take down CSAM (Child Sexual Abuse Material). Dedicated adult content sites like OnlyFans and Pornhub also use that tool. They may adopt this expansion as well if it works out on the other platforms that are early adopters, since they don't want any content involving minors, or anything the subjects of the uploaded media did not consent to, on their sites (it's against their policies).
Expanding this to filter out any adult content whatsoever would be very difficult because it only works on “known” media, that is, media for which there is a hash already uploaded to the database. These tools can’t recognize “hey, that’s a naked child/teenager” or “hey, that’s a boob.” They can only recognize “that media matches a signature in my database.”
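To make the "known media only" limitation concrete, here's a minimal sketch in Python. For simplicity it uses an ordinary cryptographic hash; real systems in this space (e.g. PhotoDNA) use perceptual hashes so that resized or re-encoded copies still match. The hash list and function name here are hypothetical, purely for illustration:

```python
import hashlib

# Hypothetical database of signatures of known, already-reported media.
# In real deployments these are perceptual hashes supplied by bodies
# like NCMEC, not exact SHA-256 digests as shown here.
KNOWN_HASHES = {
    "9f2b1c0d...",  # placeholder entries, not real values
}

def is_known_media(file_bytes: bytes) -> bool:
    """Return True only if this file matches a known signature.

    Note what this can't do: it has no idea what the image depicts.
    Novel content, however explicit, produces an unknown hash and
    passes straight through.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

The point is that the matching step is a pure lookup: there's no classifier judging what's in the picture, so the tool can't be repurposed to block "all nudity" without a fundamentally different (and far more error-prone) detection approach.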