Pro@programming.dev to Technology@lemmy.world · English · 2 days ago
Google will use hashes to find and remove nonconsensual intimate imagery from Search (blog.google)
Cross-posted to: technology@beehaw.org
Lorem Ipsum dolor sit amet@lemmy.world · 2 days ago
Because hashes are known to work great with images 🤦‍♂️
gian@lemmy.grys.it · 2 days ago
They say they use PDQ for images, which should output a similar hash for similar images (though why MD5 for video?). So it is probably just a threshold problem. The algorithm is explained here: https://raw.githubusercontent.com/facebook/ThreatExchange/main/hashing/hashing.pdf. It is not a hash in the cryptographic sense.
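To make the "threshold problem" point concrete, here is a toy sketch of the idea behind perceptual matching. This is not PDQ itself (PDQ is a 256-bit DCT-based hash; see the linked PDF): it is a simplified average-hash over a tiny grayscale grid, just to show why a slightly modified image stays within a small Hamming distance of the original, whereas a cryptographic hash like MD5 would change completely.

```python
# Toy perceptual hash (average hash) illustrating threshold-based matching.
# Assumption: images are plain 2D lists of grayscale values, to keep this
# self-contained. Real PDQ works on 64x64 luminance via a DCT.

def average_hash(pixels):
    """Return a bit list: 1 where a pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

img = [[10, 200, 30, 220],
       [15, 210, 25, 215],
       [12, 205, 28, 218],
       [11, 208, 27, 212]]

# A slightly brightened copy: MD5 of the bytes would differ entirely,
# but the perceptual hash barely moves, so a threshold still matches it.
brighter = [[p + 5 for p in row] for row in img]

distance = hamming(average_hash(img), average_hash(brighter))
print(distance <= 4)  # matches under a 4-bit threshold
```

The flip side, which the GitHub discussion mentioned below gets at, is that the same tolerance window adversaries probe: crop, re-encode, or perturb an image enough and its distance drifts past the threshold.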
Lorem Ipsum dolor sit amet@lemmy.world · 1 day ago
There was a GitHub thread about this when it came up for CSAM; people managed to circumvent it easily. I'm rather confident this will end up similarly.