Image-based sexual abuse (IBSA), including the non-consensual distribution of intimate images (NCII), is a rapidly growing problem. Private online reporting and removal tools, such as the Take It Down service run by the National Center for Missing & Exploited Children (NCMEC), can empower victim-survivors, especially young people, who are threatened with the sharing of their intimate images. Ideally, a single pre-emptive report could block an image from ever being posted on multiple major online platforms, taking power away from perpetrators of sextortion and protecting against the re-uploading of known IBSA content.
These services rely on sharing “perceptual hash values” (like digital fingerprints) of images with online platforms, so that platforms can match IBSA content without the images or videos themselves being shared. However, our research shows that generative AI attacks running on consumer-grade hardware can approximately reconstruct an image from its hash value, an attack known as “hash inversion”. This indicates that the hash values should be treated as carefully as the original images; otherwise, vulnerable users’ privacy may be put at risk, for example if the perceptual hash values of reported intimate images became public as the result of a data breach.
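To make the matching model concrete, the following is a minimal sketch of perceptual hashing and fingerprint comparison. It uses a simple “average hash” for readability; deployed services use more robust algorithms (e.g. PDQ), and the `THRESHOLD` value and function names here are illustrative assumptions, not standardised parameters.

```python
# Minimal sketch: perceptual hashing + Hamming-distance matching.
# Illustrative only -- real services use stronger hashes such as PDQ.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit perceptual fingerprint.

    Grayscale, shrink to size x size, then set each bit to 1 if the
    pixel is brighter than the mean. Visually similar images yield
    fingerprints that differ in only a few bits.
    """
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")


THRESHOLD = 10  # bits; an assumed tolerance, not a standardised value


def matches_reported(upload_hash: int, reported_hashes: set[int]) -> bool:
    """A platform checks an upload against reported hash values
    without ever seeing the reported images themselves."""
    return any(hamming_distance(upload_hash, h) <= THRESHOLD
               for h in reported_hashes)
```

Because the fingerprint preserves coarse visual structure, it is exactly this recoverable structure that a hash-inversion attack exploits to reconstruct an approximation of the original image.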
To mitigate this attack, we propose implementing Private Set Intersection (PSI) as an additional layer of protection, enhancing security and privacy for users whilst maintaining the functionality required to detect and remove IBSA. We highlight the future potential of private pre-emptive reporting to combat sextortion threats, and the need for user-focused design and greater transparency in IBSA reporting and removal tools.
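The sketch below illustrates the general shape of a Diffie-Hellman-style PSI protocol: each party blinds its hash values with a private exponent, so only values present in both sets become comparable. It is a toy under stated assumptions (a small demo prime, exact-match rather than threshold comparison, and hypothetical variable names), not the deployed design.

```python
# Toy Diffie-Hellman-based Private Set Intersection (PSI) sketch.
# Each party learns only which hash values are shared; non-matching
# values stay hidden behind the other party's private exponent.
import hashlib
import secrets

# A Mersenne prime, chosen only to keep the demo readable. A real
# deployment would use a standardised large group or elliptic curve.
P = 2**127 - 1


def hash_to_group(item: bytes) -> int:
    """Map an item deterministically into Z_p^* (demo-grade mapping)."""
    digest = hashlib.sha256(item).digest()
    return int.from_bytes(digest, "big") % P or 1


def blind(items: list[bytes], secret: int) -> list[int]:
    """Raise each hashed item to a private exponent mod P."""
    return [pow(hash_to_group(i), secret, P) for i in items]


# --- Protocol run (hypothetical hash values) -------------------------
user_hashes = [b"hashA", b"hashB", b"hashC"]   # reported image hashes
platform_hashes = [b"hashB", b"hashD"]         # hashes seen on platform

a = secrets.randbelow(P - 2) + 1   # user's secret exponent
b = secrets.randbelow(P - 2) + 1   # platform's secret exponent

# User blinds her set and sends it; platform re-blinds it in return.
doubly_blinded_user = {pow(v, b, P) for v in blind(user_hashes, a)}

# Platform blinds its own set; user re-blinds it with her secret.
doubly_blinded_platform = {pow(v, a, P) for v in blind(platform_hashes, b)}

# Shared items end up as H(x)^(a*b) regardless of blinding order, so
# the intersection is revealed while everything else remains blinded.
shared = doubly_blinded_user & doubly_blinded_platform
print(f"{len(shared)} hash value(s) in common")  # -> 1 (hashB)
```

Under this design, a breach of either party's stored blinded values would not expose raw perceptual hashes, which is the property motivating PSI as an added layer over plain hash sharing.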