Google is working with UK nonprofit StopNCII to strengthen its efforts to combat the spread of non-consensual intimate images, also known as revenge porn.
The search giant will start using StopNCII's hashes, digital fingerprints of images and videos, to proactively identify and remove non-consensual intimate images from Search.
StopNCII helps adults prevent their private images from being shared online by creating unique identifiers, or hashes, that represent their intimate images. These hashes are then provided to partner platforms such as Facebook, allowing them to automatically identify and remove matching content from their platforms and services.
It is worth noting that the private image itself never leaves the device; only the hash is uploaded to StopNCII's system.
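The privacy property described above can be sketched in a few lines: the image is fingerprinted on the device, and only that fingerprint is submitted. This is an illustrative sketch only; the function name and chunk size are our own, and StopNCII actually uses perceptual hashing (which matches visually similar images), whereas the cryptographic hash used here matches only byte-identical files.

```python
import hashlib

def hash_image_locally(path: str) -> str:
    """Compute a fingerprint of an image without uploading the file.

    Illustrative only: StopNCII uses perceptual hashing to catch
    visually similar copies; SHA-256, shown here, only matches
    byte-identical files.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large images never need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    # Only this hex string would leave the device; the image stays local.
    return h.hexdigest()
```

In a real submission flow, the resulting string would be sent to the hash database, and partner platforms would compare it against hashes of uploaded content rather than ever handling the original image.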
"Our existing tools allow people to request the removal of NCII from Search, and we continue to launch ranking improvements to reduce the visibility of this kind of content," Google wrote in a blog post. "We have also heard from survivors and advocates that, given the scale of the open web, there is more to be done to reduce the burden on those affected by it."
Google has been slow to adopt StopNCII's system; its partnership with the nonprofit comes a year after Microsoft integrated the tool into Bing. Other companies working with StopNCII include Facebook, Instagram, TikTok, Reddit, Bumble, Snapchat, OnlyFans, X, and more.
The search giant's partnership with the nonprofit marks its latest move to crack down on non-consensual intimate images. Last year, Google made it easier to request the removal of non-consensual intimate deepfakes from Search and demoted such content in its results, making it harder to find.