Microsoft Provides Deepfake Porn Victims with Image Removal Tool for Bing Search Results

The rise of generative AI technology has introduced a significant challenge for the internet: the widespread creation of synthetic nude images that closely resemble real individuals. On Thursday, Microsoft made a significant move to empower victims of revenge porn by enhancing its Bing search engine with tools designed to prevent these explicit images from appearing in search results.

Microsoft has partnered with StopNCII, an organization that lets victims of revenge porn create a digital fingerprint of explicit images, whether real or AI-altered, directly on their own devices. This fingerprint, known as a "hash," is then used by StopNCII’s partners to remove the image from their platforms, without the image itself ever leaving the victim’s device. By collaborating with StopNCII, Bing joins a group of major platforms including Facebook, Instagram, Threads, TikTok, Snapchat, Reddit, Pornhub, and OnlyFans, all committed to curbing the spread of revenge porn using these fingerprints.
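To give a sense of how an image "hash" can identify a picture without sharing the picture itself, here is a minimal sketch of a perceptual fingerprint. StopNCII's actual hashing scheme is not described in this article, so the `average_hash` function below is purely illustrative: it reduces an image to a bit string that stays stable under small pixel changes, which is the property that lets platforms match re-uploads against a hash database.

```python
# Illustrative only: a toy "average hash" perceptual fingerprint.
# A grayscale image is represented here as a 2D list of pixel values.
# Real systems (e.g., PhotoDNA- or PDQ-style hashes) are far more robust,
# but the principle is the same: compare hashes, never the images.

def average_hash(pixels):
    """Return a bit string: 1 where a pixel is above the image mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests a near-duplicate image."""
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 200], [220, 15]]
near_copy = [[12, 198], [221, 14]]  # slightly re-encoded version
h1, h2 = average_hash(original), average_hash(near_copy)
print(hamming_distance(h1, h2))  # → 0: same fingerprint despite pixel changes
```

Because only the hash is uploaded, a platform can block a matching image from its search results or feeds without ever possessing the original photo.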

In a recent blog post, Microsoft revealed that a pilot program using StopNCII’s database, which ran through August, has already taken action on 268,000 explicit images surfaced through Bing’s image search. While the company previously offered a direct reporting tool, it has acknowledged that this method alone is insufficient.

“We have heard from victims, experts, and other stakeholders that user reporting alone may not effectively scale to achieve meaningful impact or adequately mitigate the risk of these images being accessed via search,” Microsoft stated in its Thursday blog update.

Given the scale of the issue, it’s easy to imagine how much more daunting this challenge could be for a more widely used search engine like Google. Google offers its own tools for reporting and removing explicit images from search results; however, it has faced criticism from former employees and victims for not partnering with StopNCII, as highlighted in a Wired investigation. Since 2020, Google users in South Korea have flagged 170,000 search and YouTube links for unwanted sexual content, Wired reported.

The problem of AI-generated deepfake nudes is already pervasive. StopNCII’s resources are limited to individuals aged 18 and older, but “undressing” websites are increasingly problematic for high school students across the nation. Unfortunately, the United States lacks a comprehensive law addressing AI deepfake porn, resulting in a fragmented system of state and local regulations to tackle the issue.

In August, prosecutors in San Francisco initiated a lawsuit aimed at shutting down 16 of the most notorious “undressing” sites. According to a tracker for deepfake porn legislation compiled by Wired, 23 states in the U.S. have enacted laws addressing nonconsensual deepfakes, while nine states have rejected related proposals.
