Google is trying to remove and bury explicit deepfakes from search results

Google is dramatically stepping up its efforts to combat the appearance of AI-generated explicit images and videos in search results. The company wants to make it clear that non-consensual AI-produced deepfakes are not welcome on its search engine.

These images depict real people in lewd or otherwise offensive ways, and regardless of the specifics, Google has a new approach: removing this type of material outright, or burying it far from the first page of results when deletion isn't possible. Notably, Google has experimented with using its own artificial intelligence to generate images for search results, but those images don't include real people and certainly nothing risqué. Google partnered with experts on the subject, as well as people who have been victims of non-consensual deepfakes, to make its response system more robust.
