Google Search, Microsoft Bing seem to have a deepfake porn ‘problem’ – Times of India

Deepfakes are a rampant problem, and such content appears to be easily accessible through popular search engines. Recent investigations have found that nonconsensual deepfake pornography surfaces readily on Google and Microsoft’s Bing. Deepfake pornography involves superimposing an individual’s face onto explicit content, creating deceptive and often disturbing imagery.
According to a report by NBC News, deepfake pornographic images featuring the likenesses of female celebrities were prominently displayed in search results for many women’s names when combined with terms like “deepfakes,” “deepfake porn,” or “fake nudes.” These searches were conducted with safe-search tools turned off to evaluate the unfiltered content.
In an examination of searches for 36 popular female celebrities on both Google and Bing, nonconsensual deepfake images and links to deepfake videos surfaced prominently in the top results for 34 of the searches on Google and 35 on Bing. Over half of the top results directed users to a popular deepfake website or a competing platform, the report noted.
The report mentions that searching “fake nudes” on Google returned links to various apps and programs designed for creating and viewing nonconsensual deepfake porn within the first six results. These were followed by six articles highlighting instances of high school students allegedly utilising such technology to generate and distribute deepfake nude images of their female classmates. Bing, on the other hand, provided dozens of results related to nonconsensual deepfake tools and websites before presenting an article discussing the harms associated with this disturbing phenomenon, according to the report.


What Google, Microsoft have to say

A Google spokesperson told NBC News in a statement: “We understand how distressing this content can be for people affected by it, and we’re actively working to bring more protections to Search. Like any search engine, Google indexes content that exists on the web, but we actively design our ranking systems to avoid shocking people with unexpected harmful or explicit content that they aren’t looking for. As this space evolves, we’re in the process of building more expansive safeguards, with a particular focus on removing the need for known victims to request content removals one-by-one.”
A Microsoft spokesperson in a statement to NBC said, “The distribution of non-consensual intimate imagery (NCII) is a gross violation of personal privacy and dignity with devastating effects for victims. Microsoft prohibits NCII on our platforms and services, including the soliciting of NCII or advocating for the production or redistribution of intimate imagery without a victim’s consent.”


