Google and Microsoft's Bing, two of the most popular search engines, have reportedly been surfacing non-consensual deepfake adult videos at the top of search results, alongside tools that advertise the ability to create such content. In non-consensual deepfake pornography, an individual's appearance is digitally manipulated to create a false impression of their involvement in explicit sexual activity.
According to an NBC News analysis, searches combining women's names with terms such as "deepfakes," as well as broader phrases like "deepfake porn" or "fake nudes," yielded troubling results on Google and other prominent search engines. The top results prominently featured deepfake pornographic images using the likenesses of female celebrities.
The researchers used Google and Bing to search for 36 well-known female celebrities using a mix of their names and the term "deepfakes".
Upon reviewing the results, the researchers found that the top Google results for 34 of those queries, and the top Bing results for 35 of them, contained links to deepfake videos and non-consensual deepfake photos.
Over half of the top results were links to a popular deepfake website or a competitor, the report mentioned.
Searching for "fake nudes" surfaced links to numerous applications and programs for creating and viewing non-consensual deepfake pornography.
These links were among the first six results on Google.
A search for "fake nudes" on Bing likewise returned a number of non-consensual deepfake websites and tools.
The report found that links to view and create this type of pornographic content appeared in results ahead of any news articles explaining the harmful nature of non-consensual deepfakes.
On both Google and Bing, victims of deepfake porn can request removal of the content by filling out a form.
The report highlighted the apparent lack of regular monitoring by search engines, including Google and Microsoft's Bing, to detect and address misuse of their search platforms.
"We understand how distressing this content can be for people affected by it, and we’re actively working to bring more protections to Search," a Google spokesperson was quoted as saying. "Like any search engine, Google indexes content that exists on the web, but we actively design our ranking systems to avoid shocking people with unexpected harmful or explicit content that they aren’t looking for," it added.
In recent months, deepfake videos of Bollywood stars such as Rashmika Mandanna, Alia Bhatt, Priyanka Chopra and Katrina Kaif have gone viral.
Inputs from IANS