
Google bans advertisers from promoting deepfake porn services

Google has long banned sexually explicit ads — but until now, the company hasn’t banned advertisers from promoting services that let people create deepfake porn and other forms of generated nudes. That is about to change.

Google currently prohibits advertisers from promoting “sexually explicit content,” which Google defines as “text, image, audio, or video of graphic sexual acts intended to arouse.” The updated policy also prohibits the promotion of services that help users create this type of content, whether by altering a person’s image or generating a new one.

The change, which takes effect on May 30, prohibits “promoting synthetic content that has been altered or generated to be sexually explicit or contain nudity,” such as websites and apps that instruct people on how to create deepfake porn.

“This update is to explicitly prohibit advertisements for services that offer to create deepfake pornography or synthetic nude content,” Google spokesperson Michael Aciman told The Verge.

Aciman says Google will remove any ads that violate its policies, adding that the company uses a combination of human reviews and automated systems to enforce them. According to the company’s annual ads safety report, Google removed over 1.8 billion ads in 2023 for violating its policies on sexual content.

The change was first reported by 404 Media. As 404 points out, although Google already banned advertisers from promoting sexually explicit content, some apps that facilitate the creation of deepfake pornography got around this by presenting themselves as non-sexual in Google ads or on the Google Play Store. One face-swapping app, for example, didn’t advertise itself as sexually explicit on the Google Play Store, but did so on porn sites.

Non-consensual deepfake pornography has become an ongoing problem in recent years. Two Florida middle schoolers were arrested last December for allegedly creating AI-generated nude photos of their classmates. Just this week, a 57-year-old Pittsburgh man was sentenced to more than 14 years in prison for possessing deepfake child sexual abuse material. Last year, the FBI issued an advisory about an “uptick” in extortion schemes in which victims were blackmailed with AI-generated nude photos. While many AI models make it difficult — if not impossible — for users to create AI-generated nudes, some services let users generate sexual content.

There could soon be legislative action against deepfake porn. Last month, the House and Senate introduced the DEFIANCE Act, which would establish a process through which victims of “digital forgeries” could sue people who create or distribute non-consensual deepfakes of them.
