Google has issued a stark warning to advertisers: sever ties with the darker side of artificial intelligence. The tech giant is cracking down on ads for deepfake pornography, which uses AI to alter or distort a person’s image to depict them in sexual activity, and has given advertisers a strict deadline to cut all ties with websites engaged in this unethical practice.
The company’s new advertising guidelines target “sites or apps that claim to create deepfake pornography, provide instructions on creating such content, or endorse or compare deepfake pornography services.” Speaking to The Verge, Google spokesperson Michael Aciman stated, “This update is to explicitly prohibit advertisements for services offering deepfake pornography or synthetic nude content.”
Starting May 30, any ads that meet these criteria will be prohibited under the updated policy. Google has a stern warning for those who break the rule: “If we find violations of this policy, we will suspend your Google Ads accounts without prior warning, and you will be completely barred from advertising with us again.”
In 2021, Google removed close to 2 billion ads that violated its sexual content policies. Deepfakes have become an increasingly common problem online, even plaguing teenagers: in Los Angeles, parents have been warned about deepfake videos of students circulating online. Prominent figures like Taylor Swift, Jenna Ortega, and Congresswoman Alexandria Ocasio-Cortez have also been targeted by deepfake abuse.