Australian National Review - News with a Difference!
Fake Taylor Swift nudes ‘alarming’ – White House — RT World News


The AI-generated images were reportedly viewed over 47 million times before the account that posted them on X was suspended

Sexually explicit ‘photos’ of pop singer Taylor Swift that spread on social media platforms earlier this week were “alarming,” White House spokesperson Karine Jean-Pierre told reporters during a news briefing on Friday, promising that action against nonconsensual AI porn was forthcoming.

“We’re going to do what we can to deal with this issue,” Jean-Pierre said, adding that Congress should pass legislation and social media platforms should crack down on the sharing and posting of such images.

“While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual, intimate imagery of real people,” she said.

One of the images, posted on X, was viewed over 47 million times before the account posting it was suspended, according to the New York Times. X claimed it was working to remove the images and has suspended several of the accounts that posted them. The platform’s terms of service prohibit the sharing of AI-generated images of real people, pornographic or otherwise. 

A search for ‘Taylor Swift’ on X returned an error on Saturday afternoon. A Swift fan on the platform said the search term had been “banned”. However, it was still possible to use “Taylor Swift” in a search, as long as another word or words came first – whether “protect” or “AI generated.” 

Some of the images were originally posted on Thursday to a Telegram group devoted to “non-consensual AI-generated sexual images of women,” according to tech blog 404 Media. Others had been floating around on trolling mecca 4chan and other forums for weeks before the crossover into ‘mainstream’ social media made them go viral.

Many of the images were created using Microsoft’s Designer, a commercially available AI text-to-image generator. The Telegram group explains to the uninitiated how to circumvent Microsoft’s protections against celebrity deepfakes and porn, noting that while the program will not generate an image in response to the prompt “Taylor Swift,” it will respond agreeably to one like “Taylor singer Swift.”

Microsoft told the blog it was “investigating these reports and…taking appropriate action to address them,” pointing out that its terms of service forbid using the programs to create “adult or non-consensual intimate content.”

US lawmakers introduced a bill in the House of Representatives earlier this month aimed at federally controlling the use of AI for audio and video deepfakes. The No Artificial Intelligence Fake Replicas and Unauthorized Duplications Act (No AI FRAUD Act) is reportedly modeled on the similar Senate bill, the Nurture Originals, Foster Art, and Keep Entertainment Safe Act (NO FAKES Act), which would allow celebrities to sue anyone who creates “digital replicas” of their image or voice without permission.
