Newsletter

Jan 29, 2024

Will Taylor Swift Force People to Care About Deepfakes?


Nude deepfakes of Taylor Swift made the rounds late last week on X (formerly Twitter), Meta, and Telegram. Hundreds of explicit and sometimes violent images depicted the singer-songwriter in compromising and appalling contexts without her consent, reaching an audience that, by our most conservative estimate, numbered in the hundreds of millions of users. The images remained online well after making the news, and though X has made minimal efforts to contain this type of content, it remains available elsewhere, as nothing ever truly disappears from the internet.

The humiliation of women (and those presenting as women) is now a primary use of deepfakes across the web. Over 96% of deepfake images currently online are pornographic, and 99% of such content targets women. In the first nine months of 2023 alone, the amount of deepfake pornography online rose by 54% compared to all of 2022. With easy access to faceswap and deepfake generation services, including some advertised on X itself, creating this malicious content is easier than ever. While the exploitation of women predates the internet by eons, deepfakes give malicious actors a horrifying new way to propagate such abuse anonymously, without punishment, and decidedly without consent.

This impunity persists because, to date, only ten U.S. states have passed effective laws to prosecute the creators and distributors of deepfake pornography. No federal law criminalizes this type of abuse, and online platforms are not required to search for or moderate such content.

Meanwhile, incidents of non-consensual deepfake pornography are surfacing at an alarming and accelerating rate. Consider the thirty students at Westfield High School in New Jersey who were victims of deepfake pornography created by their classmates (in the absence of applicable laws, prosecutors could only charge the perpetrators with cyber harassment). Or the case in Levittown, NY, where teenage girls were left to conduct their own investigation into pornographic deepfake photos posted on a fetish site.

Or the case in Perth, Australia.

And the case in Spain.

And the cases that are scarcely reported by underfunded local newspapers around the world, or go unreported at all, while still causing lasting, irreparable harm for victims of these deepfakes.

The vast majority of these cases, thousands of them, conclude with no justice for victims. The horrifying high-profile deepfakes of Taylor Swift, however, may finally compel governments to enact effective legislation criminalizing this behavior. Renewed calls for deepfake laws in the U.S. Congress raise hopes that stalled bills might move forward amid the public uproar. The escalation of deepfake pornography and its violation of women's safety, dignity, and consent is, after all, a bipartisan issue, one that demands a powerful, unified response from our government. Statements from both sides of the aisle have been encouraging, yet comprehensive legislation outlawing deepfake pornography is already years behind the technology, and every delay by our representatives will only hurt more people.

By scanning the disturbing content in question, Reality Defender determined with over 90% confidence that the Taylor Swift deepfakes were created with a diffusion model, an AI technology accessible through thousands of apps and publicly available models. Our deepfake detection tools can be easily integrated into any content moderation workflow, empowering social media platforms to scan, flag, and remove explicit deepfake content in real time, before it spreads beyond the reach of human moderation teams. Reality Defender's research experts constantly study new and emerging generative AI models, staying ahead of the tools and methods reprehensible individuals use to create abusive deepfakes.

Reality Defender can help solve the problem of non-consensual deepfake pornography today, at the highest level. Yet there is no legal incentive for platforms to make even a halfhearted attempt at doing so.

Only legislation can fix this. It is disheartening that, after thousands of ordinary women have faced such abuse and harassment, it may take the deepfaking of the most famous woman in the world for the problem to finally be taken seriously.
