Not Swift Enough: Survivors and Children Wait for Government, Industry to Act

Late last month, sexually explicit deepfake images of Super Bowl-bound Taylor Swift exploded across the internet, amassing tens of millions of views. One image was seen 47 million times before X removed it 17 hours later. While clearly not fast enough, the response was strong. X said it was “actively removing” the images and taking “appropriate actions” against the accounts spreading them. The name “Taylor Swift” was temporarily unsearchable on X, alongside “Taylor Swift AI” and “Taylor AI.” U.S. politicians called for “new laws to criminalise the creation of deepfake images.” Microsoft reported that it strengthened “existing safety systems to further prevent our services from being misused to help generate images like them” in the first place. The ensuing media coverage was intense.

Yet, broadly speaking, industry action to address the rampant creation of images and recorded and live videos of real children being sexually abused, with or without the use of AI, pales in comparison. For example, according to Australia's eSafety Commissioner, tech companies reported that they do not detect, prevent, or address child sexual abuse material (CSAM) created and distributed “live” in video calls and livestreams.

In fact, a 2020 study by the International Justice Mission (IJM) found that children in the Philippines were sexually abused and exploited online for at least two years, on average. Another IJM study, released in 2023, found that nearly half a million children in the Philippines were exploited in this way in 2022 alone. These children are abused in their own homes while men around the world, including Americans, pay to direct and consume the abuse live, using the same popular social media and video chat apps we all use every day.

IJM runs a robust program that, to date, has supported Philippine authorities in bringing more than 1,200 victims and at-risk individuals to safety, while helping the government and key stakeholders holistically strengthen the response to these crimes. We will continue that critical work until Filipino children are protected from this violence. Since 2022, through Project Boost, IJM’s Center to End Online Sexual Exploitation of Children has, in partnership with the U.S. National Center for Missing & Exploited Children (NCMEC) and Meta, begun training law enforcement in other countries, including Nigeria and Kenya, to investigate cases of online sexual exploitation of children. We’re sharing our proven model from the Philippines to help other governments protect their children too.

But to help reduce the massive scale of this harm, tech companies must do more to address these crimes happening on and through their apps and platforms. That “more” includes building platforms that are safe by design to prevent harm, while also supporting efforts by organizations like IJM to strengthen law enforcement capacity to successfully investigate priority reports, bring children to safety, and hold offenders accountable.

Why should this matter to Americans? Because according to 2023 research by the Philippine Anti-Money Laundering Council, payments flagged by the financial sector as “suspicious transactions” for child sexual exploitation in the Philippines chiefly originated from the United States, followed by the U.K., Australia, and Canada. And because U.S. tech platforms are global, clear rules of the road and accountability for the U.S. tech sector will translate into a safer internet for children everywhere.

We might feel overwhelmed or confused about how to protect people – children and adults – from hands-on and AI-generated exploitation, but there is hope. Numerous common-sense bills are pending in Congress, waiting for politicians to vote on how the U.S. will rise to the challenge of child protection online. Against this legislative backdrop, on Jan. 31 the U.S. Senate Judiciary Committee heard hours of testimony from five major tech CEOs and demanded that they do more to protect children from abuse online and stem the tide of CSAM.

What the hearing made abundantly clear is that U.S.-based multinational tech platforms are not built safe by design at their core, and that to become safe, the industry as a whole needs U.S. laws now. Deepfake pornography that harms everyday citizens and celebrities like Taylor Swift, AI-generated CSAM, and livestreamed child sexual abuse are all prime examples of why Congress must require companies to embed safety into their products before they roll out new technology. Because safety by design is not mainstream across the tech sector, popular video call and livestreaming apps are easily weaponized by American sex offenders to livestream child sexual abuse – and much more could be done to identify those offenders and hold them accountable.

In short, all companies should use safety technology to prevent as much online exploitation as possible, and should report harm promptly and thoroughly when it happens so that law enforcement can do its job.

Legislation to protect people – whether celebrities like Taylor Swift or ordinary citizens – from harmful deepfake images is simply a must. Legislation protecting all children and survivors from the ongoing trauma of sexual abuse and exploitation, including CSAM, is likewise past due and urgently needed to ensure a strong industrywide response. The time for change is now.

By John Tanagho, Executive Director, IJM Center to End Online Sexual Exploitation of Children
Feb. 8, 2024
