Take It Down Act, the law that criminalizes deepfakes and ‘revenge porn’
New tools make it easy to create ‘deepfakes’ - images or videos that convincingly show someone naked or in a sexual situation
In a rare show of bipartisan consensus, President Donald Trump has signed the Take It Down Act, a groundbreaking federal law that criminalizes the non-consensual distribution of sexually explicit images, whether real or generated by artificial intelligence. The law has been hailed as a major victory for online safety, especially for women and minors targeted by the growing threats of deepfakes (sexual images of real people created with artificial intelligence) and revenge porn.
What is the Take It Down Act?
The Take It Down Act is the first U.S. federal law specifically designed to address the dissemination of non-consensual intimate images (NCII), including AI-generated deepfakes. The law makes it illegal to share sexually explicit images or videos of a person without their consent, regardless of whether the images are real or manipulated with artificial intelligence.
The law also imposes a new burden on technology platforms. Websites, apps and social networks are now required to remove the offending content within 48 hours of receiving a valid takedown request from a victim. Failure to comply can result in enforcement by the Federal Trade Commission (FTC), and individuals who knowingly distribute such images can face fines and up to three years in prison.
The new law is a response to concerns about how technology, especially AI, enables the dissemination of explicit content without consent. New tools make it all too easy to create deepfakes: images or videos that convincingly show someone naked or in a sexual situation. Targets have ranged from children and teenagers to politicians, educators and celebrities such as Taylor Swift, whose AI-manipulated images recently went viral.
One survey found that 6% of young people have been victims of such abuse and that 10% know someone who has been. Victims often suffer lasting trauma, including anxiety, depression, job loss and, in some cases, even suicide.
Victims
The push to pass the law was driven in part by the stories of survivors like Elliston Berry, a Texas high school student. One of her classmates used an app to create nude deepfakes from images taken from her Instagram account and shared them on Snapchat. After repeated failed attempts to have the content removed, Berry’s mother turned to Senator Ted Cruz, who co-sponsored the bill with Senator Amy Klobuchar.
Berry became the face of the movement and was invited to the White House for the signing ceremony. First Lady Melania Trump, who rarely takes a public role in Washington policy debates, personally lobbied Congress in support of the bill and was present at the signing.
Who does it protect?
Although the law is particularly focused on protecting minors, its implications are broader. Professors, politicians, CEOs and ordinary women have become frequent targets of explicit deepfakes, often created to damage victims’ reputations or force them out of public life.
A recent study by the American Sunlight Project found that 25 of the 26 lawmakers targeted with deepfake pornography were women, and that female lawmakers were roughly 70 times more likely than their male counterparts to be victims of non-consensual deepfakes.
These types of images have also become a form of workplace harassment. Victims have reported that their bosses receive fake nude images or that co-workers circulate them internally, pushing them out of their jobs with no recourse.
Support and opposition
The bill passed almost unanimously in both houses of Congress, with only two dissenting votes in the House of Representatives. Major tech companies such as Meta, Google, Snapchat and TikTok backed the measure, as did more than 100 nonprofit organizations.
However, some advocates warn that the law could be abused to censor legitimate adult content or silence critics. They also point to the lack of safeguards against bogus takedown requests. Rep. Thomas Massie, one of the bill’s detractors, described it as a “slippery slope ripe for abuse.”
Despite these concerns, most lawmakers and advocacy groups argue that the urgency of the threat - and the harm it causes - justifies the law.
What can a victim do?
If you or someone you know becomes a victim, experts recommend the following:
- Report promptly. Platforms are now legally required to act within 48 hours, so it’s best to act immediately
- Look for reporting tools. Sites such as Instagram, TikTok and Snapchat already offer content removal request forms
- Save evidence. Screenshots and links are useful for complaints and possible legal action
- Contact law enforcement. With the new law in place, victims may now have a clearer path to press charges
- Turn to trusted organizations. Groups such as StopNCII.org can help victims request removal across multiple platforms at once.