Artificial intelligence (AI) technology has become widely available, allowing the average person to create fake images indistinguishable from the real thing. While there are many innocent uses of this technology (such as funny memes), this technology holds serious potential for harm and abuse, including spreading disinformation, damaging reputations, and sexually exploiting people.
The use of AI to create deepfake pornography represents one of these harmful realities. This article reviews where the law stands when it comes to criminalizing deepfake pornography at the federal and state levels and the challenges faced by prosecutors and victims.
Deepfake pornography refers to sexually explicit content that's created, altered, or manipulated using AI technology without the subject's permission. To the average viewer, deepfake videos and images are indistinguishable from the real thing. AI technology makes it possible to create not only realistic images but also realistic speech mimicking an individual's voice.
Deepfake pornography made news headlines when videos of celebrities, like Taylor Swift and Scarlett Johansson, surfaced. Anyone can be a victim of deepfake pornography, but most victims are women and girls.
Posting deepfake pornography is now a crime under federal law and most states' laws. While the details of these laws vary, they generally prohibit maliciously posting or distributing AI-generated sexual images of an identifiable person without that person's consent. Some laws require proof that the defendant intended to harass, harm, or intimidate the victim. Many of these laws impose harsher penalties when the victim is a child.
The federal TAKE IT DOWN Act became law in May 2025. This law makes non-consensual publication of authentic or deepfake sexual images a felony. Threatening to post such images is also a felony if the defendant did so to extort, coerce, intimidate, or cause mental harm to the victim.
The federal law refers to deepfakes as "digital forgeries" of identifiable adults or minors showing nudity or sexually explicit conduct. These forgeries cover images created or altered using AI or other technology when a reasonable person would find the fake indistinguishable from the real thing.
For criminal prosecutions involving deepfake images of adults, federal prosecutors must show that the defendant meant to cause or did cause financial, psychological, or reputational harm to the victim. When the image shows a minor, the prosecutor must show that the defendant published the image to humiliate or harass the victim or to arouse sexual desires of any person.
Penalties for publishing deepfake pornography range from 18 months to three years of federal prison time, plus fines and forfeiture of property used to commit the crime. The harshest penalties apply when the image is of a child.
(S. 146 (2025).)
More than half of the states have enacted laws prohibiting deepfake pornography. Some states created new laws specifically targeting deepfakes, while others expanded existing crimes to cover these acts. States that enacted new crimes typically modeled them after their revenge porn laws. Some specifically reference "deepfakes," but most broadly define images and videos to include those created, modified, or altered by AI or similar technology to depict an identifiable person.
While these laws generally aim to criminalize the same type of images, they vary in their penalties and in the proof of harm required to secure a conviction. For instance, some laws require the prosecutor to prove that the defendant shared or published the deepfake sexual images intending to harm the victim financially or emotionally. A few states' laws include reputational injuries as a type of harm. Other states focus on the defendant's intent to harass, intimidate, or coerce a victim.
Penalties range from misdemeanors to serious felonies. Several states impose both misdemeanor and felony penalties, although the latter might only apply if the defendant posts the image to a website, the victim is a minor, or the victim suffers certain harm.
Below are examples of state laws that may criminalize creating or sharing deepfake pornography.
California makes it a crime to create and distribute computer-generated sexually explicit images that appear authentic when the defendant intends to cause serious emotional distress to the person depicted in the image. (Cal. Penal Code § 647(j)(4) (2025).)
Florida makes it a third-degree felony to willfully and maliciously publish, post, or share an altered sexual depiction of an identifiable person without consent. An "altered sexual depiction" is any visual depiction that's modified, altered, or adapted to represent a realistic version of an identifiable person. It's also a crime to possess or create AI-generated child pornography. (Fla. Stat. §§ 827.072, 836.13 (2025).)
Georgia law makes it a crime to electronically transmit or post an image or video depicting nudity or sexually explicit conduct of an adult, including a "falsely created videographic or still image," when the defendant intends to harass or cause financial loss to the depicted person. Posting the video or image to a website or file server carries felony penalties. (Ga. Code § 16-11-90 (2025).)
Hawaii considers it a first-degree privacy invasion to create, disclose, or threaten to disclose fictitious, composite images depicting a person engaging in sexual conduct. To secure a conviction, the prosecutor must prove the defendant intended to substantially harm the depicted person's health, safety, career, business, financial condition, or reputation or acted out of revenge or retribution. This offense is a class C felony. It's also a crime to possess images that violate this law. (Haw. Rev. Stat. § 711-1110.9 (2025).)
Indiana makes it a crime to distribute or post, without consent, "intimate images" depicting sexual conduct or nudity. Intimate images include computer-generated images created using AI or a computer program that appear to depict the alleged victim. (Ind. Code § 35-45-4-8 (2025).)
Louisiana's law on "Unlawful Deepfakes" makes these offenses felonies. Penalties range from 5- to 30-year prison sentences. For crimes involving minors, a mandatory 5- or 10-year sentence applies. (La. Rev. Stat. § 14:73.13 (2025).)
Minnesota makes it a crime to intentionally and without consent distribute a deepfake image that realistically depicts an identifiable person's intimate parts or an identifiable person engaging in a sexual act. A violation is a gross misdemeanor, with felony penalties in aggravated circumstances. Repeat offenses are also felonies. (Minn. Stat. § 617.262 (2025).)
New York targets deepfake porn through its revenge porn laws. The state expanded its existing law to prohibit the nonconsensual distribution of a sexually explicit image of another, including images created or altered by digitization. For a conviction, the prosecutor must prove the defendant intended to harm the emotional, financial, or physical welfare of the depicted person. (N.Y. Penal Law § 245.15 (2025).)
North Carolina has misdemeanor and felony penalties for unlawful disclosure of private sexual images, including those created or altered by AI, without affirmative consent from the identifiable, depicted person. Criminal penalties apply if the defendant disclosed the image with the intent of coercing, harassing, intimidating, demeaning, humiliating, or causing financial loss to the depicted person or to cause others to do so. (N.C. Gen. Stat. § 14-190.5A (2025).)
South Dakota makes it a crime to knowingly sell or share, without consent, any image or recording of a person that's been manipulated to create a realistic but false image depicting the person as nude or engaging in a sexual act. To secure a conviction, the prosecution only needs to establish that the defendant's intent was self-gratification. A violation is a class 1 misdemeanor. However, felony penalties apply if the victim is younger than 18 and the defendant was 21 or older when the recording was made. (S.D. Codified Laws § 22-21-4 (2025).)
Texas imposes class A misdemeanor penalties for unlawfully creating or distributing a deepfake video that appears to depict a person engaged in sexual conduct or with intimate parts exposed. (Tex. Penal Code § 21.165 (2025).)
Utah law prohibits the unlawful distribution of a counterfeit intimate image, defined as any visual depiction or computer-generated image created, edited, manipulated, or altered to depict the likeness of an identifiable individual. A person commits an offense by knowingly distributing a counterfeit image without the depicted person's knowledge and knowing it will cause emotional harm or distress to the depicted person. Felony and misdemeanor penalties apply. (Utah Code § 76-5b-205 (2025).)
Virginia expanded its revenge porn law to include nude or partially nude images "created by any means whatsoever" and distributed without authorization. The law applies only if the person maliciously shares or sells the image with the intent to coerce, harass, or intimidate the depicted person. (Va. Code § 18.2-386.2 (2025).)
Washington enacted a new crime called "disclosing fabricated intimate images." The law applies to digitized sexual images created or altered using AI. To be convicted, the person must knowingly disclose the image without consent to cause harm to the depicted person. (Wash. Rev. Code § 9A.86.030 (2025).)
Even with new laws targeting deepfake porn, these cases are often difficult to prosecute. Just finding the perpetrators can be a difficult task, requiring considerable staff hours, financial resources, and expensive technology. Plus, many laws require a prosecutor to prove the perpetrator intended some harm to the depicted person, when often the perpetrator's only objective is self-gratification.
Some federal and state prosecutors have turned to child pornography and obscenity laws to go after people who make and post deepfake sexual images of children. These laws don't require prosecutors to prove the defendant intended to harm the child victim. However, these laws present their own challenges for prosecution, especially in light of a 2002 U.S. Supreme Court decision—Ashcroft v. Free Speech Coalition. In Ashcroft, the Court held that virtual child pornography can't be banned because no actual children are harmed by it. To do so would violate a person's First Amendment rights.
Since 2002, the legal landscape and AI technology have evolved significantly. Congress and many states changed their laws in an effort to respond to the Ashcroft decision. And AI changed the virtual world. Prosecutors are using these new and updated laws to file charges against defendants for AI-generated child pornography and obscenity. But it might take some time for courts to sort through the legality of these laws as they relate to deepfake images.
(18 U.S.C. §§ 1466A, 2252, 2252A, 2256(8) (2025); Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002); United States v. Anderegg, No. 24-CR-50-JDP (W.D. Wis. Feb. 13, 2025).)
Another possible option for victims is to take the civil route. If they know who made or posted the image, they may be able to seek damages in a civil lawsuit. Possible civil causes of action include invasion of privacy, intentional infliction of emotional distress, or actions specific to nonconsensual distribution of deepfake porn.
Several states—including Alabama, California, Florida, Illinois, Minnesota, and South Dakota—have laws that allow victims whose images have been unlawfully used in deepfake porn to seek money damages, ask for court orders directing defendants to take down materials from websites, or both.
The new federal TAKE IT DOWN Act also provides civil remedies. Starting in the summer of 2026, victims will be able to submit requests to websites and platforms to have their images removed. Website administrators must take down the image within 48 hours of receiving the request. This option doesn't require the victim to know who the perpetrator is.
(Ala. Code § 6-5-840; Cal. Civ. Code § 1708.85; Fla. Stat. § 836.13; 740 Ill. Comp. Stat. 190/5 and following; Minn. Stat. § 604.32; S.D. Codified Laws § 22-21-4 (2025).)
If your image has been used in deepfake pornography, help may be available. Several organizations provide information and resources to help victims remove sexually explicit online images or stop the sharing of them—check out Take It Down (a service of NCMEC), the Safety Center at CyberCivilRights.org, the National Center on Sexual Exploitation, and the FBI's Internet Crime Complaint Center.
When a student uses AI to generate deepfake images of other students or teachers, your school may be able to address the issue through Title IX, the federal law banning sex-based harassment in education. (20 U.S.C. §§ 1681-1689 (2025).)
If you have questions regarding crimes associated with deepfake pornography, talk to a criminal defense lawyer. For questions regarding civil remedies (money damages, removal options), a personal injury lawyer may be able to assist you.