Is Deepfake Pornography Illegal?

Learn where the law stands on criminalizing deepfake porn at the federal and state levels, and the challenges prosecutors and victims face.

By , Attorney Mitchell Hamline School of Law
Updated by Stacy Barrett, Attorney UC Law San Francisco
Updated 2/04/2025

Artificial intelligence (AI) technology has become widely available, allowing the average person to create fake images indistinguishable from the real thing. While this technology has many innocent uses (such as funny memes), it also holds serious potential for harm and abuse, including spreading disinformation, damaging reputations, and sexually exploiting people.

The use of AI to create deepfake pornography represents one of these harmful realities. This article reviews where the law stands when it comes to criminalizing deepfake pornography at the federal and state levels and the challenges faced by prosecutors and victims.

What Is Deepfake Pornography?

Deepfake pornography refers to sexually explicit content created, altered, or manipulated, without the depicted person's consent, using AI or similar technology. To the average viewer, deepfake videos and images are indistinguishable from the real thing. AI technology makes it possible to create not only realistic images of a person but also realistic speech.

Deepfake pornography made news headlines when videos of celebrities, like Taylor Swift and Scarlett Johansson, surfaced. But anyone can be a victim of deepfake pornography, although most victims are women and girls.

Is Deepfake Pornography Illegal?

It can be. Several states have enacted laws prohibiting deepfake pornography. Some created new laws specifically targeting deepfakes, while others expanded existing crimes to cover these acts. States that enacted new crimes typically modeled them after their revenge porn laws. Some specifically reference "deepfakes," but most broadly define images and videos to include those created, modified, or altered by AI or similar technology to depict an identifiable person.

No Federal Law; States Take Action

There is currently no federal law that specifically addresses deepfake pornography, although if the images depict a minor, federal child pornography laws may apply. Senator Ted Cruz has reintroduced the Take It Down Act, a federal bill that would criminalize the publication of deepfake pornography and require platforms to remove the images.

Without federal action, many states have passed laws prohibiting the creation, possession, or distribution of deepfake images. In states that don't have crimes specific to deepfake porn, existing laws that may cover such acts include criminal harassment, stalking, cyberstalking, and invasion of privacy. As with federal law, when deepfake images depict minors, prosecutors may be able to file charges using state child pornography laws.

Below, we review state efforts to criminalize deepfake porn and how current child pornography laws may apply to deepfake images of minors.

What Law Enforcement and Prosecutors Are Up Against

Unfortunately, even with new laws targeting deepfake porn, these cases are often difficult to prosecute. Just finding the perpetrators can be a daunting task, requiring considerable staff hours, financial resources, and expensive technology. Plus, many laws require a prosecutor to prove the perpetrator intended some harm to the depicted person, when often the perpetrator's only objective is self-gratification.

Child Pornography Laws and Deepfake Images of Minors

Federal law and all 50 states make it illegal to create, sell, distribute, or possess child pornography that depicts actual minors engaging in sexual conduct. Many of these laws extend to images created, adapted, or modified to depict an identifiable minor, which arguably includes deepfakes and AI-generated images. Internet child porn can be prosecuted at the federal and state levels.

Federal Law Prohibiting Deepfake Child Pornography

Federal law prohibits the possession, distribution, and creation of child pornography. The definition of child pornography includes any visual depiction of a minor engaging in sexually explicit conduct, including computer-generated images that are indistinguishable from an actual minor or that have been created, adapted, or modified to depict an identifiable, actual minor.

Penalties for federal child pornography convictions are harsh, carrying prison sentences that range from 5 to 40 years, plus stiff fines. Mandatory minimum sentences may apply, and having prior child pornography or child sexual abuse convictions results in increased penalties.

(18 U.S.C. §§ 2252, 2252A, 2256(8) (2025).)

State Laws Prohibiting Deepfake Child Pornography

State laws also prohibit creating, sharing, and possessing child pornography. Many of these laws mirror federal law and include visual depictions of an identifiable child that have been created, adapted, or modified using AI or other computer software. Some states have gone further to include completely AI-generated images.

Adapted or modified images. Virginia law defines "child pornography" as sexually explicit visual material involving an identifiable minor, including visual depictions that are adapted or modified. Minnesota's laws prohibiting possession and creation of child pornography specifically include a "computer-generated image or picture… [that] has been created, adapted, or modified to appear that an identifiable minor is engaging in sexual conduct." (Minn. Stat. §§ 617.246, 617.247; Va. Code §§ 18.2-374.1, 18.2-374.1:1 (2025).)

New technologies; adding AI. Some states' laws may not be broad enough to cover new technologies, and lawmakers have been rushing to push through legislation with new or amended language. For example, California added "digitally altered" and "AI-generated matter" to its child pornography laws. Alabama created a new term for child pornography—child sexual abuse materials—that includes visual depictions of a child engaged in sexual conduct, including "virtually indistinguishable" depictions that are computer or digitally generated. (Ala. Code §§ 13A-12-190, 13A-12-197; Cal. Penal Code §§ 311, 311.1, 311.2, 311.3 (2025).)

Covering multiple bases. Other states are doubling down to ensure that their laws prohibit deepfake child pornography. Texas law, for example, already defined child pornography in terms of visual depictions of a child engaging in sexual conduct. But lawmakers decided to go a step further and enacted language clarifying that the law covers recognizable images of an actual child produced using AI technology or other computer software. Florida updated its child pornography law to include images created or altered to portray an identifiable minor. On top of that, it enacted a new crime of "generated child pornography" that applies to images of a fictitious but realistic-looking child. North Carolina took an approach similar to Florida's. (Fla. Stat. §§ 827.071, 827.072; N.C. Gen. Stat. §§ 14-190.13, 14-190.17C; Tex. Penal Code § 43.26 (2025).)

Like federal law, state penalties for child pornography often include lengthy prison sentences and stiff fines.

State Criminal Laws Prohibiting Deepfake Pornography

Several states have jumped in quickly to create new laws prohibiting the creation, possession, or distribution of deepfake images depicting nudity or sexual conduct. The laws vary considerably, especially regarding proof of a defendant's intent. Many require the prosecutor to establish that the defendant intended to harm or harass the victim in some way, which can be difficult to prove. State lawmakers are also grappling with defining "deepfake" images so the laws cover ever-advancing technology. For instance, language that targets "altered or modified images" might not cover AI technology that can "create" deepfake images.

Below are state laws that may criminalize creating or sharing deepfake pornography. Many state legislatures are working to update their laws to more effectively combat the growing problem of deepfake porn.

California makes it a crime to create and distribute computer-generated sexually explicit images that appear authentic when the defendant intends to cause serious emotional distress to the person depicted in the image. (Cal. Penal Code § 647(j)(4) (2025).)

Florida makes it a third-degree felony to willfully and maliciously publish, post, or share an altered sexual depiction of an identifiable person without consent. An "altered sexual depiction" is any visual depiction that's modified, altered, or adapted to represent a realistic version of an identifiable person. It's also a crime to possess or create AI-generated child pornography. (Fla. Stat. §§ 827.072, 836.13 (2025).)

Georgia law makes it a crime to electronically transmit or post an image or video depicting nudity or sexually explicit conduct of an adult, including a "falsely created videographic or still image," when the defendant intends to harass or cause financial loss to the depicted person. Posting the video or image to a website or file server carries felony penalties. (Ga. Code § 16-11-90 (2025).)

Hawaii considers it a first-degree privacy invasion to create, disclose, or threaten to disclose fictitious, composite images depicting a person engaging in sexual conduct. For a conviction, the prosecutor must prove the defendant intended to substantially harm the depicted person's health, safety, career, business, financial condition, or reputation, or acted out of revenge or retribution. The offense is a class C felony. It's also a crime to possess images that violate this law. (Haw. Rev. Stat. § 711-1110.9 (2025).)

Indiana makes it a crime to distribute or post, without consent, "intimate images" depicting sexual conduct or nudity. Intimate images include computer-generated images, created using AI or a computer program, that appear to depict the alleged victim. (Ind. Code § 35-45-4-8 (2025).)

Louisiana's law on "Unlawful Deepfakes" makes it a felony to:

  • knowingly create, possess, sell, or distribute deepfake material depicting a minor engaging in sexual conduct, or
  • knowingly sell, distribute, or exhibit deepfake material of an adult engaged in sexual conduct without their consent.

Penalties range from 5- to 30-year prison sentences. For crimes involving minors, a mandatory 5- or 10-year sentence applies. (La. Rev. Stat. § 14:73.13 (2025).)

Minnesota makes it a crime to intentionally and without consent distribute a deepfake image that realistically depicts an identifiable person's intimate parts or an identifiable person engaging in a sexual act. A violation is a gross misdemeanor. The penalty increases to a felony if the defendant:

  • distributes the deepfake intending to profit from it
  • posts the deepfake on a website
  • shares the deepfake to harass the depicted individual, or
  • causes the depicted individual to suffer financial harm.

Repeat offenses are also felonies. (Minn. Stat. § 617.262 (2025).)

New York targets deepfake porn through its revenge porn laws. The state expanded its existing law to prohibit the nonconsensual distribution of a sexually explicit image of another, including images created or altered by digitization. For a conviction, the prosecutor must prove the defendant intended to harm the emotional, financial, or physical welfare of the depicted person. (N.Y. Penal Law § 245.15 (2025).)

North Carolina has misdemeanor and felony penalties for unlawful disclosure of private sexual images, including those created or altered by AI, without affirmative consent from the identifiable, depicted person. Criminal penalties apply if the defendant disclosed the images with the intent of coercing, harassing, intimidating, demeaning, humiliating, or causing financial loss to the depicted person or to cause others to do so. (N.C. Gen. Stat. § 14-190.5A (2025).)

South Dakota makes it a crime to knowingly sell or share, without consent, any image or recording of a person that's been manipulated to create a realistic but false image depicting the person as nude or engaging in a sexual act. To secure a conviction, the prosecution need only establish that the defendant's intent was self-gratification. A violation is a class 1 misdemeanor. However, felony penalties apply if the victim is younger than 18 and the defendant was 21 or older when the recording was made. (S.D. Codified Laws § 22-21-4 (2025).)

Texas. A person commits a class A misdemeanor in Texas by unlawfully creating or distributing a deepfake video that appears to depict a person engaged in sexual conduct or with intimate parts exposed. (Tex. Penal Code § 21.165 (2025).)

Utah law prohibits the unlawful distribution of a counterfeit intimate image, defined as any visual depiction or computer-generated image created, edited, manipulated, or altered to depict the likeness of an identifiable individual. A person commits an offense by knowingly distributing a counterfeit image without the depicted person's knowledge and knowing it will cause emotional harm or distress to the depicted person. Felony and misdemeanor penalties apply. (Utah Code § 76-5b-205 (2025).)

Virginia expanded its revenge porn law to include nude or partially nude images "created by any means whatsoever" and distributed without authorization. The law applies only if the person maliciously shares or sells the image with the intent to coerce, harass, or intimidate the depicted person. (Va. Code § 18.2-386.2 (2025).)

Washington enacted a new crime called "disclosing fabricated intimate images." The law applies to digitized sexual images created or altered using AI. For a conviction, the person must have knowingly disclosed the image without consent in order to cause harm to the depicted person. (Wash. Rev. Code § 9A.86.030 (2025).)

Civil Remedies for Deepfake Pornography

Whether or not criminal charges are an option, a victim may be able to seek damages in a civil lawsuit. Possible civil causes of action include invasion of privacy, intentional infliction of emotional distress, and claims specific to the nonconsensual distribution of deepfake porn.

Several states—including Alabama, California, Florida, Illinois, Minnesota, and South Dakota—have laws that allow victims whose images have been unlawfully used in deepfake porn to seek money damages, ask for court orders directing defendants to take down materials from websites, or both.

When students use AI to generate deepfake images of other students or teachers, schools can address the issue through Title IX, the federal law banning sex-based harassment in education.

(Ala. Code § 6-5-840; Cal. Civ. Code § 1708.85; Fla. Stat. § 836.13; 740 Ill. Comp. Stat. 190/5 and following; Minn. Stat. § 604.32; S.D. Codified Laws § 22-21-4; 20 U.S.C. § 1681 and following (2025).)

Resources for Victims of Deepfake Porn

If your image has been used in deepfake pornography, help may be available. Several organizations provide information and resources to help victims remove sexually explicit images from the internet or stop them from being shared. Check out Take It Down, Help from NCMEC, the Safety Center at CyberCivilRights.org, the National Center on Sexual Exploitation, or the FBI's Internet Crime Complaint Center.

Should I Talk to a Lawyer?

What law applies (if any) depends on several factors, such as:

  • how the images were created, obtained, or distributed
  • what the defendant did with the images
  • what the defendant's intent was in making, sharing, or possessing the images, and
  • how old the person depicted in the image was.

States haven't taken a uniform approach to this issue, and the laws and penalties regarding deepfake porn vary considerably. If you have questions regarding crimes associated with deepfake pornography, talk to a criminal defense lawyer. For questions regarding civil remedies (money damages, removal options), a personal injury lawyer may be able to assist you.
