Is Deepfake Pornography Illegal?

Learn where the law stands when it comes to criminalizing deepfake porn at the federal and state levels and the challenges faced by prosecutors and victims.

By , Attorney · Mitchell Hamline School of Law
Updated March 19, 2024

Artificial intelligence (AI) technology has become widely available, allowing the average person to create fake images indistinguishable from the real thing. While this technology has many innocent uses (such as funny memes), it also holds serious potential for harm and abuse, including spreading disinformation, damaging reputations, and sexually exploiting people.

The use of AI to create deepfake pornography represents one of these harmful realities. This article reviews where the law stands when it comes to criminalizing deepfake pornography at the federal and state levels and the challenges faced by prosecutors and victims.

What Is Deepfake Pornography?

Deepfake pornography refers to sexually explicit content created, altered, or manipulated, without the depicted person's consent, using AI or similar technology. To the average user, deepfake videos and images are indistinguishable from the real thing. AI technology makes it possible to create not only realistic images of an individual but also realistic speech.

Deepfake pornography made news headlines when videos of celebrities, like Taylor Swift and Scarlett Johansson, surfaced. But anyone can be a victim of deepfake pornography, although most victims are women and girls.

Is Deepfake Pornography Illegal?

It can be. Several states have enacted laws prohibiting deepfake pornography. Some created new laws specifically targeting deepfakes, while others expanded existing crimes to cover these acts. States that enacted new crimes typically modeled them after their revenge porn laws. Some specifically reference "deepfakes," but most broadly define images and videos to include those created, modified, or altered by AI or similar technology to depict an identifiable person.

There's no federal law specifically addressing deepfake pornography, although if the images depict a minor, federal child pornography laws may apply. In states that don't have crimes specific to deepfake porn, existing laws that may cover such acts include criminal harassment, stalking, cyberstalking, and invasion of privacy. These laws can pose hurdles for prosecutors, though, when they require proof of repeated acts or that the perpetrator intended to harass or harm the victim. As with federal law, when deepfake images depict minors, prosecutors may be able to file charges using state child pornography laws.

Below, we review state efforts to criminalize deepfake porn and how current child pornography laws may apply to deepfake images of minors. Unfortunately, even with new laws targeting deepfake porn, these cases are often difficult to prosecute. Just finding the perpetrators can be a difficult task, requiring considerable investigative hours, financial resources, and expensive technology. Plus, many laws require a prosecutor to prove the perpetrator intended some harm to the depicted person, when often the perpetrator's only objective is self-gratification.

Child Pornography Laws and Deepfake Images of Minors

Federal law and all 50 states make it illegal to create, sell, distribute, or possess child pornography that depicts actual minors engaging in sexual conduct. Many of these laws extend to images created, adapted, or modified to depict an identifiable minor, which arguably includes deepfakes and AI-generated images. Internet child porn can be prosecuted at the federal and state levels.

Federal Law Prohibiting Deepfake Child Pornography

Federal law prohibits the possession, distribution, and creation of child pornography. The definition of child pornography includes any visual depiction of a minor engaging in sexually explicit conduct, including computer-generated images that are indistinguishable from an actual minor or that have been created, adapted, or modified to depict an identifiable, actual minor.

Penalties for federal child pornography convictions are harsh, carrying prison sentences that range from 5 to 40 years, plus stiff fines. Mandatory minimum sentences may apply, and having prior child pornography or child sexual abuse convictions results in increased penalties.

(18 U.S.C. §§ 2252, 2252A, 2256(8) (2024).)

State Laws Prohibiting Deepfake Child Pornography

State laws also prohibit creating, sharing, and possessing child pornography. Many of these laws mirror the federal law and include visual depictions of an identifiable child that have been created, adapted, or modified using AI or other computer software.

For example, Virginia law defines "child pornography" as "sexually explicit visual material involving an identifiable minor, including visual depictions that are adapted or modified." Minnesota's laws prohibiting possession and creation of child pornography specifically include a "computer-generated image or picture… [that] has been created, adapted, or modified to appear that an identifiable minor is engaging in sexual conduct." Florida recently amended its child pornography law to include any image that has been created, adapted, or modified by electronic or other means to portray an identifiable minor engaged in sexual conduct. (Fla. Stat. § 827.071; Minn. Stat. §§ 617.246, 617.247; Va. Code §§ 18.2-374.1, 18.2-374.1:1 (2024).)

Some states' laws may not be broad enough to cover new technologies, and lawmakers have been rushing to push through legislation with new or amended language. Other states are doubling down to ensure that their laws prohibit deepfake child pornography. Texas law, for example, already defined child pornography in terms of visual material depicting a child engaged in sexual conduct. But lawmakers decided to go a step further and enacted language clarifying that the law includes the use of recognizable images of an actual child produced using AI technology or other computer software. (Tex. Penal Code § 43.26 (2024); Tex. H.B. 2700 (2023).)

Like federal law, state penalties for child pornography often include lengthy prison sentences and stiff fines.

State Criminal Laws Prohibiting Deepfake Pornography

Several states have jumped in quickly to create new laws prohibiting the creation, possession, or distribution of deepfake images depicting nudity or sexual conduct. The laws vary considerably, especially regarding proof of a defendant's intent. Many require the prosecutor to establish that the defendant intended to harm or harass the victim in some way, which can be difficult to prove. State lawmakers are also grappling with defining "deepfake" images so the laws cover ever-advancing technology. For instance, language that targets "altered or modified images" might not cover AI technology that can "create" deepfake images.

Below are state laws that may criminalize creating or sharing deepfake pornography. Many state legislatures are working to update their laws to more effectively combat the growing problem of deepfake porn.

Florida makes it a third-degree felony to willfully and maliciously publish, post, or share an altered sexual depiction of an identifiable person without consent. An "altered sexual depiction" is any visual depiction that's modified, altered, or adapted to represent a realistic version of an identifiable person. (Fla. Stat. § 836.13 (2024).)

Georgia law makes it a crime to electronically transmit or post an image or video depicting nudity or sexually explicit conduct of an adult, including a "falsely created videographic or still image," when the defendant intends to harass or cause financial loss to the depicted person. Posting the video or image to a website or file server carries felony penalties. (Ga. Code § 16-11-90 (2024).)

Hawaii considers it a first-degree privacy invasion to create, disclose, or threaten to disclose fictitious, composite images depicting a person engaging in sexual conduct. To convict, the prosecutor must prove the defendant intended to substantially harm the depicted person's health, safety, career, business, financial condition, or reputation or acted out of revenge or retribution. This offense is a class C felony. It's also a crime to possess images that violate this law. (Haw. Rev. Stat. § 711-1110.9 (2024).)

Indiana will make it a crime to distribute or post, without consent, "intimate images" depicting sexual conduct or nudity. Intimate images include computer-generated images, created using AI or a computer program, that appear to depict the alleged victim. (Ind. Code § 35-45-4-8 (effective July 1, 2024).)

Louisiana's law on "Unlawful Deepfakes" makes it a felony to:

  • knowingly create, possess, sell, or distribute deepfake material depicting a minor engaging in sexual conduct, or
  • knowingly sell, distribute, or exhibit deepfake material of an adult engaged in sexual conduct without their consent.

Penalties range from 5- to 30-year prison sentences. For crimes involving minors, a mandatory 5- or 10-year sentence applies. (La. Rev. Stat. § 14:73.13 (2024).)

Minnesota makes it a crime to intentionally and without consent distribute a deepfake image that realistically depicts an identifiable person's intimate parts or an identifiable person engaging in a sexual act. A violation is a gross misdemeanor. The penalty increases to a felony if the defendant:

  • distributes the deepfake intending to profit from it
  • posts the deepfake on a website
  • shares the deepfake to harass the depicted individual, or
  • causes the depicted individual to suffer financial harm.

Repeat offenses are also felonies. (Minn. Stat. § 617.262 (2024).)

New York targets deepfake porn through its revenge porn laws. The state expanded its existing law to prohibit the nonconsensual distribution of a sexually explicit image of another, including images created or altered by digitization. For a conviction, the prosecutor must prove the defendant intended to harm the emotional, financial, or physical welfare of the depicted person. (N.Y. Penal Law § 245.15 (2024).)

North Carolina has misdemeanor and felony penalties for unlawful disclosure of private images, including digital and computer-generated images. Unlawful disclosure occurs when the person knowingly publishes or distributes nude or sexually explicit images of an identifiable person without the affirmative consent of that person. Criminal penalties apply if the defendant disclosed the images with the intent of coercing, harassing, intimidating, demeaning, humiliating, or causing financial loss to the depicted person or to cause others to do so. (N.C. Gen. Stat. § 14-190.5A (2024).)

South Dakota makes it a crime to knowingly sell or share, without consent, any image or recording of a person that's been manipulated to create a realistic but false image depicting the person as nude or engaging in a sexual act. To secure a conviction, the prosecution need only establish that the defendant's intent was self-gratification. A violation is a class 1 misdemeanor. However, felony penalties apply if the victim is younger than 18 and the defendant was 21 or older when the recording was made. (S.D. Codified Laws § 22-21-4 (2024).)

Texas makes it a class A misdemeanor to unlawfully create or distribute a deep fake video that appears to depict a person engaged in sexual conduct or with intimate parts exposed. (Tex. Penal Code § 21.165 (2024).)

Utah law prohibits the unlawful distribution of a counterfeit intimate image, defined as any visual depiction or computer-generated image created, edited, manipulated, or altered to depict the likeness of an identifiable individual. A person commits an offense by knowingly distributing a counterfeit image without the depicted person's knowledge and knowing it will cause emotional harm or distress to the depicted person. Felony and misdemeanor penalties apply. (Utah Code § 76-5b-205 (2024).)

Virginia expanded its revenge porn law to include nude or partially nude images "created by any means whatsoever" and distributed without authorization. The law applies only if the person maliciously shares or sells the image with the intent to coerce, harass, or intimidate the depicted person. (Va. Code § 18.2-386.2 (2024).)

Civil Remedies for Deepfake Pornography

Whether or not criminal remedies apply, a victim may be able to seek damages in a civil lawsuit. Possible civil causes of action include invasion of privacy, intentional infliction of emotional distress, or actions specific to nonconsensual distribution of deepfake porn.

Several states—including California, Florida, Illinois, Minnesota, and South Dakota—have laws that allow victims whose images have been unlawfully used in deepfake porn to seek money damages and ask for court orders directing defendants to take down materials from websites and destroy all copies.

(Cal. Civ. Code § 1708.85; Fla. Stat. § 836.13; 740 Ill. Comp. Stat. § 190/5 and following; Minn. Stat. § 604.32; S.D. Codified Laws § 22-21-4 (2024).)

Talk to a Lawyer

Which law applies (if any) depends on several factors, such as how the images were created, obtained, or distributed; what the defendant did with the images; what the defendant's intent was in making, sharing, or possessing the images; and how old the victim in the image was.

States haven't taken a uniform approach to this issue, and the laws and penalties regarding deepfake porn vary considerably. If you have questions regarding crimes associated with deepfake pornography, talk to a criminal defense lawyer. For questions regarding civil remedies (money damages, removal options), a personal injury lawyer may be able to assist you.
