UK Criminalizes Creation of Explicit Deepfake Pornography Without Consent

The UK government plans to criminalize the creation of non-consensual sexually explicit deepfake images, a significant step in protecting individuals from this harmful technology's growing threat.


The United Kingdom government has announced plans to make the creation of sexually explicit deepfake images and videos without consent a criminal offense. Under the proposed amendment to the Criminal Justice Bill, anyone who creates such content will face an unlimited fine and a criminal record, even if they do not intend to share the material.

The move comes as part of the government's efforts to address the growing problem of deepfake pornography, which has been described as a "gross violation of autonomy and privacy" that can cause "enormous harm." The new offence will cover images depicting adults, since creating sexually explicit deepfake images of minors is already a crime.

Deepfake technology, which uses artificial intelligence to create hyper-realistic fake images and videos, has become increasingly prevalent in recent years. Celebrities like Taylor Swift and Cathy Newman have been targeted by deepfake pornography, with the latter describing the experience as "violating" and "incredibly invasive."

Why this matters: The criminalization of non-consensual deepfake pornography is a significant step in protecting individuals, particularly women, from the harmful effects of this technology. As deepfakes become more sophisticated and accessible, it is vital for governments to establish legal frameworks to deter their creation and distribution.

The announcement has been welcomed by politicians, media figures, and victims of deepfake footage. The Shadow Home Secretary stated that creating deepfake pornography is a serious invasion of privacy and autonomy that should never be allowed. The law is also part of the government's wider efforts to address violence against women and girls, with the Minister for Victims and Safeguarding emphasizing that the creation of such images is another way people "seek to degrade and dehumanize others — especially women."

While the law has been praised as a necessary measure to combat the growing threat of deepfake abuse, some concerns have been raised about its enforcement. Experts have warned that police may lack the capacity, training, and resources to enforce the new offence effectively, and that once such content is created, perpetrators can easily spread it online, compounding the harm to victims.

The UK government's decision to criminalize the creation of non-consensual sexually explicit deepfakes sends a clear message that this activity is unacceptable and often misogynistic. As Laura Farris, the Conservative MP who proposed the amendment, stated, "The creation of deepfake images is another way in which women's consent is disregarded, and their privacy and autonomy are violated." The new law aims to protect individuals from the harmful effects of this technology and hold those who create such content accountable for their actions.

Key Takeaways

  • UK to criminalize creation of non-consensual deepfake porn, with unlimited fines.
  • Law aims to address growing problem of deepfake pornography, a "gross violation of privacy".
  • Deepfake technology used to create fake images/videos of celebrities like Taylor Swift.
  • Law welcomed as necessary to combat deepfake abuse, but concerns raised about enforcement.
  • UK government sees this as a way to address violence against women and girls.