(credit: Getty Images)
On Tuesday, the UK government announced a new law targeting the creation of AI-generated sexually explicit deepfake images. Under the legislation, which has not yet been passed, offenders would face prosecution and an unlimited fine even if they do not widely share the images but create them with the intent to distress the victim. The government positions the law as part of a broader effort to strengthen legal protections for women.
Over the past decade, the rise of deep learning image synthesis technology has made it increasingly easy for people with a consumer PC to create misleading pornography by swapping the faces of performers with those of someone who has not consented to the act. That practice spawned the term "deepfake" around 2017, named after a Reddit user called "deepfakes" who shared AI-faked porn on the service. Since then, the term has grown to encompass entirely new images and video synthesized from scratch by neural networks trained on images of the victim.
The problem isn't unique to the UK. In March, deepfake nudes of female middle school classmates in Florida led to charges against two boys, ages 13 and 14. The rise of open source image synthesis models like Stable Diffusion since 2022 has increased the urgency among US regulators to contain (or at least punish) the creation of non-consensual deepfakes. The UK government is on a similar mission.