Abstract
This Note explores the distribution of deepfake pornography and the complicated legal web it has created – pitting internet free speech, existing law, and human dignity against one another. Part I provides a brief overview of the machine-learning technology used to create deepfake pornography. Part II discusses the current legal environment that protects the distribution of deepfake pornography; in particular, it explains the relationship between free speech, pornography, obscenity, Section 230, and the lack of federal regulation in the context of deepfake pornography. Part III explores the impact that the distribution of deepfake pornography has on victims. Part IV argues that it is the distribution of deepfake pornography that causes harm and analyzes why existing law does little to prevent that distribution. Part IV then proposes that ISPs, as the mechanisms of distribution, should be held liable for distributing deepfake pornography, and advocates for a combined solution: a federal criminal law prohibiting the distribution of deepfake pornography and an amendment to Section 230 excluding immunity for ISPs who knowingly distribute it. Such a solution would eradicate the distribution of deepfake pornography without censoring lawful and socially valuable deepfakes, such as parodies and satire.
How to Cite:
Shelby Akerley, Let's Talk About (Fake) Sex Baby: A Deep Dive Into the Distributive Harms of Deepfake Pornography, 4 Ariz. L. J. Emerging Tech., no. 6, 2021, https://doi.org/10.2458/azlawjet.5505