Legal Scholars Dive into Implications of Deep Fakes
"Imagine the night before an IPO, a deep fake video comes out of the CEO soliciting a child prostitute or doing drugs," professor and privacy expert Danielle Citron, JD, said to a full house in the school's Ceremonial Moot Courtroom.
"There goes the IPO, and the faith of the marketplace for the CEO is wrecked," she continued.
Citron was the keynote speaker at the Maryland Law Review's spring 2019 symposium.
"Privacy Implications of Deep Fakes" panelists at the Maryland Law Review Spring 2019 Symposium (bottom row l-r) Suzanne Dunn, University of Ottawa; Mary Anne Franks, University of Miami School of Law; Ari Waldman, New York Law School. (Top row l-r) Danielle Citron, Maryland Carey Law; Woodrow Hartzog, Northeastern University School of Law; Jessica Silbey, Northeastern University School of Law.
If you've never heard of a deep fake, you will, said Citron, who noted that "we are in a moment of pervasive disinformation."
For the uninitiated, the term deep fake refers to the digital manipulation of audio, images, or video to make it appear, in a highly realistic way, that a person did or said something they never did. The best deep fakes are undetectable and therefore hard to debunk. Well-known examples feature celebrities such as Gal Gadot and Emma Watson inserted into deep fake pornographic videos.
The latest technology, generative adversarial networks (GANs), adds a frightening level of sophistication, using machine learning techniques to produce fakes that are incredibly hard to detect.
The 鈥淒eep Fake鈥 symposium was inspired in part by a forthcoming article in California Law Review co-authored by Citron and Robert Chesney, JD, a professor at the University of Texas School of Law. The article provides a prescient first assessment of the causes and consequences of deep fake technology.
The event attracted some of the best minds in the legal and technology communities to participate in spirited panel discussions, including "The Privacy Implications of Deep Fakes," which featured Woodrow Hartzog, JD, and Jessica Silbey, JD, both of Northeastern University School of Law; Mary Anne Franks, JD, DPhil, MPhil, University of Miami School of Law; Ari Waldman, JD, PhD, New York Law School; and Suzanne Dunn, JD, PhD, University of Ottawa.
Panelists for "The Role of Intellectual Property, Platforms and Free Expression Concerns, and National Security Implications" discussion included Stacey Dogan, JD, Boston University School of Law; Olivier Sylvain, JD, PhD, Fordham University School of Law; Kate Klonick, JD, PhD, St. John's University Law School; and Thomas Kadri, JD, MA, Yale Law School. The panel was moderated by Carey Law professor David Gray, JD, PhD.
The third panel tackled the national security implications of deep fakes and featured Benjamin Wittes, a senior fellow at the Brookings Institution; Quinta Jurecic, managing editor, Lawfare; and Alan Rozenshtein, JD, University of Minnesota Law School.
The goal of the event, according to Citron, was to talk about the harm that deep fakes impose on individuals and society and then to "puzzle through together the modest way that law can intervene."
And just how can the law intervene? "The law is a modest and blunt tool," Citron admitted, "but we have to try."
She ticked off areas of law that could be invoked in deep fake cases, including criminal statutes related to impersonation of government officials and fraud statutes. Citron and Chesney embrace the option of changing Section 230 of the Communications Decency Act to make immunity conditional for internet platforms. Section 230 currently "provides immunity from liability for providers and users of an 'interactive computer service' who publish information provided by third-party users," according to the legislation.
"Right now, it's a free pass. It provides no incentives for platforms to protect the vulnerable," said Citron, who added there are platforms whose business model is based on abuse and destruction. "They make money off of eyeballs. They make money when stuff goes viral."
"At the end of the day, Bobby and I don't have clear answers," said Citron, referring to Chesney, her deep fake paper co-author. "That's why we wanted to bring the smartest people together in one room to talk about the privacy, the free speech, the IP, the national security implications of these images and video and audio.
"I think the lesson is the law moves like a pendulum," she said, swinging her arm back and forth. "We overreact, we underreact, and hopefully we end up somewhere in the middle."