A Privacy Torts Solution to Postmortem Deepfakes.

Author: Olivia Wall

Introduction

In 2021, Roadrunner, a documentary about the late celebrity chef Anthony Bourdain, became embroiled in controversy for using AI-generated voice technology to create a voiceover of Bourdain reading an email he wrote but never spoke aloud. (1) In response to the director's claim that he had received a blessing from Bourdain's loved ones, Bourdain's ex-wife tweeted, "I certainly was NOT the one who said Tony would have been cool with that." (2) The Roadrunner controversy highlights the murky ethics of postmortem deepfakes. There is the question of whether the Bourdain deepfake caused any harm, and if it did, whether it harmed the deceased chef or his surviving ex-wife. Bourdain may never have said the words aloud, but they were his words. Even so, Bourdain's ex-wife was upset on his behalf because he did not consent to the deepfake. At its core, the question is what care the living owe to the dead in a technological age where deepfakes can depict the dead doing or saying things they never did.

The answer to this question cannot be found in current privacy law, which does not provide redress for dignitary harms caused by deepfakes of the deceased. As it stands, deepfakes are primarily regulated by state laws through the right of publicity. (3) However, because the right of publicity only concerns commercial injury, redress is limited to persons who can demonstrate commercial value in their image or likeness (i.e., celebrities). At the same time, deepfakes have become increasingly realistic, (4) easy to create, (5) and shareable with the entire world for free via social media. (6) The danger of deepfakes has crept into the lives of average persons, yet the law unfairly limits legal redress to celebrities.

While the law could provide redress for mental and emotional harms caused by deepfakes, privacy torts exclude the deceased. Whereas the right of publicity is grounded in property rights, which makes the right descendible after death, privacy torts are grounded in mental or emotional harms, which do not survive death. (7) What's more, these mental and emotional injuries do not include a conception of human dignity, which goes to the heart of the injury caused by postmortem deepfakes--a violation of identity, personhood, and control over one's legacy. (8) This Note addresses possible solutions to the novel issue of redressability for postmortem deepfakes based on existing privacy torts, arguing that a new exception is needed to protect the memory of the deceased, on behalf of both the living and the dead.

  1. The Problem of Postmortem Deepfakes

    In December 2017, the popular tech blog Motherboard detailed a new phenomenon sweeping Reddit: deepfakes. (9) An eponymous Redditor (10) posted a pornographic video with Gal Gadot's face superimposed on the performer's body. (11) Soon, the subreddit r/deepfakes was created, amassing 90,000 followers (12) and hosting similar pornographic videos with face-swapped female celebrities such as Taylor Swift, Daisy Ridley, and Maisie Williams. (13) Another Redditor, user deepfakeapp, even developed an online application called FakeApp to help users create deepfake videos from personal data sets using deepfakeapp's algorithm. (14)

    While the "fake" in deepfake betrays these videos' inauthenticity, the technology's purpose is to appear real. The "deep" component of deepfake references "deep learning," the machine learning phenomenon wherein computers utilize layered neural networks to enhance their own algorithms. (15) Through interconnected processes in the computer's architecture that mimic neural connections in the human brain, a computer is able to improve its own performance. (16) Artificial intelligence programs that produce deepfakes are called Generative Adversarial Networks (GANs). (17) Ian Goodfellow and other researchers at the University of Montreal developed GANs by using two complementary algorithms. (18) The first algorithm is the "generator," responsible for creating content. (19) The second algorithm is the "discriminator," tasked with scrutinizing the content for authenticity. (20) If the discriminator can tell the content is fake, the generator refines the content until it appears sufficiently real to fool the discriminator. (21) This machine learning trains the computer to seamlessly alter each frame of a video (22) so that it looks real upon playback. (23) The end result is a deepfake: a depiction of something that never actually happened, but looks like it did.
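    For readers curious how the adversarial loop works mechanically, the following is a toy numeric sketch, in Python, of the generator/discriminator game described above. The one-parameter "generator" and single-input logistic "discriminator" are deliberately simplistic stand-ins for the deep neural networks real GANs use; all names and numeric values here are illustrative assumptions, not any particular deepfake system.

```python
import numpy as np

# Toy GAN: real "data" are numbers drawn near 4.0; the generator starts
# near 0.0 and learns to produce numbers the discriminator cannot
# distinguish from the real ones.
rng = np.random.default_rng(0)
REAL_MEAN, NOISE_STD = 4.0, 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b = 0.0, 0.0   # discriminator: D(x) = sigmoid(w*x + b)
mu = 0.0          # generator: produces mu + noise; mu is its only parameter
lr_d, lr_g, batch = 0.1, 0.05, 32

for step in range(2000):
    real = rng.normal(REAL_MEAN, NOISE_STD, batch)
    fake = mu + rng.normal(0.0, NOISE_STD, batch)

    # Discriminator update: push D(real) toward 1 and D(fake) toward 0
    # (gradient descent on binary cross-entropy loss).
    d_real, d_fake = sigmoid(w * real + b), sigmoid(w * fake + b)
    grad_w = np.mean((d_real - 1) * real) + np.mean(d_fake * fake)
    grad_b = np.mean(d_real - 1) + np.mean(d_fake)
    w -= lr_d * grad_w
    b -= lr_d * grad_b

    # Generator update: shift mu so the discriminator scores fakes as real
    # (gradient of -log D(fake) with respect to mu).
    d_fake = sigmoid(w * fake + b)
    grad_mu = np.mean(-(1 - d_fake) * w)
    mu -= lr_g * grad_mu

print(f"generator mean after training: {mu:.2f}")
```

    After training, the generator's output distribution sits near the real data's mean: the discriminator's feedback has taught the generator to produce "fakes" the discriminator can no longer reliably reject, which is the same dynamic that, at vastly larger scale, yields photorealistic deepfake frames.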

    Indeed, rapid advancements in deepfake algorithms have made deepfakes effectively indistinguishable from reality. One recent study found that participants discerned real faces from fake ones at a rate no more accurate than a coin toss, and even rated GAN-generated faces as more trustworthy than real human faces. (24) Another study found that not only do people fail to reliably detect deepfakes, but they also show a bias toward finding deepfakes authentic while simultaneously overestimating their own ability to detect them. (25)

    While the deepfake landscape is dominated by pornography, deepfakes have invaded many other cultural contexts. (26) The popular app TikTok is a hotbed for celebrity deepfakes (27) and deepfake memes. (28) Deepfakes have also been used for dueling political purposes: sometimes spreading awareness of the dangers of deepfakes for democracy, (29) while at other times interfering with democratic elections. (30) Recently, a deepfake of Ukrainian President Volodymyr Zelenskyy telling his soldiers to surrender to invading Russian forces circulated on social media. (31) Audio deepfakes have also been used to carry out two large bank heists. (32)

    Deepfake technology has even been used to resurrect the dead. MIT professors produced a deepfake depicting an alternate history wherein Richard Nixon delivered his "In the Event of Moon Disaster" speech following the failure of the 1969 Apollo moon landing. (33) One genealogy company is offering consumers a new feature called "Deep Nostalgia," which is able to animate photographs of dead relatives within seconds. (34) Public figures have used deepfake videos of deceased loved ones to create personal gifts, like one created by musician Kanye West to give to his then-wife, media personality Kim Kardashian, for her fortieth birthday. (35) Deepfakes of dead relatives have also been utilized to promote political causes. Parents of 2018 Parkland shooting victim Joaquin Oliver created a deepfake of their son urging voters to support gun control in the 2020 election: "I mean, vote for me. Because I can't." (36)

    However, deepfakes of the deceased have been employed most prominently in the entertainment industry through retroactive recreation. (37) Retroactive recreation generally depicts actors who die mid-production by superimposing old footage of the actor's face onto a body double through CGI during post-production. (38) A deepfake of James Dean, who died in 1955, stars in the upcoming film Finding Jack. (39) Peter Cushing, who died in 1994, was featured via deepfake in 2016's Rogue One: A Star Wars Story. (40) Still, the use of deepfakes of dead actors in films remains controversial. Some film critics lambasted Disney's use of deepfake technology to feature Peter Cushing, calling it "a digital indignity" (41) and "jarringly discomfiting." (42)

    Posthumous deepfakes are becoming increasingly prevalent not only in entertainment but also in the lives of ordinary people. This new technology is becoming more difficult to distinguish from reality, thus blurring the lines between truths and falsehoods for those who no longer have a voice to defend themselves. Because one's legacy is an important part of identity and personhood, the law should be able to provide redress for injuries inflicted by postmortem deepfakes.

    Currently, a deceased individual's estate may only litigate commercial injuries through the right of publicity. Privacy law generally permits actionability for three types of injuries: commercial harm in the right of publicity, mental and emotional harm in the privacy torts, and dignitary injury in the European model. (43) There is no legal redress for postmortem mental, emotional, or dignitary injuries because only living persons experience those things. (44) Relatives are also foreclosed from suing for invasions of privacy on behalf of the deceased. (45) Even though a loved one may be humiliated, aghast, or distraught after viewing a deepfake of the deceased, they have no actionable claim. (46) Additionally, loved ones may not sue to restore the tarnished reputation of a dead relative. (47) This is called the no relational right rule, which states that privacy suits may not be asserted by proxy. (48)

    However, the no relational right rule has a few exceptions. Relatives may sue for disclosure of photos of the deceased's body due to the outrageousness of such conduct. (49) Courts have emphasized that this exception is meant to protect the privacy of living relatives rather than that of the deceased. (50) Florida has also conferred statutory protection against the release of autopsy photos to non-family members without good cause. (51) Because the exceptions to the no relational right rule are extremely limited, most postmortem would-be privacy claims are creatively repackaged as right of publicity claims in order to afford relief to the deceased's relatives. (52) In the same way, postmortem deepfake litigation has been shoehorned into the right of publicity because privacy interests are unfairly limited to living individuals.

  2. A Privacy Torts Approach to Postmortem Deepfakes

    1. The Right of Publicity

      Current legal protections surrounding deepfakes largely come from the right of publicity. The right of publicity at common law protects against another's appropriation for his own benefit of one's likeness without consent. (53) The Third Restatement of Unfair...
