You're Only Mostly Dead: Protecting Your Digital Ghost from Unauthorized Resurrection

Author: Rebecca J. Roberts

TABLE OF CONTENTS

I. INTRODUCTION
II. BACKGROUND
   A. Artificial Intelligence Capabilities Have Advanced to Producing Lifelike Synthetic Media, like Digital Cloning
   B. Digital Cloning Is Not Limited to the Living
   C. Current Law Does Not Provide Adequate Support Against the Unauthorized Creation of Digital Clones
      1. Privacy Law
      2. Trademark and Copyright Law
      3. Criminal Law
III. ANALYSIS
   A. Lackluster Solutions to Curb Deepfakes and Digital Cloning
   B. The Solution to Unauthorized Post-Mortem Digital Cloning Uses the Legal Mechanisms Controlling Property Through Probate Law
   C. Digital Assets Are Already Included in Existing Probate Law
   D. The Media Used to Create Digital Clones Should Be Considered Digital Assets, and RUFADAA Should Be Expanded to Protect Against Their Unauthorized Use
IV. CONCLUSION

I. INTRODUCTION

"You married the most, most, most, most, most genius man in the whole world, Kanye West," said the Robert Kardashian hologram custom ordered by Kanye West. (1) In 2020, a production company holographically resurrected the deceased Robert Kardashian using artificial intelligence. (2) This lifelike hologram was programmed to say and do things that the real Robert Kardashian never said or did while still alive--including high praise of his daughter's then-husband, Kanye West, who purchased the hologram for his then-wife's birthday. (3)

Artificial intelligence ("AI") is a constantly evolving field that plays a substantial role in the manufacture of synthetic media. (4) As AI technology improves and expands, advanced forms of synthetic media known as "digital clones" and "deepfakes" have started to emerge. (5) This synthetic media is created from photos, videos, and audio of a person and can then be programmed to do and say anything the programmer wishes. (6) Digital clones manifest as chatbots, audio clips, videos, holograms, and other varieties of audio-visual media. (7) Their production ranges from glitchy videos that individuals can create for free on an easily accessible app to highly expensive holograms like that of Robert Kardashian. (8) In some cases, these digital clones are so incredibly lifelike that they trick viewers into believing they are seeing something truly authentic when they are in fact AI-created synthetic media. (9)

Due to the high volume of digital media created during one's lifetime, (10) digital clones can be produced post-mortem. (11) Digital cloning technology allows for the creation of holograms, audio messages, videos, and other media depicting a dead person doing or saying something they never said or did while still alive. (12) This technology can be useful in the entertainment industry, for example, where it provides opportunities to reanimate actors who died before their film finished shooting. (13) However, synthetic media also presents several ethical concerns. After someone dies, a video could emerge of their digital clone saying something deplorable that goes against everything they believed in while alive. If such synthetic media is truly indistinguishable from authentic media, a person's voice, life, and legacy are put at risk, and nothing can be done because the person is no longer alive to refute it.

Through the years, courts have consistently held that people have no personal rights after death (14) and that reputation and dignity are not maintained after death. (15) While some states have post-mortem privacy laws protecting against the commercial use of a deceased celebrity's likeness, (16) these laws would not protect private figures from unauthorized digital clone creation and use, nor would they protect against noncommercial unauthorized creation and use. Because current legislation and common law are inconsistent and almost entirely hypothetical, and because they protect against only certain situations in which post-mortem digital clones may be created and used, this issue requires a novel approach. (17) Through probate law and estate planning, the deceased have an atypical right to control how their property is distributed and used. (18) This Note will argue that there should be an explicit safeguard within probate law protecting against the unauthorized creation and use of a deceased person's digital clone.

The Background section will explain how artificial intelligence has enabled the production of synthetic media depicting real people, as well as the ethical and legal concerns that arise from both existing and impending post-mortem synthetic media technology. This section will also assess untested solutions in different fields of law that could potentially protect against digital cloning and synthetic media. Post-mortem privacy rights are extended only to celebrities under existing privacy law. (19) Although there may be copyrightable and trademarkable elements within the field of artificial intelligence, there are no proven or guaranteed protections against unauthorized digital cloning. (20) Criminal law is beginning to prohibit certain aspects of deepfake technology, but such laws do not prohibit unauthorized use unless there is a severe and tangible harm. (21)

The Analysis will compare the benefits and potential harms that could come with the growing prevalence of post-mortem digital cloning technology, and it will discuss the successes and failures of attempted claims against it. While there are some possible solutions for victims of unauthorized digital cloning, legislators have not been able to keep pace with the growing prevalence of this technology, and several gaps in protection remain. Further, post-mortem rights are practically nonexistent in every field of law except probate law. Current standards within probate law regarding digital assets and digital estate planning do not include specific protections against post-mortem digital cloning, but they could be extended to do so. The final section of the Analysis will present estate planning and probate law as an innovative way to preempt unauthorized post-mortem digital clones. Requiring explicit, affirmative permission from a decedent is the best way to protect a deceased person's estate from the unauthorized creation and use of post-mortem digital clones.

II. BACKGROUND

   A. Artificial Intelligence Capabilities Have Advanced to Producing Lifelike Synthetic Media, like Digital Cloning

The term "artificial intelligence" was first used in the 1950s in an effort to describe the process of teaching computers to understand and recreate human reasoning. (22) After many years of development, AI now has a hand in much of society's day-to-day life, from vehicles to phones to Google Home hubs. (23) While there are certainly a wide variety of benefits attributable to the prevalence of AI, its fast-growing adoption also presents a series of concerns for the future. (24) AI uses algorithmic technology to learn our routines and interests, which allows for personalized advertising and lifestyle convenience. (25) However, with such access to personal data, there are concerns about privacy and how daily interactions with AI might be used. (26) Further, as AI capabilities increase, there is concern that, in the wrong hands, the technology may be used in more malicious ways. (27)

Synthetic media is content created through the use of AI, which employs algorithmic deep learning technology to create incredibly lifelike artificial media. (28) This technology can modify or manipulate existing photos and videos of a person by superimposing them onto other existing media, creating what is colloquially known as a "deepfake" or "digital clone." (29) By exchanging aspects of one piece of existing media with another, a person can create hyper-realistic media depicting something that does not actually exist. (30) Popular deepfake media shows politicians, celebrities, and even private citizens doing or saying something they have never done or said. (31) Similarly, there also exists AI technology that takes existing audio clips of a person and uses software to recreate that person's voice saying anything the programmer wants. (32) Throughout this Note, the terms "synthetic media," "deepfakes," and "digital clones" will be used interchangeably to refer to any kind of AI-generated media mimicking a real person that has been created using the person's preexisting media outputs.

People's lives and reputations are at stake now that such potentially deceptive technology exists and could leave the public with a false impression of someone's behavior. (33) Political figures could employ deepfake technology to depict opposing parties doing or saying something that is not congruent with their true political or moral standpoints. (34) Courts have also recently become aware that a more robust system of authentication may be needed for certain pieces of evidence in order to admit them as reliable. (35) Audio files, photos, and videos can no longer be taken at face value. (36)

State legislators have recently begun analyzing these arising issues and enacting new legislation to regulate the effects of deepfakes and artificial intelligence, primarily related to election interference and pornography. (37) Congress also recently voted to require that the Department of Homeland Security issue annual reports for the next five years on potential harms that may arise from the increasing use of deepfake technology. (38) In 2021, the Department released an infographic detailing possible threats and scenarios that could arise from such synthetic media. (39) Even though the concept of AI has been around since the 1950s and plays a prevalent role in everyday life, there is still very little legislative or judicial guidance on how to protect the public from the many harms it could potentially bring about.

   B. Digital Cloning Is Not Limited to the Living

      Films, television shows, and books have predicted the idea of...
