By now, we are all familiar with the term “deepfake.” If you are not, the Oxford English Dictionary defines a “deepfake” as a “video that has been digitally manipulated to replace one person’s likeness convincingly with that of another, often used maliciously to show someone doing something that he or she did not do.”

Over a year ago, we wrote about the dangers of deepfake content as it pertains to social media and some legitimate media. At that time, we believed that the best defense against deepfakes was “digital literacy” and “good old skepticism” about the sources you rely on. That is still true. Yet the technology has improved so rapidly that even healthy skepticism about your sources may no longer be enough; you can be forgiven for being tricked into believing something is real when it is actually fake.

The entertainment industry is particularly frightened of the consequences of deepfake technology, and examples abound. In 2021, a Tom Cruise impersonator and a Belgian digital AI artist generated realistic videos of “Tom Cruise” partaking in “quirky” activities. A deepfake of Harry Styles singing went viral on TikTok the same year. These instances may be benign, but truth be told, an estimated 95% of deepfake videos are pornographic in nature, often made in the likeness of a celebrity. Take, for example, the deepfake scandal involving Taylor Swift earlier this year. The Guardian reported:

For almost a whole day last week, deepfake pornographic images of Taylor Swift rapidly spread through X. The social media platform, formerly Twitter, was so slow to react that one image racked up 47m views before it was taken down. It was largely Swift’s fans who mobilised and mass-reported the images, and there was a sense of public anger, with even the White House calling it “alarming”. X eventually removed the images and blocked searches to the pop star’s name on Sunday evening. 

The problem in South Korea is so severe that lawmakers there recently referred to it as a “national emergency.” Rightly so: the United Nations has recognized “image-based sexual abuse” as a form of violence against women and girls. But what are governments, in particular our government, doing about it?

There is no comprehensive federal regulation of deepfake technology in the United States, although the subject has been on Congress’s radar, with a number of proposed laws including the Deepfake Report Act of 2019, the DEEPFAKES Accountability Act, and the Protecting Consumers from Deceptive AI Act. Most recently, the Senate has been considering a proposed bill referred to as the NO FAKES Act. More about that later.

Several states including Texas, Florida, Louisiana, South Dakota, New Mexico, Indiana, Washington, Tennessee, Oregon, and Mississippi have also pursued regulation of deepfakes. References to the legislation may be found here.

As mentioned, the US Senate is considering a new bipartisan bill on digital replicas, known as the “Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2024” or the “NO FAKES Act of 2024.” Sens. Chris Coons (D-DE), Marsha Blackburn (R-TN), Amy Klobuchar (D-MN), and Thom Tillis (R-NC) sponsored the bill. Let us take a brief look at this legislation, which enjoys huge support from the film and recording industries.

First, it’s worth looking at how the bill defines “digital replica”: 

The term ”digital replica” means a newly-created, computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual that (A) is embodied in a sound recording, image, audiovisual work, including an audio-visual work that does not have any accompanying sounds, or transmission- (i) in which the actual individual did not actually perform or appear; or (ii) that is a version of a sound recording, image, or audiovisual work in which the actual individual did perform or appear, in which the fundamental character of the performance or appearance has been materially altered; and does not include the electronic reproduction, use of a sample of one sound recording or audiovisual work into another, remixing, mastering, or digital remastering of a sound recording or audiovisual work authorized by the copyright holder.

In other words, the bill covers any unauthorized video or audio, altered or not, that falsely depicts an individual or her voice. The bill, in effect, creates a federal right to control one’s likeness. While the NO FAKES Act prohibits individuals from assigning the digital rights in their likeness (i.e., giving them away outright), it permits legitimate use of deepfake technology by allowing an individual to license those rights. Upon an individual’s death, the rights transfer to the deceased’s estate but expire at the end of a 10-year period, extendable for another 5 years.

Interestingly, the NO FAKES Act also provides “fair use” exceptions, much like the Copyright Act. Thus, digital replicas may be used without authorization for news reporting, documentaries, commentary, or “fleeting or negligible usage.” There is also a “take-down” system under which “online service” providers may escape liability if they remove access to unauthorized digital replicas upon receipt of notice.

Finally, the bill creates a safe harbor for online service providers by shielding them from secondary liability arising from unauthorized deepfakes created by users on their systems – much like the Digital Millennium Copyright Act (DMCA) insulates internet providers. One commentator has expressed concern over this aspect of the bill because it doesn’t impose even minimal requirements on service providers to shield them from liability:

it’s disappointing that the drafters haven’t imposed at least minimal requirements to enjoy the safe harbor, i.e. that companies deploy measures to prevent the generation of digital replicas for whom the companies receive notices as well as individuals listed on the registry that the Copyright Office will maintain.

The bill was formally introduced in late July, so it will be interesting to see how it changes during the legislative process. If you are interested in reviewing the text of the bill, here is a link for your enjoyment. We will continue to follow its progress and report on news about this important legislation.

— Adam G. Garson, Esq.