A Resurgence of ELVIS: Addressing the Deepfake Dilemma

The article sheds light on the Ensuring Likeness, Voice, and Image Security Act of 2024 enacted by the State of Tennessee, USA and further addresses personality rights in India.
Singh & Singh - Vrinda Bagaria

In March 2024, the State of Tennessee in the United States of America enacted the Ensuring Likeness, Voice, and Image Security Act of 2024 [State of Tennessee, Public Chapter No. 558, House Bill No. 2091, approved on 21.03.2024.] to amend the Tennessee State legislation relating to personality rights [Tennessee Code Annotated, Title 39, Chapter 14, Part 1 and Title 47]. This is colloquially known as the ELVIS Act. The Act aims to protect the personality rights of performers against the misuse of their voice and images. It will come into force in the State of Tennessee on July 1, 2024.

With the global boom in Artificial Intelligence (AI), there has been an immense rise in the creation of “soundalikes” and “deepfakes” using popular AI tools, which are easily accessible over the internet free of cost or at minimal cost.

A “soundalike”, as the name suggests, refers to a recording intended to imitate the voice or style of a popular artist or the sound of an existing recording. A “deepfake” is content in which a person’s image is modified to make them appear as another person.

“Soundalikes” and “deepfakes” are primarily used as satire by content creators to generate viewership over various social media platforms. However, they can have huge personal, social and commercial ramifications, especially for the individuals who are the targets of such content. This kind of content directly affects the personality rights of the individual and can even be used as a tool of misrepresentation by mischievous users.

All of the above prompted the enactment of the ELVIS Act. The ELVIS Act amends and substitutes the “Personal Rights Protection Act of 1984” (PRPA). The PRPA was enacted after the death of the popular artist Elvis Presley to protect the personality rights of artists posthumously, which were not protected at the time. The abbreviation of the 2024 Act is, thus, an homage to Elvis Presley.

Under the PRPA, the personality rights of an individual, such as their image and likeness, were protected. The ELVIS Act expands the scope of the PRPA and protects the voice of an individual as well. Thus, any person who uses another’s name, photograph, voice or likeness without their consent is now liable for infringement under the ELVIS Act.

An interesting feature of the Act is that, under Section 8, it allows all persons who have been given exclusive access to an artist’s sound recordings to bring an action under the ELVIS Act for any unauthorised use. Thus, record labels can also bring an action for infringement under this Act on behalf of the artist. This ensures that sound recordings cannot be used without consent, even posthumously.

The ELVIS Act protects fair use of an artist’s image, voice and likeness, and specifically provides that use for the purpose of satire or parody, as well as use in a fleeting or incidental manner, is exempt from liability under the Act.

In addition to the person who uses the information, Section 6 of the Act also imposes liability for infringement on:

i. A person who “makes available to the public” the photograph, voice or likeness of an individual with the knowledge that such use was not authorised;

ii. A person who “makes available” any technology “the primary purpose or function” of which is the “production of a particular identifiable individual’s photograph, voice, or likeness” with the knowledge that such use was not authorised.

However, it remains to be seen what threshold must be met before knowledge can be imputed to a person. A question thus arises as to whether entities which make available technology to create such content can be made liable under the Act. Guidance in this respect can be taken from Section 11 of the ELVIS Act, which provides that if the owners or employees of any platform “reasonably should have known of the unauthorised use” of the individual’s photograph, voice or likeness, then they can be made liable.

It appears from the above that the ELVIS Act seeks to impose liability even on platforms where the content is hosted and/ or on AI service providers whose software/ technologies are used to create the content. If it can be proved that the owners/ employees of these platforms had reasonable knowledge that their platforms were being put to unauthorised use, they cannot be exempt from liability under the legislation.

How the ELVIS Act will play out, and whether it is a welcome first step towards regulating fake content and misinformation spread through the use of AI tools or whether it will amount to over-regulation, remains to be seen. The scope of the provisions is definitely interesting.

Personality Rights in India

In India, while there is no single specific legislation for protecting personality rights, such rights are enforced under the provisions of the Trade Marks Act, 1999 and the Copyright Act, 1957.

With the growth of technology, there have been several incidents where celebrities have fallen prey to deepfake content. The Delhi High Court, addressing the issue in Anil Kapoor v. Simply Life India & Ors., by order dated September 20, 2023, prohibited the use of Mr. Anil Kapoor’s image, voice or likeness through AI tools, face-morphing technology, etc., for creating any videos for commercial gain.

Taking note of incidents of mass circulation of deepfake content targeting actors such as Ms. Rashmika Mandanna, cricketers such as Mr. Virat Kohli and Mr. Sachin Tendulkar, and himself, Prime Minister Mr. Narendra Modi also issued a statement warning against the harm which can be caused through deepfake content.

In November 2023, the Central government issued an advisory to social media entities, directing them to conduct due diligence to identify the use of their platforms for misinformation and deepfakes. They were directed to remove all reported content within 36 hours of receipt of such reporting. Failure to take expeditious action could result in losing the intermediary protection under Section 79 of the Information Technology Act, 2000. Additionally, the intermediary and its users could be prosecuted under the IT Act and other criminal laws for circulation of misleading content.

Again in December 2023, the Indian government issued an advisory to social media websites to caution their users against the spread of deepfake content, directing them to make users aware of what amounts to prohibited content and to report periodically on how the issue has been addressed on their platforms.

Despite the above, there was little curb on deepfake content being circulated online. Concerned about the effect of deepfake content on the public against the backdrop of the general elections, the Indian government, in addition to the advisory issued in December 2023, on March 15, 2024 directed intermediaries and social media platforms to also adhere to the following obligations:

a. Every intermediary is to ensure that the use of generative-AI software/ algorithms does not permit hosting, displaying, modifying, publishing, transmitting, etc. of any unlawful content, as discussed in Rule 3(1)(b) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“Intermediary Guidelines”).

b. Intermediaries were directed to ensure that their platforms, with or without the use of AI, do not permit any bias/ discrimination or otherwise threaten the integrity of the electoral process.

c. Specifically for intermediaries which permitted the synthetic creation of content, it was directed that any such data be labelled or embedded with metadata or an identifier so that the intermediary from which it originated can be identified. In addition, if any changes are made to such synthetic data, the intermediary was directed to enable identification of the user/ computer resource which effected the change.

d. Failure of intermediaries to adhere to the advisory could result in penal consequences.

However, it is not clear how effective the implementation of the advisories has been, and deepfake and AI-generated content remains largely uncurbed. This could primarily be because the advisories were non-binding and voluntary. This is also clear from the fact that, after the issuance of the advisory in March 2024, the government clarified that the advisory primarily targeted major platforms, exempting AI start-ups from strict adherence.

India is in no way immune to the spread of misinformation, fake content, and misrepresentation through the use of various AI tools. Where and how liability for the same can be fixed is yet to be determined, as the sector is largely unregulated. Perhaps with some regulation in place, the permissible limits of what can and cannot be circulated will become clear.

Conclusion

The ELVIS Act in the USA is a first-of-its-kind legislation which has been passed to address the growing influence of AI-generated content and the regulatory concerns relating to the industry.

Since the law in this respect is largely uncodified in India, the consequences of the circulation of AI-generated content and its effects on the personality rights of individuals are left to the interpretation of the Hon’ble Courts dealing with such issues. The ELVIS Act can potentially become a good reference point for building a robust system of regulation of fake content. There is a need for clarity in this sector, both for consumers and for social media platforms, about the permissible limits of the law.

About the author: Vrinda Bagaria is a Principal Associate at Singh & Singh Law Firm LLP.

