In India, celebrities such as Rajnikanth, Anil Kapoor and Jackie Shroff have approached courts over personality rights, also called publicity rights. These are a part of celebrity rights: a celebrity's name, voice, signature, images or any other feature by which the public identifies them lie at the heart of personality rights. These could include poses, mannerisms or any distinct aspect of their public persona.
Celebrities at times register aspects of their personalities as trademarks to use them commercially.
The unauthorized use of these characteristics for commercial purposes not only infringes upon these rights but also dilutes the celebrity's brand equity.
These rights are not explicitly defined under Indian law, but are read into the rights to privacy and property. Intellectual property concepts such as passing off (deception) are also applied to ascertain whether protection is warranted. Courts can grant injunctions restraining violators of personality rights from using tools such as AI, face morphing and GIFs to exploit a celebrity's persona for commercial purposes.
LLMs have been trained on copyrighted media content, and media organisations have sued their developers. The Authors Guild of America filed a suit on behalf of writers such as George R R Martin and John Grisham alleging illegal use of the authors' copyrighted works by OpenAI. Hollywood actors went on strike, fearing that their faces and voices could be used to create new films and shows without their consent or compensation. The studios eventually assured actors that their consent would be sought.
Recently, Hollywood actor Scarlett Johansson was shocked to find that OpenAI's GPT-4o featured a voice 'eerily similar' to hers. She had earlier declined OpenAI's request to use her voice. GPT-4o's Voice Mode feature allows users to have voice conversations with the AI chatbot and lets them choose from five voices. One of these, called Sky, resembles Johansson's voice. OpenAI later paused the availability of Sky, adding that it was not Johansson's voice but that of another actor and was not intended to resemble hers. OpenAI's behaviour is emblematic of the haughty sense of impunity that governs the AI industry.
AI-powered deepfake voices have emerged as a potent tool in political campaigns, corporate espionage and cyber fraud.
In its landmark judgement (Puttaswamy, 2017), the Supreme Court held that an individual has a fundamental right to privacy under Article 21 of the Constitution. In the Ritesh Sinha case (2019), the SC ruled that voice samples (protected under the right to privacy) can be legally compelled for a criminal investigation in the public interest.
A voice per se cannot get copyright protection, but an artist's voice can be protected as a performer's right.
It is not clear whether a voice (recorded or cloned) will be treated as digital personal data. If so, its use will require explicit consent under the yet-to-be-implemented Digital Personal Data Protection Act, 2023.
The IT Act and its Rules do not permit the hosting of illegal content. They require labelling of synthetic content and removal of deepfake content within 24-36 hours of receiving a report from a user or government authority. Non-compliance invites remedies under the IT Act and the IPC.