
Actress Scarlett Johansson could sue OpenAI over Sky, a chatbot voice she says sounds eerily similar to her own. The dispute highlights a growing problem: as technology makes voice cloning easier, the legal exposure extends to everyone. Experts note that similar disputes have arisen before, including voice-impersonation lawsuits brought by singers Bette Midler and Tom Waits. Johansson could bring claims against OpenAI under existing law, including copyright, the right of publicity, and false endorsement, and California offers particularly strong protections in this area.

Advances in technology that allow accurate voice replication have worsened the problem of impersonation; anyone can now be targeted for voice cloning with AI. Law professor Tiffany Li warns that the problem extends beyond celebrities and public figures, citing AI phone scams that clone ordinary people’s voices. Because current laws are piecemeal or state-specific, updated legislation is needed to address these issues and protect individuals’ rights.

After concerns were raised that Sky sounded similar to Johansson, OpenAI pulled the voice from ChatGPT. Johansson said she was shocked by the AI voice assistant, claiming that even close friends believed it was her voice. OpenAI acknowledged the vocal similarities but said the voice belonged to a different professional actress using her own natural speaking voice. The episode has sparked debate over the legal implications and the need for clearer laws governing the use of an individual’s voice and likeness.

While the outcome of any lawsuit against OpenAI is uncertain, experts believe Johansson may have a strong case based on right of publicity claims. OpenAI’s attempts to hire Johansson and its references to the movie “Her” suggest a deliberate connection between the chatbot voice and the actress. The case raises important questions about protecting individuals from imitation in the era of advanced AI. Law professor Mark Lemley believes existing laws may be sufficient to handle such cases, but Congress is considering a federal right of publicity to address these issues more effectively.

The legal questions surrounding AI voice cloning are only the beginning of a complex and evolving landscape. The potential for impersonation and misuse of personal information through AI poses significant challenges for individuals and the legal system. While high-profile cases like Johansson’s draw the most attention, the broader implications for privacy and consent apply to everyone, making clear laws and regulations against unauthorized use of a person’s voice and likeness essential in the digital age.
