If you’re at all interested in artificial intelligence, and if you’re reading this I suspect that you may be, you will probably already be aware of the dispute between OpenAI and actress Scarlett Johansson. On May 13, OpenAI revealed a new version of ChatGPT and announced that it would introduce a number of voice assistants, with accompanying video demonstrations of several of them conducting conversations with developers, and in one instance with each other. I found the demonstrations impressive, but the flirty tone of some of the assistants drew criticism at the time. One of the first parallels people drew was to the movie ‘Her’ by Spike Jonze, a comparison Sam Altman himself fanned when he tweeted this:

‘Her’ is a film set in a near-future Los Angeles. It follows Theodore Twombly (Joaquin Phoenix), a lonely writer who develops an unlikely relationship with Samantha (voiced by Scarlett Johansson), an advanced artificial intelligence operating system designed to meet her user’s every need. I personally really liked the movie, and it seems I was not alone, as Altman is apparently also a big fan. On a side note, I’ve been shocked by how many people completely misunderstood the plot of ‘Her’. Spike Jonze said that it was a love story, yet too many people seem to think it’s an anti-AI movie, when Samantha’s AI nature is not actually central to the plot. You could replace her with an imaginary friend and the story would work the same way; it’s a movie about learning to accept and love oneself, in other words, a journey of self-discovery. But I digress.

However, not everyone was happy with the comparison between Sky and Samantha. Scarlett Johansson’s legal team demanded that OpenAI reveal the development process behind the AI personal assistant voice dubbed “Sky,” which allegedly bears a striking resemblance to Johansson’s own voice. OpenAI’s CEO, Sam Altman, initially invited comparisons to Johansson’s voice with the cryptic tweet above, but later denied any connection. Johansson felt betrayed, as Altman had previously approached her to license her voice for the assistant, an offer she declined. She perceived the use of a similar voice without her consent as a personal affront, and raised concerns about the lack of legal safeguards regarding the use of creative work in AI.

Needless to say, Johansson’s complaint took the Internet by storm, and it would be fair to say that, at least in my circles, most people sided with Johansson. The incident initially looked like a PR disaster for OpenAI, and the narrative quickly became one of tech companies stealing people’s voices. Besides the seemingly negative ethical implications, many also commented that if Johansson decided to sue, she could have a strong case based on US publicity rights. Many argued that the situation closely resembled two famous cases, Midler v. Ford Motor Co. and Waits v. Frito-Lay.

In 1988, Ford Motor Company used a song closely associated with Bette Midler in a commercial. They hired a sound-alike singer to imitate Midler’s voice, aiming to evoke her persona for the ad. Midler sued Ford for violating her right of publicity, a legal right protecting individuals from unauthorised commercial use of their identity. Although the song itself wasn’t the issue (as Ford had licensed it), Midler argued her voice was distinctive and integral to her identity, and its imitation constituted misappropriation. The courts agreed with Midler, establishing a precedent that a celebrity’s voice can be protected under the right of publicity.

In a similar case from 1991, singer Tom Waits sued Frito-Lay after the company used a sound-alike in a commercial for SalsaRio Doritos, despite Waits’ refusal to participate in endorsements. Waits claimed this unauthorised use of his distinctive voice violated his publicity rights and constituted false endorsement. The jury awarded Waits $2.5 million in damages, a decision upheld by the Ninth Circuit Court of Appeals.

In both cases, the singers’ refusals played a significant part in the decisions: the use of imitators is not forbidden as such, but both singers had been approached and had declined to take part. This incident seemed clearly similar, hence the wave of support for Johansson in both public and legal circles.

Although my research this year has been moving towards voice cloning, I have to admit that I wasn’t as interested in this case as others, particularly because I initially bought the prevalent online narrative: OpenAI approached Scarlett Johansson, she refused, and they released the assistant regardless. To me this was too close to Midler and Waits; case closed. But I couldn’t shake a nagging doubt: the timeline didn’t fit. When was Johansson approached? How was the assistant already operational? That must have taken months! I also have to admit that I hadn’t paid much attention to the actual voice of the assistant in question, and had taken at face value the reports that it resembled Johansson’s voice. But then several clips with side-by-side comparisons started making the rounds on social media, and I just couldn’t hear any resemblance. To my untrained ear they sound very different: Johansson’s voice has a characteristic raspy quality, while Sky sounds higher and more nasal.

Then OpenAI went on the defensive, stating that Sky’s voice was developed from another actress’s voice (whose identity it withheld to protect her privacy) and emphasising that it does not deliberately mimic celebrities’ voices. A report by the Washington Post then helped to set the record straight with documentation and interviews with the people involved. The ad for a voice actor was published a year ago, and the recordings took place in June 2023. The voice actor was never asked to imitate Scarlett Johansson, and neither her agent nor the documentation ever mentions ‘Her’ or anything related to the movie. The casting call simply asked for someone with a “warm, engaging [and] charismatic” voice. The evidence presented in the article consistently points towards the voice having been chosen without Johansson in mind.

So from a legal perspective, this case could be less straightforward than previously thought, and it may contain important differences from the Midler and Waits cases. As the article says, it may all rest on whether a jury would find the two voices similar. I personally think this could go either way, but at least to my ear, they don’t sound the same. The remaining sticking point is Altman’s tweet, and the fact that Johansson was indeed approached, apparently in an attempt to get her to voice a different assistant.

The outcome could also differ elsewhere. In the UK, for example, there is no publicity right in one’s voice, so such a claim would have to proceed by other avenues, such as the tort of passing off. In this particular case, the fact that OpenAI wasn’t advertising the voice as Scarlett Johansson’s would play a big part, and I don’t think there would be a case. So perhaps there’s room for reform? Voice has little to no protection in the UK.

This could have been a fascinating case, but alas, we may never get to see a lawsuit. Given the strength of the public blowback, OpenAI announced that it would not be deploying ‘Sky’ to the public, out of respect for Johansson’s feelings on the matter.

Anyway, I’m glad that I have a horrible voice and nobody will ever want to train an AI using my voice. Oh wait
