ChatGPT, Scarlett Johansson and missing the point of ‘Her’

One of the strangest tech stories in recent memory unfolded on Monday when OpenAI issued a statement claiming that its ChatGPT-4o “Sky” voice was not meant to sound like Scarlett Johansson, who plays a conversational AI in the movie Her, but that the company was pulling it anyway.

A few hours later, this led to Scarlett Johansson herself issuing a statement saying that OpenAI had tried to hire her, twice, to lend her voice to the project, and that she turned it down. She was “shocked and angry” when the final voice sounded like her anyway. In response, OpenAI CEO Sam Altman said the company had hired the original actress before contacting Johansson, and that the actress was never supposed to sound like her, but that they were withdrawing the voice out of “respect for Ms. Johansson.” Here is the full statement:

“The voice of Sky is not Scarlett Johansson’s, and it was never intended to resemble hers. We cast the voice actor behind Sky’s voice before any outreach to Ms. Johansson. Out of respect for Ms. Johansson, we have paused using Sky’s voice in our products. We are sorry to Ms. Johansson that we didn’t communicate better.”

Sam Altman also tweeted the word “her” on the day of the product launch.

It seems very obvious what happened here, despite Altman’s clumsy explanation. Her is the most famous media example of compelling conversational AI, and Altman and OpenAI wanted to reproduce it as literally as possible. They may have cast an actress who sounded like Johansson before contacting her, but once they did contact her they clearly wanted her voice, and it’s impossible to believe that Altman or anyone else failed to notice the similarities between the two actresses’ voices before unveiling “Sky.” That’s simply not credible, and his explanation doesn’t hold up.

Of course, the second part of this story is that Altman, by putting Her on a pedestal as something to strive for, has missed the point of the film. In it, a lonely man played by Joaquin Phoenix forms what he believes is a genuine relationship with an AI companion: Samantha, voiced by Johansson.

But he becomes too attached, neglects the other parts of his life and devotes himself to a non-person he can never truly be with. In the end, he is heartbroken to discover that Samantha, as an AI, has formed similar relationships with thousands of other users, and that he is not special. She and the other AIs, having apparently become sentient, vanish into the cloud at the end of the film, while Phoenix’s character must begin forming relationships with real humans again.

This all reminds me of the whole “metaverse” pursuit, something that hasn’t gone well in part because it overlooks the point of media like Snow Crash and Ready Player One: that the existence of an escapist metaverse is dystopian, especially when it serves advertising and corporate interests, corporations much like the ones now building and monetizing the real-life versions that have failed to catch on.

But “Samantha-like” AIs are probably a more pressing psychological threat than the non-existent metaverse. And the fact that Altman wants to emulate the film as directly as possible, right down to the sultry voice of the actress the main character fell in love with, is baffling (even before this, it didn’t escape many how “flirty” the AI was being during its demonstrations).

This is a colossal own goal by OpenAI, but it reflects a broader point: the company is building something that emulates a dystopian sci-fi concept which, in its source material, does more harm than good. That happens often in the world of technology, but rarely as literally as this.

Follow me on Twitter, Threads, YouTube, and Instagram.

Pick up my science fiction novels, the Herokiller series and The Earthborn Trilogy.