In “Roadrunner: A Film About Anthony Bourdain,” filmmaker Morgan Neville examines the life and death of the famous TV chef. In the process, he introduced a new documentary device: using artificial intelligence to simulate Bourdain’s voice.
In a recent interview with the New Yorker, Neville explained that he used AI to synthetically create a voiceover of a Bourdain email that sounded as if Bourdain were reading it. He engaged a software company and provided about a dozen hours of recordings, allowing it to create a convincing electronic model of Bourdain’s voice. That voice reads three lines in the film, including an email Bourdain sent to a friend: “My life is sort of shit now. You are successful, and I am successful, and I’m wondering: Are you happy?” But Bourdain, of course, never read that line or either of the other two, to which Neville’s message to viewers is “Nyah, nyah, nyah!” “If you watch the film … you probably don’t know what the other lines are that were spoken by the AI, and you’re not going to know,” he said.
Well, critics, including Ottavia Bourdain, the chef’s former wife, objected on ethical grounds to the unannounced use of a “deepfake” voice to say sentences that Bourdain never spoke.
I was going to make this an Ethics Quiz, and then, after thinking about it for a few seconds, decided that the issue doesn’t rate a quiz, because I’m not in any doubt over the answer. Is what Neville did unethical?
Yes, of course it is. It is unethical because it deliberately deceives listeners into believing that they are hearing the man talking when he never said the words they are hearing. It doesn’t mitigate the deception, as Neville and his defenders seem to think, that Fake Bourdain is reading the actual unspoken words of an email. It’s still deception. Is the creation and use of a zombie voice for this purpose also unethical, like the creation of CGI versions of famous actors to manipulate in movies they never made, discussed (and condemned) here?
That’s a tougher call, but I come down on the side of the dead celebrity who is being made into an unwilling ventriloquist’s dummy by emerging technology.
This would be a propitious time to point out what is ethical and what isn’t when it comes to using a dead celebrity’s voice, real or fake, in various forms of communications and education:
Ethical: using the celebrity’s voice in a manner that the dead celebrity explicitly approved, when it involves no deception of the audience.
Ethical: Using an actor to read what was written or spoken by a dead celebrity when what is being read was said or written in the same context in which the film or documentary presents it, and when it is obvious or revealed that the voice used is not the real individual. Example: Paul Giamatti portraying Theodore Roosevelt in Ken Burns’ “The Roosevelts” and being credited as doing so.
That’s it for the ethical devices. Now for what’s unethical…
Unethical: Movies and TV shows playing fake versions of famous records sung by imitators without revealing this to the audience. This is a pet peeve of mine: apparently it is often cheaper to make a new, fake version of a recording than to pay the money to use the real one. I don’t care if it is played under dialogue or not: most of the time the fake version doesn’t do the real artist or the song justice, and the inferior version misleads the listener.
Unethical: Movies using different actors or singers to dub the voices of stars, regardless of the reasons, and not crediting the off-screen talents. This is true even if the dubbing actors or singers accept compensation not to be credited. Examples: Marni Nixon singing for Audrey Hepburn, Deborah Kerr, and Natalie Wood in “My Fair Lady,” “The King and I,” and “West Side Story,” respectively; Paul Frees doing Tony Curtis’ female voice in “Some Like It Hot.”
Unethical: Using a dead actor or singer’s performance in another medium or venue to simulate a live performance before a current audience when the performer never approved or consented to such use, regardless of whether the performer’s estate or family can legally grant permission for such a use. Here, I endorsed laws protecting all dead public figures from undignified future cyber- or hologram manipulation absent their consent.
Unethical: Any fake voice of any individual, no matter how it is created, used to deceive viewers or listeners that the individual spoke words he or she did not, in any context the individual did not intend. Example: a deepfake Anthony Bourdain reading the real chef’s email.
Pointer? I lost the reference! I’m sorry! Let me know if you sent me this topic, and I’ll give credit where it’s due.