In Roadrunner: A Film About Anthony Bourdain, filmmaker Morgan Neville examines the life and death of the famous TV chef. In the process, he introduced a new documentary device: using artificial intelligence to simulate Bourdain’s voice.
In a recent interview with The New Yorker, Neville explained that he used AI to synthesize a voiceover of a Bourdain email so that it sounded as if Bourdain himself were reading it. He engaged a software company and provided about a dozen hours of recordings, allowing it to create a convincing electronic model of Bourdain’s voice. That voice reads three lines in the film, including an email Bourdain sent to a friend: “My life is sort of shit now. You are successful, and I am successful, and I’m wondering: Are you happy?” But Bourdain, of course, never read that line or the other two aloud, to which Neville’s message to viewers is “Nyah, nyah, nyah!” “If you watch the film … you probably don’t know what the other lines are that were spoken by the AI, and you’re not going to know,” he said.
Well, critics, including Ottavia Bourdain, the chef’s former wife, objected on ethical grounds to the unannounced use of a “deepfake” voice to say sentences that Bourdain never spoke.
I was going to make this an Ethics Quiz, and then, after thinking about it for a few seconds, decided that the issue doesn’t rate a quiz, because I’m not in any doubt about the answer. Is what Neville did unethical?
Yes, of course it is. It is unethical because it deliberately deceives listeners into believing that they are hearing the man talking when he never spoke the words they are hearing. It doesn’t mitigate the deception, as Neville and his defenders seem to think, that Fake Bourdain is reading the actual words of an email Bourdain wrote but never spoke. It’s still deception. Is the creation and use of a zombie voice for this purpose also unethical, like the creation of CGI versions of famous actors to manipulate in movies they never made, discussed (and condemned) here?
That’s a tougher call, but I come down on the side of the dead celebrity who is being made into an unwilling ventriloquist’s dummy by emerging technology.
This would be a propitious time to point out what is ethical and what isn’t when it comes to using a dead celebrity’s voice, real or fake, in various forms of communication and education: