An AI program voiced Darth Vader in ‘Obi-Wan Kenobi’ so James Earl Jones could finally retire
After 45 years of voicing one of the most iconic characters in cinema history, James Earl Jones has said goodbye to Darth Vader. At 91, the legendary actor recently told Disney he was “looking into winding down this particular character.” That left the company facing a difficult question: how do you even replace Jones? The answer Disney eventually settled on, with the actor’s consent, involved an AI program.
If you’ve seen any of the recent Star Wars shows, you’ve heard the work of Respeecher. It’s a Ukrainian startup that uses archival recordings and a “proprietary AI algorithm” to create new dialogue featuring the voices of “performers from long ago.” In the case of Jones, the company worked with Lucasfilm to recreate his voice as it had sounded when film audiences first heard Darth Vader in 1977.
According to Vanity Fair, Jones signed off on Disney using recordings of his voice and Respeecher’s software to “keep Vader alive.” Lucasfilm veteran Matthew Wood told the outlet that Jones guided the Sith Lord’s performance in Obi-Wan Kenobi, acting as “a benevolent godfather,” but it was ultimately the AI that gave Vader his voice in many of the scenes.
While there’s something to be said for preserving Vader’s voice, Disney’s decision to use AI to do so is likely to add fuel to disagreements over how such technology should be used in creative fields. For instance, Getty Images recently banned AI-generated art over copyright concerns. With Jones, there’s the possibility we could hear him voice Vader long after he passes away.