A Zelensky Deepfake Was Quickly Defeated. The Next One Might Not Be
Other conflicts and other political leaders may not be so fortunate and could be more vulnerable to disruption by deepfakes, says Sam Gregory, who works on deepfakes policy at the nonprofit Witness.
Zelensky’s high profile helped Ukraine’s deepfake forewarning two weeks ago win international news coverage and helped his debunking spread quickly on Wednesday. His prominence may also have prompted a fast response to the video from social networking companies. Meta spokesperson Aaron Simpson declined to say how the company detected the video; so did YouTube’s Choi. The statement provided by Twitter’s Kennedy credited unspecified “external investigative reporting.”
Not all people targeted by a deepfake will be able to react as nimbly as Zelensky, or find their debunking so widely trusted. “Ukraine was well-positioned to do this,” Gregory says. “This is very different from other cases, where even a poorly made deepfake can create uncertainty about authenticity.”
Gregory points to a video that surfaced in Myanmar last year, which appeared to show a former government minister held in detention saying he provided cash and gold to the country’s former leader Aung San Suu Kyi.
The military government that displaced Aung San Suu Kyi in a coup used that footage to accuse her of corruption. But in the video the former minister’s face and voice were distorted, leading many journalists and citizens to suggest the clip was faked.
Technical analysis has not resolved the mystery, in part because the video is of low quality, and because the former minister and others familiar with the truth don’t speak as freely or to as large an audience as Zelensky could on Wednesday. While automatic deepfake detectors could someday help combat bad actors, they’re still a work in progress.
Deepfakes are still generally used more for titillation or harassment than grand deception, even as they become easier to create. The Zelensky deepfake could represent a troubling new frontier: although the response to the clip was quick and successful, the episode shows how, with a few tweaks and better timing, a deepfake attack could be an effective political weapon.
“If this was a more professional video and had been released early on in a more successful Russian advance on Kyiv, it could have created a lot of confusion,” says Samuel Bendett, who tracks Russian defense technology at the nonprofit CNA. As deepfake technology continues to become easier to access and more convincing, Zelensky is unlikely to be the last political leader targeted by fake video.