Experts are studying the unique mannerisms that define each presidential candidate ahead of 2020 with the hope that such information could help curb the online spread of fake videos known as deepfakes.
Experts have long warned that deepfakes, videos manipulated with artificial intelligence that look strikingly real, pose a risk heading into the presidential election, particularly as the technology grows more sophisticated and accessible on the internet.
Hany Farid, a digital forensics expert at the University of California, Berkeley, said this week he worries deepfakes will be used to sow discord and chaos in the 2020 election.
“So what we've been doing is building what we call soft biometric models for all of the presidential candidates. Unfortunately there's a lot of them,” Farid said, according to a transcript of a panel discussion with the Screen Actors Guild‐American Federation of Television and Radio Artists (SAG-AFTRA).
Farid cited former President Obama's distinct style of speaking as a way experts can distinguish real from fake videos, pointing to a forged viral video BuzzFeed released last year in which Obama made a number of controversial comments, when actually it was comedian Jordan Peele speaking.
“So the basic idea, like with President Obama, is we've been analyzing hours and hours of his videos, and we've been doing this for Joe Biden and Elizabeth Warren and all of the candidates. We've been mapping out particular talking styles of the candidates,” Farid added.
“[There is a] link between what Obama says and how he says it, and we build what we call soft biometrics that we then can [use to] analyze a deepfake and say, ‘Oh, in that video, the mouth, which is synthesized to be consistent with Jordan Peele's voice, is in some ways decoupled from the rest of the head. It's physically not correct,'” Farid said.
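The decoupling Farid describes can be illustrated with a toy sketch. The idea, loosely inspired by his description and not the actual research pipeline, is to build a candidate-specific profile of how facial and head motions correlate in authentic footage, then flag videos whose correlations deviate. The feature choices, synthetic data, and scoring here are all hypothetical illustrations:

```python
import numpy as np

def biometric_profile(features):
    """Correlation matrix over per-frame features, shape (n_frames, n_features)."""
    return np.corrcoef(features, rowvar=False)

def deviation_score(reference, suspect):
    """Mean absolute difference between two correlation profiles."""
    return float(np.mean(np.abs(reference - suspect)))

rng = np.random.default_rng(0)

def coupled_clip(n=2000):
    # In authentic footage, mouth motion tends to track head motion.
    head = rng.normal(size=n)
    mouth = 0.8 * head + rng.normal(scale=0.3, size=n)
    return np.column_stack([head, mouth])

# Reference profile built from "authentic" footage of the speaker.
reference = biometric_profile(coupled_clip())

# A second authentic clip should match the profile closely...
score_real = deviation_score(reference, biometric_profile(coupled_clip()))

# ...while a lip-synced fake, whose mouth track is decoupled from
# the rest of the head, should deviate noticeably.
head = rng.normal(size=2000)
fake = np.column_stack([head, rng.normal(size=2000)])
score_fake = deviation_score(reference, biometric_profile(fake))

print(score_real < score_fake)
```

In this toy setup the fake clip's mouth-head correlation collapses toward zero, so its deviation from the reference profile is much larger than that of a second genuine clip.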
Newspapers, networks and even campaigns, Farid hopes, will be able to lean on experts like him to analyze videos and verify whether they are genuine.
Farid’s comments came during a panel discussion with other experts as well as House Intelligence Committee Chairman Adam Schiff (D-Calif.), one of the few Capitol Hill lawmakers at the forefront of the issue.
Schiff warned that even if a deepfake is later disproven, the damage may already be done, a risk that is heightened if such a video spreads unchecked in the lead-up to Election Day.
“[Psychologists] will tell you that even if you're later persuaded that the video you have watched is a forgery, you will never completely shed the lingering negative impression of the person,” Schiff told the panel.
“Your brain will tell you, 'I shouldn't hold it against Joe Biden or Donald Trump or Bernie Sanders or Elizabeth Warren because that video I saw that went viral, I now know to have been a fake. But I cannot shake the feeling that that person is,' you know, fill in the blank. So part of the damage is done once you see it or you hear it,” he added.
Schiff and a bipartisan group of lawmakers have pressed the intelligence community to assess the threat of deepfakes, though the response Director of National Intelligence Dan Coats gave to Schiff and other lawmakers is currently classified.
Schiff said Coats recommended in part that Congress engage with the National Academy of Sciences and the National Institute of Standards and Technology (NIST).
And while the issue is on lawmakers' radar, Congress remains in the early stages of examining it. An Intelligence Committee aide told The Hill in April that the panel plans to take up the threat in the coming months. But all the while, the race is on with regard to deepfakes.
“It's an arms race and we are way outgunned, about 100-to-1,000 to 1,” Farid said. “So the number of people like me working on the detection, for every one of me there's between 100 and 1,000 people developing technology to create these types of things.”
And despite the lopsided split between detection and creation, the technology landscape is quickly changing.
“The technology is advancing, the sophistication is advancing. When we develop a technology, it's a year door to door from inception to completion. There's been three, four iterations of the technology in that time,” Farid added.