Experts are studying mannerisms of 2020 candidates to help offset threat of 'deepfake' videos

Experts are studying the unique mannerisms that define each presidential candidate ahead of 2020 with the hope that such information could help curb the online spread of fake videos known as deepfakes.

Experts have long warned that deepfakes, videos manipulated with artificial intelligence that look strikingly real, pose a risk heading into the presidential election, particularly as the technology grows more sophisticated and accessible on the internet.

Hany Farid, a digital forensics expert at the University of California, Berkeley, said this week he worries deepfakes will be used to sow discord and chaos in the 2020 election.

“So what we've been doing is building what we call soft biometric models for all of the presidential candidates. Unfortunately there's a lot of them,” Farid said, according to a transcript of a panel discussion with the Screen Actors Guild‐American Federation of Television and Radio Artists (SAG-AFTRA).

Farid cited former President Obama's distinct style of speaking as a way experts can distinguish real from fake videos, pointing to a forged viral video BuzzFeed released last year in which Obama appeared to make a number of controversial comments that were actually voiced by comedian Jordan Peele.

“So the basic ideas, like with President Obama, is we've been analyzing hours and hours of his videos, and we've been doing this for Joe Biden and Elizabeth Warren and all of the candidates. We've been mapping out particular talking styles of the candidates,” Farid added.

“[There is a] link between what Obama says and how he says it, and we build what we call soft biometrics that we then can [use to] analyze a deepfake and say, ‘Oh, in that video, the mouth, which is synthesized to be consistent with Jordan Peele's voice, is in some ways decoupled from the rest of the head. It's physically not correct,'” Farid said.
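The "decoupling" Farid describes can be illustrated with a toy check: in genuine footage, a facial signal such as per-frame mouth openness should track the loudness of the accompanying speech, while a mouth synthesized to match someone else's voice will drift out of sync with the rest of the signal. The sketch below is illustrative only — the signal names are invented stand-ins, and the real soft-biometric models capture far richer head-and-speech dynamics than a single correlation:

```python
import math
import random

def soft_biometric_score(face_signal, audio_signal):
    """Pearson correlation between a facial measurement (e.g. mouth
    openness per frame) and speech loudness over the same frames."""
    n = len(face_signal)
    mf = sum(face_signal) / n
    ma = sum(audio_signal) / n
    cov = sum((f - mf) * (a - ma) for f, a in zip(face_signal, audio_signal)) / n
    sf = math.sqrt(sum((f - mf) ** 2 for f in face_signal) / n)
    sa = math.sqrt(sum((a - ma) ** 2 for a in audio_signal) / n)
    return cov / (sf * sa)

random.seed(0)

# Stand-in signals: 300 "frames" of speech loudness, plus a mouth track
# that either follows the audio (genuine) or is decoupled from it (fake).
speech = [abs(math.sin(t / 15)) + 0.1 * random.gauss(0, 1) for t in range(300)]
genuine_mouth = [s + 0.2 * random.gauss(0, 1) for s in speech]
spliced_mouth = speech[:]
random.shuffle(spliced_mouth)  # same values, but out of sync with the audio

genuine_score = soft_biometric_score(genuine_mouth, speech)  # high: signals move together
spliced_score = soft_biometric_score(spliced_mouth, speech)  # near zero: "decoupled"
```

A detector built on this idea would flag a clip when signals that normally move together in a person's real footage fall below their usual correlation, though in practice many such features are combined per candidate.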

Newspapers, networks and even campaigns, Farid hopes, will be able to lean on experts like him to analyze videos and verify whether they are genuine.

Farid’s comments came during a panel discussion with other experts as well as House Intelligence Committee Chairman Adam Schiff (D-Calif.), one of the few Capitol Hill lawmakers at the forefront of the issue.

Schiff warned that even if a deepfake is disproven, the negative impact could have already run its course — a factor that raises the risk if a deepfake video runs rampant in the lead-up to election day.

“[Psychologists] will tell you that even if you're later persuaded that the video you have watched is a forgery, you will never completely shed the lingering negative impression of the person,” Schiff told the panel.

“Your brain will tell you, 'I shouldn't hold it against Joe Biden or Donald Trump or Bernie Sanders or Elizabeth Warren because that video I saw that went viral, I now know to have been a fake. But I cannot shake the feeling that that person is,' you know, fill in the blank. So part of the damage is done once you see it or you hear it,” he added.

Schiff and a bipartisan group of lawmakers have pressed the intelligence community to assess the threat of deepfakes, though the response Director of National Intelligence Dan Coats gave to Schiff and other lawmakers is currently classified.

Schiff said Coats recommended in part that Congress engage with the National Academy of Sciences and the National Institute of Standards and Technology (NIST).

And while the issue is on lawmakers' radar, Congress remains in the early stages of pressing the intelligence community to act. A House Intelligence Committee aide told The Hill in April that the panel plans to examine the threat in the coming months. But all the while, the race is on with regard to deepfakes.

“It's an arms race and we are way outgunned, about 100-to-1,000 to 1,” Farid said. “So the number of people like me working on the detection, for every one of me there's between 100 and 1,000 people developing technology to create these types of things.”

And despite the lopsided split between those detecting deepfakes and those creating them, the technology landscape is quickly changing.

“The technology is advancing, the sophistication is advancing. When we develop a technology, it's a year door to door from inception to completion. There's been three, four iterations of the technology in that time,” Farid added.