Experts are studying mannerisms of 2020 candidates to help offset threat of 'deepfake' videos

Experts are studying the unique mannerisms that define each presidential candidate ahead of 2020 with the hope that such information could help curb the online spread of fake videos known as deepfakes.

Experts have long warned that deepfakes, videos manipulated with artificial intelligence that look strikingly real, pose a risk heading into the presidential election, particularly as the technology grows increasingly more sophisticated and accessible on the internet. 

Hany Farid, a digital forensics expert at the University of California, Berkeley, said this week he worries deepfakes will be used to sow discord and chaos in the 2020 election.

“So what we've been doing is building what we call soft biometric models for all of the presidential candidates. Unfortunately there's a lot of them,” Farid said, according to a transcript of a panel discussion with the Screen Actors Guild‐American Federation of Television and Radio Artists (SAG-AFTRA).

Farid cited former President Obama's distinct style of speaking as a way experts can distinguish real videos from fake ones, pointing to a forged viral video BuzzFeed released last year in which Obama appeared to make a number of controversial comments that were actually voiced by comedian Jordan Peele.

“So the basic idea, like with President Obama, is we've been analyzing hours and hours of his videos, and we've been doing this for Joe Biden and Elizabeth Warren and all of the candidates. We've been mapping out particular talking styles of the candidates,” Farid added.

“[There is a] link between what Obama says and how he says it, and we build what we call soft biometrics that we then can [use to] analyze a deepfake and say, ‘Oh, in that video, the mouth, which is synthesized to be consistent with Jordan Peele's voice, is in some ways decoupled from the rest of the head. It's physically not correct,'” Farid said.
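Conceptually, the "soft biometric" check Farid describes boils down to measuring how tightly a speaker's speech activity is coupled to their head and mouth motion, then flagging footage that breaks that pattern. The sketch below is a rough illustration of that idea, not Farid's actual system: the per-frame features (mouth_openness, head_pitch), the synthetic data, and the decision threshold are all placeholder assumptions for demonstration.

```python
# Toy illustration of a "soft biometric" consistency check (not Farid's system):
# compare the coupling between mouth activity and head motion in a suspect video
# against a reference profile built from authentic footage of the same speaker.

import numpy as np

def correlation_profile(mouth_openness: np.ndarray, head_pitch: np.ndarray) -> float:
    """Pearson correlation between per-frame mouth activity and head motion."""
    return float(np.corrcoef(mouth_openness, head_pitch)[0, 1])

def looks_decoupled(video_profile: float, reference_profile: float,
                    tolerance: float = 0.3) -> bool:
    """Flag the video if its speech/head-motion coupling deviates too far
    from the speaker's reference profile (threshold chosen for illustration)."""
    return abs(video_profile - reference_profile) > tolerance

# Synthetic stand-ins for features a face tracker would extract per frame.
rng = np.random.default_rng(0)
frames = 300
authentic_mouth = rng.normal(size=frames)
authentic_pitch = 0.6 * authentic_mouth + 0.4 * rng.normal(size=frames)  # coupled
suspect_mouth = rng.normal(size=frames)
suspect_pitch = rng.normal(size=frames)                                  # decoupled

reference = correlation_profile(authentic_mouth, authentic_pitch)
suspect = correlation_profile(suspect_mouth, suspect_pitch)
print("flag as possible deepfake:", looks_decoupled(suspect, reference))
```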

Farid hopes newspapers, networks and even campaigns will be able to lean on experts like him to analyze videos and verify whether they are genuine.

Farid’s comments came during a panel discussion with other experts as well as House Intelligence Committee Chairman Adam Schiff (D-Calif.), one of the few Capitol Hill lawmakers at the forefront of the issue.

Schiff warned that even if a deepfake is later debunked, the damage may already be done, a risk that only grows if a deepfake video spreads widely in the lead-up to Election Day.

“[Psychologists] will tell you that even if you're later persuaded that the video you have watched is a forgery, you will never completely shed the lingering negative impression of the person,” Schiff told the panel.

“Your brain will tell you, 'I shouldn't hold it against Joe Biden or Donald Trump or Bernie Sanders or Elizabeth Warren because that video I saw that went viral, I now know to have been a fake. But I cannot shake the feeling that that person is,' you know, fill in the blank. So part of the damage is done once you see it or you hear it,” he added.

Schiff and a bipartisan group of lawmakers have pressed the intelligence community to assess the threat of deepfakes, though the response Director of National Intelligence Dan Coats gave to Schiff and other lawmakers is currently classified.

Schiff said Coats recommended in part that Congress engage with the National Academy of Sciences and the National Institute of Standards and Technology (NIST).

And while the issue is on lawmakers' radar, Congress remains in the early stages of pressing the intelligence community to examine the threat. An Intelligence Committee aide told The Hill in April that the panel plans to take up the issue in the coming months. But all the while, the race over deepfakes is on.

“It's an arms race and we are way outgunned, about 100-to-1,000 to 1,” Farid said. “So the number of people like me working on the detection, for every one of me there's between 100 and 1,000 people developing technology to create these types of things.”

And even as detection remains badly outnumbered by creation, the technology landscape is changing quickly.

“The technology is advancing, the sophistication is advancing. When we develop a technology, it's a year door to door from inception to completion. There's been three, four iterations of the technology in that time,” Farid added.