U.S. tech companies are tamping down expectations on their ability to prevent foreign influence campaigns on social media.
With the midterm elections a little more than two months away, Silicon Valley is under heavy public and political pressure to crack down on foreign operations. However, some cyber experts say it’s too late for them to secure their platforms before voters cast ballots on Nov. 6.
Questions about those efforts will be put to top executives from Facebook, Twitter and Google when they testify before the Senate Intelligence Committee on Sept. 5.
Facebook CEO Mark Zuckerberg last week offered a glimpse of what his company is likely to highlight on Capitol Hill. He described Facebook’s efforts to stop foreign influence campaigns as “an arms race” in which the company is constantly adjusting to the evolving threats posed by foreign countries.
Zuckerberg’s remarks came the same day that Facebook announced it had deleted more than 600 accounts that it found to be engaging in foreign misinformation campaigns, including a batch from Iran. It was the company’s second disclosure this year of fake accounts spreading misinformation.
In a column for the blog Lawfare on Wednesday, Alex Stamos, who recently left his role as Facebook’s chief security officer, said that the most recent disclosure, along with Microsoft’s revelation that conservative think tanks had been targeted by Russian hackers, is evidence that it’s too late to protect the 2018 elections.
Zuckerberg has suggested that such disclosures reflect the new political reality, and some tech experts say they don’t see the social media arms race ending anytime soon.
“State actors are going to find it attractive for a while,” said Renee DiResta, who researches digital propaganda with the group Data for Democracy. “We should expect to see more revelations like this and see the playbook continue to evolve.”
Some experts argue that the campaigns are widespread and inevitable because the very structure of social media platforms makes them ripe for this type of abuse.
Wellesley College computer scientists Panagiotis Metaxas and Eni Mustafaraj warned as early as 2012 that social media manipulation to influence politics was already happening and would continue.
“It has very little cost, for example, to throw a Twitter bomb,” Metaxas said, referring to the tactic of unleashing an army of Twitter bots.
“It takes just a few hours of programming — or you can probably buy a program that can do it for you. It has a lot of potential because of its low cost,” he warned at the time.
Today, running misinformation campaigns remains attractive because of how cost effective they are. Countries can spend very small amounts of money on social media advertising, which can have an outsized effect in helping their campaigns reach hundreds of thousands of people.
DiResta noted that U.S. adversaries with even fewer resources than Russia are likely to invest in social media campaigns, in large part because of the low cost.
And the U.S. isn’t the only country dealing with this issue. Mexico had to deal with misinformation campaigns heading into national elections this summer.
But while manipulating social media platforms is attractive to some governments at the moment, experts see a potential end to it.
Clint Watts, a distinguished research fellow at the Foreign Policy Research Institute, explained that in the future, the cost of running these types of campaigns might not be worth it.
“If the costs get raised enough that they can’t – whether it’s Russia, Iran, whoever – where they can’t achieve the audience, it won’t be worth their time anymore,” he said.
Things might already be moving in that direction.
DiResta said costs are rising for influence campaigns. In 2016, she said, the process was much easier: Russia did very little to cover its tracks and was able to spread its message through basic, cheap, easy-to-use bots.
It’s not that simple anymore.
“If you want to get something trending, you need something that evades Twitter bot detection mechanisms,” DiResta said. “To do that you need people typing original things. So it takes a lot more work to evade bot detection now than it did in 2016.”
The public can also help decrease the value of misinformation campaigns.
Countries such as Ukraine, Sweden and Finland have already been on the receiving end of information warfare. Part of their response has been to help educate their populations so that they’re less vulnerable to digital propaganda.
Experts like Katherine Haenschen, a professor at Virginia Tech who researches social media and politics, say that type of education in the U.S. can be a sound solution to misinformation, even as tech companies and foreign governments escalate their arms race against one another.
“Foreign interference doesn’t work if Americans are able to identify it themselves,” she said.
Morgan Chalfant contributed.