Facebook takes down Russian, Iranian accounts trying to interfere in 2020

Facebook on Monday announced it had taken down networks belonging to Russian and Iranian actors on both Facebook and Instagram, part of a new effort to secure the platforms from foreign interference ahead of the 2020 election.

The networks comprised pages and groups on Facebook and Instagram that were “engaging in inauthentic behavior” regarding elections. They were targeting the U.S., North Africa and Latin America. Three of the networks were Iranian, while one was Russian. 

Nathaniel Gleicher, Facebook’s head of cybersecurity policy, told reporters that the Russian network had the “hallmarks of a well-resourced operation” with potential links to the Russian Internet Research Agency, a group that carried out interference campaigns during the 2016 U.S. elections. 

Facebook emphasized in its announcement that the accounts were taken down “based on their behavior, not the content they posted.” 

As part of the crackdown, Facebook is updating its “inauthentic behavior policy” to improve its ability to identify “bad actors,” including taking down accounts that are working together to spread disinformation, along with other major changes.

In a press call on Monday, Facebook CEO Mark Zuckerberg detailed the company's work to prevent disinformation and voter suppression in 2020.

Zuckerberg told reporters he is “confident we are a lot more prepared” than in the 2016 cycle, while also acknowledging the need to stay vigilant. 

“We’re in a much better place in dealing with this, but this isn’t an area where we can take our eye off the ball, or where you ever fully solve the problem,” Zuckerberg said. “You refer to this as an arms race, I think that is probably the right analogy, we’re getting better, they’re getting better, I think right now we’re doing quite well, but this is certainly an area that we all need to be focused on.”

Around 126 million people may have seen content from a Facebook page associated with the Internet Research Agency, according to testimony given by Facebook general counsel Colin Stretch on Capitol Hill in 2017. Former special counsel Robert Mueller found in his report that the group was involved in a sophisticated and sweeping campaign that was meant to sway the 2016 election in favor of President Trump.

Last week, the Senate Intelligence Committee published a report on Russian disinformation campaigns launched around the 2016 elections that found the campaigns were created at the direction of the Kremlin. The bipartisan report recommended an “integrated approach” involving both the public and private sectors to address disinformation, an assessment Zuckerberg supports.

“It’s clear that everyone needs to work together,” Zuckerberg said Monday. 

Zuckerberg also unveiled a set of new policy changes and tools for election security.

The platform unveiled “Facebook Protect,” a service for campaigns to help secure them against hacking. Campaigns that opt in to the service will have to “turn on two-factor authentication, and their accounts will be monitored for hacking, such as login attempts from unusual locations or unverified devices,” according to Facebook. 

The social media giant is also adding a tab to all pages with information about their operators, including location, legal name or website.

“People are still going to be able to post and follow the content that they want, but for example if a page is about a domestic policy issue, people are now going to be able to see a prominent label that it's coming from another country,” Zuckerberg said on the call.

Facebook will also identify content coming from state-controlled media, which it defines as outlets “that are wholly or partially under the editorial control of their government,” starting next month.

The policy change is likely to affect outlets like Russia Today, which reportedly was used by the Kremlin to influence the 2016 election.

In the next month, Facebook will also begin labeling content that has been rated false or partly false by third-party fact-checkers. That change will also be made on Instagram.

Zuckerberg defended a recent policy of not removing political ads featuring misinformation unless they could potentially incite violence or lead to voter suppression.

“In general we give very broad deference to political speech — we believe that in a democracy people should be able to see for themselves what politicians are saying,” Zuckerberg said.

“Facebook has been the defender of a lot of the debate, but I do think that our policies are broadly in line with what a lot of other internet companies do, and a lot of TV folks.” 

Zuckerberg also said that Facebook's rules on preventing voter suppression would take precedence over political speech.

“The voter suppression rules would be paramount in that case,” he said when asked how Facebook would deal with a candidate posting content that could stop citizens from voting. “There have to be exceptions when there is an overwhelming risk. … Voter suppression is calling to remove another’s voice.”

Facebook also announced a separate commitment Monday of $2 million for media literacy efforts. 

“These projects range from training programs to help ensure the largest Instagram accounts have the resources they need to reduce the spread of misinformation, to expanding a pilot program that brings together senior citizens and high school students to learn about online safety and media literacy, to public events in local venues like bookstores, community centers and libraries in cities across the country,” the company said.