FBI official sees 'tide change' in how platforms handle extremist content


A top FBI counterterrorism official said Wednesday that he has seen a major change in social media companies' willingness to address extremist content on their platforms.

"We are seeing a tide change in social media companies being more proactive, policing their own," Michael McGarrity, the FBI's assistant director for counterterrorism, said at a House Homeland Security Committee hearing.

"And when they see something that is noteworthy and alarming beyond First Amendment, they will give us leads," McGarrity added.

The FBI official asserted that social media companies are "self-identifying content" more than they did a few years ago, adding they are currently facing a "learning curve" in how best to handle it.

His remarks came as the House panel, which has been led by Democrats since January, held its first hearing on domestic terrorism in years.

The hearing focused heavily on the role of social media companies in dealing with incendiary and hateful content, with lawmakers signaling a desire to hold the companies "accountable" when they do not address domestic terrorists on their platforms.

McGarrity told the House panel that "there have been more arrests and deaths in the United States caused by domestic terrorists than international terrorists in recent years." 

The FBI official testified alongside Brian Murphy, the Department of Homeland Security's (DHS) principal deputy undersecretary for the Office of Intelligence and Analysis, and Brad Wiegmann, the deputy assistant attorney general of the Department of Justice's (DOJ) National Security Division.

Each of the witnesses emphasized the government's limited power to address online content posted by Americans, even when it is extreme or hateful. They said that tech companies have more room to deal with the issue because social media giants such as Facebook and Twitter are not bound by the First Amendment and police their platforms according to their own community guidelines.

"When you’re talking about extremist content online, the First Amendment does impose some significant constraints," Wiegmann, the DOJ official, told the committee.  

"We are prohibited from reviewing, looking at First Amendment activity," McGarrity added. 

But the government can partner with companies such as Facebook, Twitter and Google in a limited capacity, the witnesses told lawmakers.

The FBI often advises the companies on how to spot and approach terrorist content, and the companies are able to monitor certain platforms, known as incubators, for online radicalization, the witnesses said.

The companies themselves can flag alarming content to federal law enforcement, but there are strict parameters over the kinds of speech that could prompt an investigation by the government.

"Even if a social media company was able to report to us 'this terrorist has put a manifesto' or 'this person has put up a thing criticizing various ethnic groups,' that’s not something we can investigate ... solely on the basis of that information," Wiegmann said.  

The hearing on domestic terrorism comes after a string of shootings by gunmen allegedly radicalized online, many on fringe platforms such as 8chan and Gab.

The gunman suspected of killing one woman and injuring several others at a San Diego synagogue last month appears to have posted an extremist anti-Semitic manifesto online before the attack. He said he was motivated by the man who opened fire at two New Zealand mosques earlier this year, killing 50 in an attack he partially livestreamed on Facebook.

Rep. Mike Rogers (R-Ala.) on Wednesday raised concerns about the role of fringe websites, which many alleged white supremacist shooters in recent years have frequented.

"Fringe websites have become havens for the most abhorrent behavior in our societies," Rogers said. "A quick search yields hundreds of results of the most disturbing and hateful ideologies ever." 

He asked the witnesses if they had thoughts on how to combat the "viral hate speech and incitement of violence found on fringe sites like 8chan and Gab," to which the witnesses did not immediately respond.

"Y'all don’t have any suggestions for us? That’s scary. We can’t make good policy without the advisement," he added. 

Murphy from DHS jumped in, telling Rogers that the department has been "increasing" its efforts to pursue "forums that are available to the public where we see potential acts of violence."

"We’re continuing to refine that and get better at it," Murphy said. 

McGarrity said the FBI currently has 850 ongoing domestic terrorism investigations, about half of which are "anti-government" and 40 percent of which have been identified as "racially motivated violent extremist cases." 

The House Homeland Security Committee received a closed-door briefing from Facebook, Twitter, Google and Microsoft representatives last month, and lawmakers said they are continuing to look into how the companies deal with domestic terrorism on their platforms.

Lawmakers at the hearing left the door open to future hearings on the role of tech in the rise of domestic terrorism incidents over the past few years.