Facebook, Google face tough questions over white nationalism

Facebook and Google on Tuesday sought to reassure lawmakers that they are cracking down on white nationalism and extremist content online before a hearing of the House Judiciary Committee.

The hearing on white nationalism and hate crimes came as the tech giants face threats of tougher regulations in the U.S. and abroad after they struggled to remove viral footage of a mass shooting at two New Zealand mosques last month.

Lawmakers asked Google and Facebook to explain how they deal with hateful content, pushing them to account for the role their platforms have played in the resurgence of white nationalism in the U.S. over the past few years.
"These platforms are utilized as conduits to spread vitriolic hate messages into every home and country," Judiciary Committee Chairman Jerrold Nadler (D-N.Y.) said during his opening remarks. "Efforts by media companies to counter this surge have fallen short, and social network platforms continue to be used as ready avenues to spread dangerous white nationalist speech."

Facebook public policy director Neil Potts and Google public policy and government relations counsel Alexandria Walden testified, alongside several civil rights advocates, who accused the companies of allowing their platforms to empower white nationalists and white supremacists.

Potts said Facebook's policies dictate "white supremacists are not allowed on the platform under any circumstances," noting that Facebook has been increasing its efforts to remove hate content in the last few years. 

Facebook two weeks ago announced a new ban on "white nationalist" and "white separatist" content, which it had previously allowed on the platform. Potts explained the change to lawmakers who pressed Facebook over the white nationalist pages and accounts still on the platform. 

"When we become aware of these pages, we will remove them," Potts said. He noted that Facebook removes white nationalists "reactively," when users report them, but also engages in "some proactive surfacing of those to human reviewers" who determine whether posts should be removed. 

Walden touted Google's artificial intelligence tools, which are trained to remove violent and extreme content that violates Google-owned YouTube's community guidelines. But Walden also warned that "overaggressive" enforcement can censor some voices. 

"How is YouTube working to stop the spread of far-right conspiracies intent on skewing users’ perceptions of fact and fiction?" Rep. Hank Johnson (D-Ga.) asked Walden.

Walden responded by describing Google's new policy limiting recommendations of "borderline" content, an effort to limit the reach of conspiracy theories on the platform. 

But in a moment that illuminated the complicated issues at hand, YouTube was forced to shut down a chat feature on a livestream of the hearing. The feature was disabled "due to the presence of hateful comments,” Google confirmed to The Hill.

Nadler during the hearing read aloud a comment from a user named Celtic Pride that said "these jews want to destroy all white nations." 

After Nadler read the passage, Rep. Louie Gohmert (R-Texas) suggested the comments could be "another hate hoax ... just keep an open mind."

According to data from the Anti-Defamation League, white supremacists have been responsible for more than half of all domestic extremist murders in the past 10 years. In 2018, white supremacists committed 78 percent of all extremist murders in the country. And white supremacists have often used Facebook, Google, Twitter and more fringe social media platforms to organize and recruit new members. 

The pressure on tech companies to do more about extremist content is intensifying.

On Monday, the United Kingdom unveiled a plan to provide oversight of internet platforms' online content, including a threat to block platforms when they fail to take down harmful content.

The U.K. released the plan shortly after Australia last week passed sweeping legislation to fine and even jail the executives of social media companies if they do not remove "abhorrent violent material."

Rep. Pramila Jayapal (D-Wash.) asked the company representatives if they would submit to third-party civil rights audits, while others pressed the tech companies to coordinate in taking down white nationalist content the same way they coordinate to remove content from ISIS and al Qaeda.

But much of the attention at the hearing went to Turning Point USA communications director Candace Owens, one of the two GOP witnesses, as Democrats hammered her over past controversial comments.

At one point, Rep. Ted Lieu (D-Calif.) played a recent 30-second audio clip of Owens speaking about Adolf Hitler.

"Of all the people Republicans could have selected, they picked Candace Owens," Lieu said, before playing the audio in which Owens said of Hitler, "If Hitler just wanted to make Germany great and have things run well — OK, fine." 

Owens responded, "I think it's pretty apparent that Mr. Lieu thinks that black people are stupid and will not pursue the full clip in its entirety." Nadler cut her off, saying she should not "refer disparagingly" to a committee member.

Owens criticized Democrats for holding the hearing, saying it was an attempt to frighten minorities ahead of the 2020 presidential election. 

Democratic lawmakers countered by accusing President Trump and other elected officials of fueling and empowering white nationalist individuals and groups, which often espouse anti-immigrant views.

Hate crimes increased in the U.S. for the third year in a row in 2017, rising 17 percent from the previous year, according to an FBI report released at the end of last year.

“I am not saying that anybody, one person, one elected official, caused that, but there are corollaries there that we need to understand,” Eileen Hershenov, the Anti-Defamation League's senior vice president of policy, testified. 

Republicans and Owens rejected those accusations. Rep. Doug Collins (R-Ga.), the committee's ranking member, raised concerns about linking conservatives with the actions of extremists.

"I worry that the majority’s true motivation for this hearing is to suggest Republicans are hateful, dishonest and somehow connected to those characters who truly spew hatred and act on it in the public square," Collins said. 

Lawmakers made it clear they would watch tech companies closely as they addressed these issues.

"Figure it out," Rep. Cedric Richmond (D-La.) warned Google and Facebook.

"Because you don’t want us to figure it out for you."

Updated at 4:24 p.m.