Dems want more details on Russian social media content

Social media companies are focusing on whether Russia manipulated their ad platforms to influence the 2016 election, but some lawmakers and experts think that the real problem lies in content posted by fake users and Russian-linked bots.

Lawmakers like the Senate Intelligence Committee's top Democrat, Sen. Mark Warner (Va.), and researchers doubt that paid advertising was the primary vehicle for Russia’s election interference.

Instead, they see a strong likelihood that Russia created “organic content” related to the election — often in the form of inflammatory content — to divide and influence Americans.

“Probably more important is, what was the content they were pushing out that was nonadvertising. I think that will probably dwarf what we’ve seen in the paid advertising,” Rep. Adam Schiff (Calif.), the House Intelligence Committee’s top Democrat, told The Hill on Wednesday.

Lawmakers will have a chance to dig deeper into the effect of such posts next week, when top lawyers from Facebook, Twitter and Google testify on Capitol Hill. The lawyers will testify before the House and Senate Intelligence Committees on Wednesday, as well as the Senate Judiciary Committee on Tuesday.

Facebook has informed Congress of 3,000 ads on its platform that it believes were purchased by Russian actors, while Twitter has turned over undisclosed advertising information. But Warner has said he thinks those ads are just the “tip of the iceberg.” 

Because Facebook and Twitter have not yet released information about “organic” content linked to Russia, and because they have purged their platforms of many Russian-linked accounts, the full reach of the activity is not known — but there are clues. 

One Twitter account, called @TEN_GOP, posed as the Twitter account of Tennessee Republicans and fooled scores of people into thinking it actually represented the state’s Republican Party. The account’s tweets were widely shared.

According to numbers provided by Jonathan Albright, a digital media expert and professor at Columbia University, one now-deleted tweet from the account about Hillary Clinton drew 3,574 social media interactions — defined as Facebook likes, Facebook reactions, direct tweet shares and retweets.

President Trump’s own advisers fell for the account. Trump’s digital campaign director Brad Parscale, counselor Kellyanne Conway and son Donald Trump Jr. all retweeted posts from @TEN_GOP, according to The Daily Beast.

Before being taken down by Twitter, @TEN_GOP amassed 140,000 followers. It’s unclear how many of the followers were real, but the buzz generated by the account suggested it was breaking through.

Another account duped Twitter’s own CEO Jack Dorsey. The Daily Beast found that in March of last year, Dorsey retweeted @Crystal1Johnson, an account that claimed to belong to an African-American woman and posted inspirational stories.

“Most people aren’t thinking 'oh this might be a bot,' ” says Renee DiResta, a social media network expert. “It’s unreasonable to expect users to be aware of that if it even fooled the CEO of the company.” 

DiResta says that while digging into ads purchased by Russian actors can be helpful, organic posts by bots and fake accounts, known as “sock puppets,” deserve more attention. 

People tend to assume that sponsored posts and advertisements come with an agenda and are suspicious of them, DiResta says. People are far more willing to believe a post when it comes from another user, especially if the information comes from a user that they know and trust. 

“If you see something organic that looks like it’s being shared by your friends, then you might not realize you’re being targeted by something,” she said.

Though such posts often don’t start within user networks, they can reach them when they’re posted on a page followed by many users. Facebook pages like “Heart of Texas” were secretly being run by Russian-linked groups and amassed hundreds of thousands of followers. Though some of these were likely fake, real people followed these accounts and could spread content into their own networks of friends and family.

Research suggests that these types of accounts facilitated a significant dissemination of content.

Albright found that just six of the almost 500 fake account pages Facebook has revealed reached around 340 million users through their posts, which garnered 19.1 million interactions, defined as likes, shares and comments.

Facebook has agreed to turn over to Congress some organic posts associated with Russian-linked accounts, according to Schiff, and has already begun the process of providing the House Intelligence Committee with that information.