Here's why social media users need a ‘bill of rights’


President Donald Trump has vowed to investigate Twitter’s “shadow banning” of prominent Republicans, tweeting, “We will look into this discriminatory and illegal practice,” following reports that the social media platform limited the visibility of Rep. Matt Gaetz and other Republicans in searches while seeking to “improve the quality of discourse,” according to Vice News, which broke the story.

Twitter’s explanation in May of its intent to “improve the health of the public conversation” sounded Orwellian: “The result is that people contributing to the healthy conversation will be more visible in conversations and search.” As a journalist and executive director of a research center, I’ve recently experienced several instances in which social media giants appeared to label my content “unavailable” or “political” without explanation. With billions of people using Twitter and other social platforms, such practices point to a widespread need for more transparency.

Shadow banning is the blocking of a user or the user’s content in such a way that the ban isn’t readily apparent to the user. The practice comes amid a torrent of scandals and critiques of how social media giants, particularly Facebook and Twitter, are becoming too powerful. In the United States, Congress should step in to provide the same kind of consumer oversight the government provides for food products, airlines and other sectors.

In the United Kingdom, Damian Collins, the chairman of the House of Commons Digital, Culture, Media and Sport Committee, has warned that “fake news” spreading on social media could threaten democracy. With the touch of a button, what’s written online can reach millions of people around the world, he notes, and “if they can be effectively used to spread disinformation without the source of that information ever being revealed … then that is a threat we have to confront.”

The push is on to impose new regulations on social media giants. Barry Lynn, executive director of the Open Markets Institute, argues that as gatekeepers to the news, these companies can “pose dangers to even the most successful [news] outlets.” Legislators need to speed up the regulatory process, he writes.

The need to confront social media companies unites right and left, although for different reasons. George Soros argued at Davos in January that social media companies influence how people think and behave: “This has far-reaching adverse consequences on the functioning of democracy.” The left argues that social media spread “fake news” that undermines democracy. As evidence, critics point to accusations that Russia manipulated the U.S. elections in 2016 and influenced the Brexit vote in the United Kingdom. On the right, there is concern that conservative voices are being censored.

Facebook and Twitter are trying to balance freedom of expression with the demands of consumers and governments. For example, Facebook banned radio host Alex Jones in late July for bullying and hate speech, and it has tried to crack down on “fake news.” In a recent interview, Facebook founder Mark Zuckerberg said that the platform might host offensive material “but that doesn’t mean we have a responsibility to make it widely distributed in News Feed.”

The problem for consumers is that these companies are not transparent. Facebook rolled out a new algorithm in 2018 that it claimed was geared toward supporting “meaningful interactions.” Many bloggers began trying to decipher what that meant. “Meaningful interactions” is Facebook’s version of Twitter’s “healthy conversation.” But what about the users? They genuinely want to see the content of their friends and those they follow, but this algorithm limits content or tweaks what the user sees. Consumers have no way to really know how many people their posts will reach. They don’t know whether what they wrote was graded “healthy” or “meaningful.”

I recently tweeted about oil trade between Iran and Kirkuk, only to have my tweet labeled “sensitive.” Why? Twitter gave no explanation but kept some people from seeing what I posted. While promoting videos and content for the Middle East Center for Reporting and Analysis, we’ve found that Facebook often labels content “political” even when there is nothing political in the content. Although one can appeal the label, there’s no clear explanation of how the labeling came about, what constitutes a “political” word or phrase, and what change would make it “not political.”

Social media companies have an interest in limiting user knowledge so that marketers can’t exploit and manipulate the algorithm. But there has to be a happy medium, just as there is when a consumer buys ketchup or purchases an airplane ticket. In the early days of the internet, lawmakers were reluctant to regulate too much for fear of strangling innovation. But Facebook now has one of the largest market values in the United States, despite its recent stock plunge. It is time for Congress to provide the millions of Americans who use social media with a “bill of rights” and recourse to a regulator in dealing with these giants.

Globally, other countries should follow suit with a similar “bill of rights,” like the one for airline passengers, that provides users with mandated transparency about who sees their posts and whether they are being shadow banned or otherwise limited. We must be careful not to enable too much regulation or censorship, but it’s important that these influential companies meet certain standards, as other industries do.

Seth J. Frantzman spent three years in Iraq and other countries in the region researching the war on terror and Islamic State. He is executive director of the Middle East Center for Reporting and Analysis. A former assistant professor of American Studies at Al-Quds University, he covers the Middle East for The Jerusalem Post and is a writing fellow at the Middle East Forum. He is writing a book on the state of the region after ISIS.