Facebook regulations won’t work — misinformation is coded into Facebook by human nature

In response to charges of fake news and foreign interference in elections, Facebook executives lament that they were blindsided by how their platform has been misused and abused. Such surprise is either naive or a misdirection from a company that prided itself on understanding and monetizing human behavior.

Unfortunately, Facebook has little ability to change the human behaviors it exploits with its algorithms, and regulations will not eliminate the abuse and misinformation either.

Facebook has been a great platform for social interaction precisely because it plays upon our desire to connect and find purpose within our communities. Facebook has brought people together around common interests, and long-lost friends have found one another and rebuilt connections. The resulting good has been immense, as Facebook has played a role in raising funds for causes that have helped thousands.

Unfortunately, those same human desires and tendencies are a source of abuse, hate, trolling, cyberbullying, misinformation, rumors, jealousy and misogyny. Sadly, social media is now a public health risk, with documented rises in depression and suicide rates.

At the crux of the problem are natural and well-understood human behaviors, including homophily and confirmation bias. Homophily is the tendency of individuals with similar beliefs (e.g., on cultural, political, racial or drug-related issues), social demographics and other characteristics to come together.

Facebook creates a natural environment for individuals to join with similar others while excluding those with contrarian views. These behaviors on social networks have popularized terms such as echo chambers, the consumption treadmill, "birds of a feather flock together," and the information cocoon or bubble.

These groups reinforce one another’s beliefs through information or misinformation. This dynamic thrives on confirmation bias, the tendency to listen only to information that confirms our existing beliefs. Easy-to-create memes with (mis)quotes and doctored videos and pictures flourish, and misinformation becomes the perceived reality as groups consume and amplify reinforcing views.

Our research shows strong homophily behaviors even in financial investing. We found that investors who hold stocks for the long term select only bullish headlines to read. By contrast, those who have shorted a stock (i.e., those who believe its price will decline) select mostly bearish headlines. Contrary headlines are ignored.

On investor message boards, contrarians are mauled by others and quickly exit, so the boards become echo chambers. Even when individuals’ only reason to invest is to maximize their gains, they seek information that confirms their prior beliefs, often to their loss. This is classic avoidance of cognitive dissonance.

Facebook doesn’t counteract these psychological behaviors; it feeds on them. People with similar interests will further their beliefs with real or fake information that aligns with their prior convictions. This is particularly true with divisive social issues like religion, guns and politics. In other words, even if Facebook prevents foreign agencies from perpetuating fake news, your friends and neighbors will take over.

Facebook has long boasted that its algorithms can customize information around an individual’s interests and prior beliefs to drive engagement. It has learned to mine data and monetize its platform. That capability is now haunting the company, and there are no easy solutions. The algorithms would have to be redesigned around highly subjective ethical norms, and such changes could hurt Facebook’s revenue and profit.

With increasing talk of regulations around privacy and interference, Facebook can only do so much to prevent misleading content and misinformation. This menace will continue on one platform or another unless a company is willing to sacrifice revenue by adopting unbiased algorithms and protecting data privacy, incur greater costs policing misinformation, and introduce censorship.

In the meantime, entities of every kind, foreign or domestic, will continue to exploit these biases to nudge people toward an agenda. That’s just human nature.

Prabhudev Konana is a distinguished teaching professor and the William H. Seay Centennial professor of business in the McCombs School of Business at The University of Texas at Austin.