Why the Logan Pauls of the world can push the boundaries of privacy and good taste

Social media is having its difficult adolescence. Facebook is approaching its 14th birthday, YouTube is 13, and Twitter is almost 12. In each case, a happy childhood has been replaced by awkward teen or tween years. In recent weeks, each of these companies has suffered embarrassing setbacks.

Facebook was sharply criticized for segmenting its users into “filter bubbles,” for allowing foreign money to interfere with the last U.S. presidential election, and for failing to stop the spread of “fake news” on its platform. Some former Facebook executives and engineers have argued publicly that social media is addictive and that companies (including Facebook) have taken advantage of this fact for profit.

Twitter was recently sued by notorious troll Charles Johnson on the grounds that its silencing of him violated his free speech rights under the California Constitution. At the same time, Twitter has been forced to defend its failure to discipline public figures such as Donald Trump, who has insulted and threatened foreign and domestic entities, including threatening North Korea with nuclear attack.

Nothing encapsulates social media’s difficult adolescence like the case of Logan Paul and his infamous “suicide forest” video. Paul is a 22-year-old YouTube star who, despite (or perhaps because of) his 15 million subscribers, seems stuck in his own adolescence. He became famous as a teenager for creating viral micro-videos on the Vine platform, and switched to YouTube around the time Vine closed up shop.

For the past 18 months or so, Paul had published daily video logs, or “vlogs,” documenting his fabulous lifestyle, while making millions of dollars from his share of the advertising revenue that his videos generated for YouTube. Recent videos featured Paul gurning for the camera while buying expensive cars or a Gucci outfit worth over $10,000. On Dec. 31, Paul published a video in which he and several sidekicks traipsed into Japan's Aokigahara forest, known as the “suicide forest,” and documented their encounter with a dead body they found hanging from a tree.

Paul’s video was met with almost universal dismay. He removed the video, tearfully apologized, and now faces the prospect of the end of his career. YouTube announced that Paul would no longer receive “Google Preferred” advertising revenue status, nor would he appear in a movie and other original content planned by the platform.

Paul’s story is exceptional, but for many people it underscored the open questions we have about privacy in the digital age. At a time when everyone seems equipped with smartphones capable of high-definition video capture, the boundaries between public and private in the physical world have blurred. Many teenagers are inspired by the success of Paul and other YouTube vloggers (including Paul’s younger brother Jake), hoping that they, too, can become rich and famous, needing no more than their smartphones or a GoPro to build an audience.

These are fantasies that YouTube and other platforms that depend on such “user-generated content” have a strong financial incentive to support. Yet in seeking (or in Paul’s case, retaining) their audiences, YouTubers must constantly provide new content, some of which can push the boundaries of good taste or the law.

Paul’s antics violated basic norms of decency, and might have violated the law. In the United States, there are long-standing laws that allow victims to sue people who invade their privacy or offensively publish private facts about them. Most of these laws date to the early 20th century, and were a response to the spread of old-fashioned analog Kodak cameras and the mass media. But our law has failed to evolve to protect against new uses of the ubiquitous digital cameras of today. One sorely needed reform is a general protection against the non-consensual distribution of intimate images, commonly referred to as “revenge porn.”

YouTube, though, is largely immune from liability, even as it creates incentives for the Pauls of the world to push the envelope. A federal law, Section 230 of the Communications Decency Act, largely immunizes platforms such as YouTube from liability for illegal content posted by their users. The legal document that places the greatest constraint on outrageous behavior by vloggers is YouTube’s Terms of Service, a contract that allows the platform to block offensive content if it chooses.

This illustrates the immature level of our laws in this area — the only real constraints on offensive behavior on social media platforms are the dense legal policies of the companies themselves and the fickle market of customer attention and clicks.

We need new laws in this area, to ensure the relationships between companies and the public can be built on a foundation of trust — laws that respect the people who use these services as consumers and citizens. This is not to say that we should act as heavy-handed censors; such an approach would be unwise and unconstitutional, and social media companies already employ thousands of censors called “content moderators.”

What we need instead is to shine the light of publicity on the business practices of social media companies, to better understand how our ability to express ourselves in the digital age is mediated by a few powerful companies whose actions are largely unchecked by law.

We should try to develop a kind of consumer protection law for the digital age — baseline rules designed to protect people from offensive invasions of privacy and the mostly unchecked power of social media companies. We need laws for the free expression issues of the information age.

Silicon Valley long has touted the virtues of “disruptive innovation.” In their adolescence, social media companies are facing the consequences of some of their disruption — consequences that we are facing with them. It is time to produce reasonable legal rules that nudge these platforms into responsible adulthood.

Neil Richards is the Thomas and Karole Green Professor of Law at Washington University, where he directs the Institute for Policy in Medicine and Law. An internationally recognized expert in privacy law, information law and freedom of expression, he also is an affiliate scholar with the Stanford Center for Internet and Society and the Yale Information Society Project, and a fellow at the Center for Democracy and Technology. Follow him on Twitter @neilmrichards.