TikTok announces new content rules against misinformation, terrorist activity
The hugely popular social media app TikTok on Wednesday tightened its rules around what kind of content it permits, clarifying that it will take down videos that contain misinformation, promote terrorism or incite hatred against minorities.
The Chinese-owned app, which revolves around short-form videos set to music, is overhauling its previously sparse community guidelines as it continues to face scrutiny over its ties to Beijing and questions about whether it censors content according to the Chinese government’s sensibilities.
In a blog post, the company said the guidelines are intended to foster a “rewarding and fun” community that can also grapple with “serious or controversial content.”
TikTok, which burst into Western markets over the last few years, is widely known for its never-ending stream of wacky videos and very young user base. In 2019, TikTok quickly became one of the most downloaded apps on both Apple's and Google's app stores, surpassing 1.5 billion downloads and edging out popular American social media apps such as Instagram.
The company says it has lagged behind other social media companies in areas such as content moderation and community standards as it has gathered hundreds of millions of users. Now, it is seeking to prove its maturity as it formalizes 10 categories of videos that it will not allow on its burgeoning platform.
Though TikTok says it tailors its content moderation policies to each region that it operates in, the guidelines released Wednesday are intended to form the “basis” for all of its policies.
“Today, we’re releasing a comprehensive, expanded publication of the Community Guidelines that help maintain a supportive and welcoming environment on TikTok,” wrote Lavanya Mahendran and Nasser Alsherif, who work on TikTok’s global trust and safety team.
The categories of videos that TikTok will not allow include those that promote terrorist ideologies, encourage criminal behavior, depict gratuitous violence, glorify self-harm and encourage hate speech.
The platform is seeking to differentiate itself from Facebook, its top competitor, with a wide-ranging policy banning misinformation that “could cause harm to our community or the larger public.” Facebook has long faced criticism for a narrower policy that allows politicians to lie and requires dubious claims to run through a third-party fact-checking service.
Because TikTok users are mostly under 18, its new guidelines have a special section dedicated to keeping minors safe. The company says it will not tolerate any content that “depicts or disseminates child abuse, child nudity or sexual exploitation of children” or any videos that show minors engaged in “delinquent behavior” such as consuming alcohol or drugs. The guidelines specifically ban adults from “grooming” children or building an emotional relationship with them through the TikTok app.
“These guidelines reflect our driving philosophy — providing a platform for creative self-expression while remaining safe, diverse, and authentic — and define a common code of conduct on our platform,” Mahendran and Alsherif wrote.
The platform will still likely face enormous scrutiny from policymakers and regulators who say the app poses a national security threat because it is owned by Chinese media conglomerate ByteDance. Private industry is deeply interwoven with the government in China, and lawmakers have raised concerns that TikTok could be forced to share data on young users with the Chinese Communist Party.
TikTok insists that it operates completely separately from its Beijing parent and has pledged to continue building out its U.S. roots over the upcoming year.