TikTok is expanding its parental controls, giving parents more oversight of content and the ability to limit whom their children can interact with on the hugely popular short-video social media app.
The update centers on privacy and content restrictions, giving parents the option to restrict who can comment on or view their children's TikTok accounts, according to a press release.
TikTok already offers parental controls that limit or turn off direct messaging.
The new content-restriction features will allow guardians to filter precisely what content their child can see and block certain content from appearing in search results.
Parents can tailor their preferences to disallow specific search terms, users and hashtags.
As a more automatic option, TikTok now offers a single-button "restricted mode" that hides age-inappropriate content, along with a tool to set screen-time limits.
The extensive parental features are all part of TikTok's previously launched Family Pairing, which allows a minor's account to be linked to an adult's.
TikTok said part of its rollout of additional parental control options would include programs to "strengthen our youth safety and well-being policies."
The company said those measures include additional guidelines and resources promoting body positivity, and the release added that TikTok would remove "harmful content like hateful ideologies."
TikTok has faced criticism that it does not do enough to protect its underage users' information and privacy.
The company was fined $5.7 million by the Federal Trade Commission in 2019 for violating the Children’s Online Privacy Protection Act.
Authorities had accused the company of collecting data from children and broadcasting their location.