When schools closed down this spring, our kids’ education and social lives shifted online. Tech companies now have more influence than ever before over children’s lives, but too many platforms haven’t been designed with kids’ safety and wellbeing in mind, and the harms are mounting.
Perhaps most disturbing is the surge in reports of online child sexual abuse in recent months. The National Center for Missing and Exploited Children received more than 4 million such reports in April — an increase of nearly 3 million from April 2019. The FBI issued a warning to parents in March, and the Los Angeles Times reported that law enforcement officials in L.A. were “overwhelmed in recent months by a surge in tips about online child sex abuse, with social media platforms and other service providers flagging explicit content and suspicious interactions at an alarming rate.” Some sexual predators are even seizing on the opportunity, circulating a “handbook” on the dark web for how to exploit children online during the pandemic.
Even before the pandemic, multiple studies had linked heavy social media use to significant increases in depression and suicidality among young people over the past decade.
Many platforms use the vast amount of data they collect on children, along with powerful technologies like artificial intelligence, to deliver the content most likely to keep kids on their devices, regardless of whether that content is appropriate for children. Features like autoplay and infinite scrolling, designed to be addictive, can make it difficult for developing children to get the offline time they need to thrive, especially given today's combination of pandemic-related anxiety, social isolation and skyrocketing time spent online.
Parents overwhelmingly tell us in surveys that they’re concerned about their kids’ social media and game use. And with good reason: Technology isn’t optional right now; the platforms are required by our schools and, for many children, are the only way they can talk to their friends.
We know from years of experience that many tech companies have consistently prioritized profits over our children's wellbeing. In fact, just a few weeks ago, our organizations and others filed a complaint with the Federal Trade Commission documenting TikTok’s repeated failure to comply with COPPA (the Children’s Online Privacy Protection Act).
The problem is that no amount of informed parenting can protect kids from platforms designed in ways that leave them exposed to harm.
Just as Congress intervened in the past to protect kids from underage smoking and drinking when the tobacco and alcohol industries failed to do so, the online risks posed to kids by unregulated tech companies require Congress to act.
Three recently introduced bills would go a long way toward protecting kids: the KIDS Act, the Invest in Child Safety Act and a comprehensive amendment to update COPPA. Collectively, these bills ban targeted advertising directed at children, strengthen privacy protection for children and teens, ban autoplay and push alerts targeted toward kids under 16 years old, crack down on violent and otherwise harmful content, and prohibit platforms from recommending unboxing videos.
With experts recommending continued social distancing in the months and years to come, the problem isn’t going away. In a matter of months, our kids have forged a new relationship with technology, and even the children of the most vigilant parents are spending more time than ever online. The question is whether we are going to require that platforms be engineered with children's safety in mind.
We cannot afford to wait to act until after the pandemic is over, which may take years. Our children need help from the adults in the room now in order to stay safe.