Protecting the future of student data privacy: The time to act is now


Building on the opinion piece by Stephen Balkam (“Who will keep kids safe in an AI world”), we — as parents, policymakers, and caretakers of the next generation — should be asking ourselves whether we’re doing enough to protect our children’s personally identifiable information (“PII”), given the increasing use of vendor software and apps (collectively, “Ed Tech”) in schools.

Consider the following: Summer is over, and on the first day of school your high schooler tells you about a new electronic hall pass system that requires input of student PII to leave class, even to use the bathroom. Days later, your elementary schooler brings home a form seeking parental consent for a gaggle of apps, a third of which are appropriate only for ages 13 and above (most elementary school children are under 13). The following week, your middle schooler tells you about a new college-prep tool he used at school; digging further, you find that its questions probed the ethnic, geographic, and socio-economic diversity he wants in a college, his interest in attending a denominational college, and more.

Yes, there are a variety of privacy laws protecting PII, especially that of children: COPPA, the upcoming California Consumer Privacy Act, various state privacy laws, and more. It’s also true that many Ed Tech vendors have privacy policies in place. But if you read the fine print, you find that many share student PII with third parties, and within their larger corporate conglomerates, creating growing dossiers about our children from elementary school through high school. Consider the recent Washington Post article about student data being aggregated and used by colleges when weighing admission applications.


In short, what we have is a growing use of technology in schools, coupled with the collection of students’ PII and associated activity (such as browsing history), at a time when the new “oil” is personal data that can be sold, shared or otherwise monetized.

That’s not to say that Ed Tech in schools is bad. From the under-resourced public-school perspective, Ed Tech is a force multiplier. From the parents’ perspective, exposing kids to technology that enriches their academic experience is important. And from the Ed Tech vendors’ perspective, using data to improve the software and customize learning, while turning a profit, is fair game.

So the real question is: How do we ensure that Ed Tech vendors are responsible stewards of our children’s data?

Fortunately, we’re at an inflection point when it comes to student PII. Parents are increasingly focused on protecting their children’s PII, and even the federal government is beginning to ask questions. The FTC is currently revisiting COPPA, and in August 2019, Sens. Dick Durbin (D-Ill.), Ed Markey (D-Mass.) and Richard Blumenthal (D-Conn.) sent a letter to a large group of Ed Tech companies inquiring about how they handle and protect student data.

Interestingly, the Ed Tech challenge is not dissimilar to one I faced as the Civil Liberties and Privacy Officer for the National Counterterrorism Center, part of the intelligence community, when trying to convince the American public of our good data stewardship after the Edward Snowden leaks.


The answer was to implement proactive audits and spot checks, and to make the results public. That approach earned the praise of the President’s Privacy and Civil Liberties Oversight Board for striking a balance between the need for secrecy and the need to prove to the public that we were keeping our promises to protect the data entrusted to us.

Arguably, had Facebook had a proactive audit and spot-check program in place, it wouldn’t have taken a Cambridge Analytica and regulators on two continents to figure out what Facebook could have ascertained itself. Likewise, the recent news about undisclosed data sharing between Google and Ascension health systems (“Project Nightingale”) is just the latest example of tech-company data sharing run amok.

Notably, the Future of Privacy Forum proposed a voluntary privacy pledge for Ed Tech vendors to protect student PII. To date, over 300 have signed. But there’s no mechanism in place to validate compliance. And if there are two things I’ve learned from my work in the intelligence community, they are:

  1. Proactive audits and monitoring are key to ensuring that policies are working as intended (rarely the case, at least initially); and
  2. To create trust, you must be transparent about the results of that auditing and monitoring.

As parents and policymakers responsible for stewarding our children’s digital future, we must be able to trust that Ed Tech is handling student PII appropriately. And the best way to do that, to quote President Ronald Reagan, is to “trust, but verify.”

Joel Schwarz is a senior principal at Global Cyber Risk, LLC and an adjunct professor at Albany Law School, teaching courses on cybercrime, cybersecurity and privacy. He previously served as the Civil Liberties and Privacy Officer (CLPO) for the National Counterterrorism Center.