Let's enact a privacy law that advances economic justice


Two U.S. senators recently threw down the privacy gauntlet, proposing bills to give Americans control over their personal data. Sen. Maria Cantwell (D-Wash.) introduced a bill containing the Democrats’ privacy wish list, while Sen. Roger Wicker (R-Miss.), the Commerce Committee chairman, countered with an industry-backed Republican proposal.

Both bills, along with other proposals on the table, are an improvement over the largely unregulated big data ecosystem in which we currently reside. However, these bills miss the chance to include provisions that would help low-income Americans gain economic mobility in the face of digital profiling that can trap them in poverty.

At all hours of the day, and deep into the night, our data is being harvested, aggregated and sold.  Businesses generate immense profits from this data mining, making use of our buying habits, social relationships, political preferences, lifestyle, health and personality.  


This is why we are followed around the internet with relentless ads for items we may have only briefly perused. However, for low-income people, the consequences of digital profiling can be much more dire.  

Their digital dossiers can limit their ability to rent a home, get a car loan, gain college admission, access health care, or be hired for a job. This is because data profiles are rife with erroneous data and filled with dubious inferences drawn from individual data as well as social networks. At the same time, the algorithms that fuel automated decision-making can be discriminatory.

Even though automated decision-making is mostly invisible to consumers, evidence of these problems is mounting. For example, Facebook recently settled a lawsuit alleging that it allowed advertisers to micro-target people for housing, loans and help-wanted ads based on race, gender, age, “ethnic affinity” and other legally protected attributes.  

A federal court in Connecticut has allowed a lawsuit to go forward against a tenant-screening company that rejects applicants on the basis of criminal records without measures to ensure accuracy or consider mitigating factors. 

Some colleges secretly track high schoolers who visit their websites, and then target potential applicants who look like they can pay full freight, while possibly excluding disadvantaged students from recruiting efforts.


The pending privacy bills vary in the tools they bring to these sorts of challenges. Certainly, the Cantwell bill, which limits processing of sensitive data and bans algorithmic discrimination, is more attentive to the differential impacts of technology.

Still, there is more we can do to enhance economic mobility, which not only helps individuals but also boosts the economy. A helpful template comes from the General Data Protection Regulation (GDPR), the comprehensive privacy law that has applied since 2018 in the European Union.

Indeed, the GDPR is part of the impetus for congressional action, especially given that American Big Tech companies already comply with the GDPR for their European consumers — and the sky has not fallen. Here are some provisions that protect our European counterparts with the potential to advance both privacy and economic justice on our shores. (Some of the proposed bills contain some of these measures; none contains them all.) 

First, people should have a right to an explanation when automated decision-making denies them access to life necessities such as jobs and housing. Such explanations could help ferret out erroneous data, incorrect coding and biases.  

Second, people should have recourse to a human decision-maker to challenge automated decisions. Keeping a “human in the loop” enhances transparency and accountability.

Third, people should have the right to delete personal data in the hands of third parties to gain a clean digital slate, free from stigmatization and outdated inferences.  

Fourth, people should have an ongoing voice in the data regimes that govern them. The Cantwell bill would require impact assessments of algorithms to screen for bias. But in the EU, similar impact assessments also include opportunities for stakeholders to share their perspectives.  

Fifth, any new privacy law needs vigorous enforcement, including a private right of action, which the Democrats favor and Republicans oppose. Rights without remedies are meaningless.

Finally, the Cantwell and Wicker bills cover private businesses, but the GDPR also regulates government agencies and employers when they collect and process data. Given the importance of the safety net in the lives of low-income people, and the technologically driven stresses in the low-wage workplace, data privacy laws eventually will need to cover these entities as well.

Americans need better data privacy protections, but we do not all experience digital technologies in the same way. As Congress considers legislation, it should seize the opportunity to enhance economic opportunity through greater data privacy controls.    

Michele Gilman is the Venable Professor of Law at the University of Baltimore School of Law, where she directs the Civil Advocacy Clinic.  She is also a faculty fellow at Data & Society Research Institute. Follow her on Twitter @profmgilman.