The blinding of justice: Technology, journalism and the law


The legal profession is in the early stages of a fundamental transformation driven by an entirely new breed of intelligent technologies, and it is a perilous place for the profession to be.

If the needs of the law guide the ways in which the new technologies are put to use, they can greatly advance the cause of justice.  If not, the result may well be profits for those who design and sell the technologies, but a legal system that is significantly less just.

We have seen this same type of fundamental change play out over the past 20 years in the field of journalism — another arena in which the power of technology and the needs of people in a democracy are deeply intertwined.  Decisions made early in that transition, driven by what the technology could do rather than what journalism should provide, have had a profound impact on the types of information citizens receive.


The lessons learned about mistakes made in the early days of the transformation of journalism provide an exceptional opportunity to understand and shape the future use of technology in the law.

An important element of journalism is relevance.  Before the web, relevance was provided by editors, and it came in the form of op-ed pages, articles that provided background, long-form pieces that explored other angles of a story and pieces about contrasting points of view.  A reader would learn not only about the issue at hand but also about other information and views relevant to it.

When online journalism went to scale, the technology could not provide relevance, but it could provide similarity, and as a result, similarity became a proxy for relevance.  Start with a set of words or a document, and your favorite search engine will find you others that are like it.
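The gap between similarity and relevance is easy to see in a minimal sketch.  The snippet below uses cosine similarity over word counts — a generic textbook measure, not any particular search engine's algorithm, and the example headlines are invented — to show how a word-overlap score rewards a near-duplicate story while scoring a relevant, contrasting piece at zero:

```python
# Illustration of similarity vs. relevance: cosine similarity over
# bag-of-words counts. A generic textbook measure, not any particular
# engine's algorithm; the example texts are invented.
import math
from collections import Counter

def cosine_similarity(a: str, b: str) -> float:
    """Score how much two texts overlap, weighted by word frequency."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)  # Counter returns 0 for missing words
    norm_a = math.sqrt(sum(c * c for c in va.values()))
    norm_b = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

story    = "city council passes school budget"
similar  = "council passes budget for city schools"
relevant = "why critics oppose the new spending plan"

print(cosine_similarity(story, similar))   # high: many shared words
print(cosine_similarity(story, relevant))  # zero: no shared words, though relevant
```

A recommender built on a measure like this reliably surfaces the near-duplicate and reliably misses the contrasting piece — precisely the kind of relevant content that editors once supplied.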

It seems reasonable.  Give users what they want by providing them content similar to what they have seen before.  What it systematically leaves out, however, is other content that is relevant, and so a key component of journalism has gone largely missing in people’s consumption of news.

No one picked similarity over relevance when the transformation of journalism began; it was just what was at hand.  The technology was available and economic considerations drove it forward.  It is not a leap to say that this has played a large part in the polarization of society.


How does this relate to the law?

We are entering an era of technology that goes well beyond the web.  The law is seeing the emergence of systems based on analytics and cognitive computing in areas that until now have been largely immune to the impact of technology.  These systems can predict, advise, argue and write and they are entering the world of legal reasoning and decision making.

Sites like LegalZoom provide templates for contracts, wills, and articles of incorporation.  Companies such as BlueStar and services like OpenText’s Axcelerate provide text analytics and machine learning in support of intelligent discovery, while OpenText’s Perceptiv can do deeper contract analysis. 

We are seeing the emergence of products such as Lex Machina that can learn from the decision-making history of a court or judge and then predict outcomes or provide insights into the effectiveness of opposing counsel.  Some firms have begun to use analysis of historical information to assess the profitability of cases to determine pricing. 

The Wisconsin Supreme Court recently considered a case involving software used by judges that provides algorithmic predictions of offenders’ recidivism rates to help in sentencing.  The system uses data associated with an individual to provide predictions that include a risk of recidivism assessment.  At the heart of the case was the assertion that the system, which provides information that would appear useful to judges, is completely opaque about how it makes its predictions.

When asked if he thought algorithmic approaches to the law would ever show up in the courtroom, U.S. Supreme Court Chief Justice John Roberts told the New York Times, “It’s a day that’s here and it’s putting a significant strain on how the judiciary goes about doing things.”

Because these modern technologies are based on algorithms that run inside the cold silicon of the machine, it is easy to think that while they may suffer from lack of empathy, they will at least be objective. Unfortunately, while systems built on the foundation of historical data and predictive analytics are powerful, they are also prone to bias and can provide advice that is based on incomplete or imbalanced data.

Just as similarity was used largely without regard to the goals and tenets of journalism, we are now looking at even more powerful technologies that may be deployed in the law without enough consideration of how they could and should be guided by human legal forethought.

Without that human dimension we are on a path toward tech-based systems that encourage legal arguments aimed at specific perceived judicial biases.  Sentencing will be guided by predictive systems that skew toward known outcomes rather than approaches that would change them.  Law firms will be guided by models that steer them toward easy wins rather than difficult challenges.

Even if a technology is close to perfect from an engineering perspective, it might not be the right product.  A system that does a perfect job of predicting recidivism rates, for example, would be less valuable than a system that accepted as input actions a court might take, such as halfway houses, employment support, and drug rehabilitation programs.  The alternative system could make use of the same core technologies, but would be in tighter alignment with judicial goals.

We are not arguing against the development of such technologies.  The key question is who will guide them.  The transformation of the field is in its early stages.  There is still opportunity to ensure that the best intentions of the law are built into these powerful new systems so that they augment and aid rather than simply replace. This requires the sensible engagement of lawyers, social scientists, engineers and others with both expertise and a deep interest in promoting and preserving justice.

If citizens, leaders of the legal profession, and public officials do not guide this process, those whose goals are less aligned with the improvement of the justice system and more aligned with profits will do it for them.  What a shame that would be, especially when the law has the lessons learned from the convulsions of journalism to show a better way.

Kristian J. Hammond is a professor of Computer Science and Journalism at Northwestern University and chief scientist at Narrative Science, a company focused on automated narrative generation from data.

Daniel B. Rodriguez is the Harold Washington professor at Northwestern University’s Pritzker School of Law, where he served as dean from 2012 through August 2018. His principal academic work includes the law-business-technology interface.