Risk prediction in criminal justice in the United States: the ProPublica-COMPAS case

Special report: “Ethics of AI” field research
AI ethics in action
By Valérie Beaudouin, Winston Maxwell

An article published in 2016 by the independent non-profit news organization ProPublica argued that COMPAS, software used in the United States to predict recidivism, was ‘biased against blacks’. The publication sent shockwaves through the public sphere, fuelling broad debate on the fairness of algorithms and the merits of risk prediction tools, debates that had previously been confined to specialists in criminal justice. Starting with the ProPublica-COMPAS case, we explore the various facets of this controversy, both in the world of data science and in the world of criminal justice. In the media sphere, the COMPAS affair brought to the surface the potential abuses associated with algorithms and intensified concerns surrounding artificial intelligence (fear of AI replacing human judgment, worsening inequalities, and opacity). In the academic world, the subject was pursued in two separate arenas. In the data science arena, researchers focused on two issues: fairness criteria and their mutual incompatibility, showing just how problematic it is to translate a moral principle into statistical indicators; and the supposed superiority of machines over humans in prediction tasks. In the criminal justice arena, which is much more heterogeneous, the ProPublica-COMPAS case strengthened the realization that predictive tools must be evaluated more thoroughly before they are used, and that it is necessary to understand how judges use these tools in context, leading lawmakers and NGOs defending prisoners’ rights to revise their positions. While the data science arena is relatively self-contained, focusing on data and algorithms outside their operational context, the criminal justice arena, which brings together heterogeneous actors, focuses on how the tools are actually used in the criminal justice process.
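The mutual incompatibility of fairness criteria mentioned in the abstract can be made concrete with a small numerical illustration. The sketch below is not drawn from the article and uses entirely hypothetical numbers; it relies on the standard identity linking false positive rate, positive predictive value, true positive rate, and base rate to show that when two groups reoffend at different rates, a tool with the same predictive value and the same detection rate for both groups must produce different false positive rates.

```python
# Minimal sketch (not from the article) of the incompatibility result debated
# in the data science arena: with unequal base rates, a risk tool cannot
# simultaneously satisfy predictive parity (equal PPV) and error-rate balance
# (equal FPR and FNR) across groups. All numbers are hypothetical.

def implied_fpr(prevalence: float, ppv: float, tpr: float) -> float:
    """Identity: FPR = p/(1-p) * (1-PPV)/PPV * TPR."""
    return prevalence / (1 - prevalence) * (1 - ppv) / ppv * tpr

# Assume the tool has the same positive predictive value and the same
# true positive rate for both groups...
ppv, tpr = 0.6, 0.7

# ...but the groups have different (hypothetical) base rates of reoffending.
for group, prevalence in [("group A", 0.5), ("group B", 0.3)]:
    print(f"{group}: base rate={prevalence:.0%}, "
          f"implied false positive rate={implied_fpr(prevalence, ppv, tpr):.1%}")

# Output: 46.7% vs 20.0% -- equalizing one fairness criterion
# forces a disparity in the other.
```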

  • risk assessment
  • predictive algorithms
  • criminal justice
  • algorithms
  • fairness
  • controversy
  • algorithm evaluation