[Illustration: a person behind bars, a scale of justice, and a computer.]

🧭 Case Study 1: COMPAS and Criminal Justice Bias

An ethical examination of algorithmic risk scoring in the U.S. judicial system

📌 Background

COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) is a proprietary algorithm used to assess the likelihood that a defendant will reoffend. It is commonly used in sentencing, bail decisions, and parole evaluations.
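
COMPAS's actual model is proprietary and has never been published. Purely as a hedged sketch of what a recidivism risk scorer of this general kind can look like, the toy Python below maps defendant attributes to a 1-10 score (COMPAS-style tools do report decile scores, but the features, weights, and cut-offs here are invented for illustration, not taken from COMPAS):

```python
import math

# Invented feature weights for a generic logistic risk model.
# These are NOT COMPAS's features or weights, which are not public.
WEIGHTS = {"prior_arrests": 0.30, "age_under_25": 0.80, "unstable_housing": 0.40}
BIAS = -2.0

def risk_probability(features: dict) -> float:
    """Toy logistic model: estimated probability of reoffending."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def risk_decile(p: float) -> int:
    """Bucket a probability into a 1-10 score, the format COMPAS-style tools report."""
    return min(10, int(p * 10) + 1)

defendant = {"prior_arrests": 3, "age_under_25": 1, "unstable_housing": 0}
p = risk_probability(defendant)
print(f"p(reoffend)={p:.2f}, score={risk_decile(p)}/10")
```

The ethical concerns below turn on precisely this kind of logic being hidden: defendants see only the final score, never the features or weights that produced it.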

βš–οΈ Ethical Concerns

  • Bias and Discrimination: Independent studies, most notably ProPublica's 2016 investigation, found that COMPAS flagged Black defendants as high risk far more often than white defendants, even among defendants who did not go on to reoffend.
  • Transparency and Accountability: COMPAS is a closed-source system. Neither defendants nor their attorneys can examine how risk scores are calculated, and judges often rely on the score without understanding the underlying logic.
  • Due Process Violation: Defendants are assessed by an algorithm they cannot challenge, undermining principles of legal defense and procedural fairness. In State v. Loomis, the Wisconsin Supreme Court upheld the use of COMPAS in sentencing but required a disclaimer regarding its limitations.

📊 By the Numbers

  • Among defendants who did not reoffend, 45% of Black defendants were misclassified as high risk, compared with 23% of white defendants.
  • Among defendants who did reoffend, 48% of white defendants were misclassified as low risk, compared with 28% of Black defendants.
  • Figures come from ProPublica's review of 7,000+ risk scores in Broward County, Florida; the sketch below shows how such group-wise error rates are computed.
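
These headline figures are group-wise error rates: a false positive rate (labeled high risk among defendants who did not reoffend) and a false negative rate (labeled low risk among defendants who did reoffend), computed separately for each group. A minimal sketch of that computation, using a handful of invented stand-in records rather than the actual Broward County data:

```python
from dataclasses import dataclass

@dataclass
class Defendant:
    race: str         # group label used in the disparity comparison
    high_risk: bool   # tool's label: scored "high risk"?
    reoffended: bool  # observed outcome over the follow-up period

def error_rates(records: list[Defendant], group: str) -> tuple[float, float]:
    """Return (FPR, FNR) for one group.

    FPR = share of non-reoffenders labeled high risk
    FNR = share of reoffenders labeled low risk
    """
    members = [r for r in records if r.race == group]
    non_reoffenders = [r for r in members if not r.reoffended]
    reoffenders = [r for r in members if r.reoffended]
    fpr = sum(r.high_risk for r in non_reoffenders) / len(non_reoffenders)
    fnr = sum(not r.high_risk for r in reoffenders) / len(reoffenders)
    return fpr, fnr

# Invented toy records (race, labeled high risk?, reoffended?) -- not real data.
sample = [Defendant(*r) for r in [
    ("Black", True, False), ("Black", True, False), ("Black", False, False),
    ("Black", False, False), ("Black", True, True), ("Black", True, True),
    ("White", True, False), ("White", False, False), ("White", False, False),
    ("White", False, False), ("White", True, True), ("White", False, True),
]]

for group in ("Black", "White"):
    fpr, fnr = error_rates(sample, group)
    print(f"{group}: FPR={fpr:.0%}, FNR={fnr:.0%}")
```

ProPublica's finding was this pattern at scale: the two error rates diverge across groups in opposite directions, even when the tool's overall accuracy looks similar for both.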

🧠 Ethical Reflection

The use of COMPAS highlights the urgent need for algorithmic transparency and oversight in systems where liberty is on the line. Bias that once lived in human discretion now lives in code: faster, hidden, and harder to challenge.

When opaque systems shape liberty and no one can question their logic, we do not just risk injustice; we automate it.

This case demands we rethink the balance between data and due process.

πŸ› οΈ Clause Tie-In (from Doctrine)

Clause 6.2: Transparency in Automated Legal Systems

Any AI system used in criminal justice must be open to independent audit and challenge. Defendants must have access to the logic behind their algorithmic risk scores, and human judgment must retain authority over machine recommendations.
