An analysis of racial bias in criminal sentencing
Can an algorithm be racist? An examination of the COMPAS algorithm, used as an aid in sentencing and parole decisions. Also featuring a critique of the critique!
Tags: logistic regression, race politics and law, algorithmic bias
Readings and links
- Can you make AI fairer than a judge? Play our courtroom algorithm game
- Machine Bias by Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner for ProPublica
- Methodology
- The data itself
- An excellent Jupyter notebook by Jonathan Stray
- A fairness notebook by Jonathan Stray
- The accuracy, fairness, and limits of predicting recidivism, a paper that shows "that although COMPAS uses 137 features to make a prediction, the same predictive accuracy can be achieved with only two features" (a sketch of such a two-feature model follows this list)
- The age of secrecy and unfairness in recidivism prediction, a pseudo-rebuttal of ProPublica's work, arguing that "faulty assumptions about a proprietary algorithm lead to faulty conclusions that go unchecked without careful reverse engineering."
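
The two-feature claim is easy to test yourself. Here is a minimal sketch, assuming ProPublica's published `compas-scores-two-years.csv` (the column names `age`, `priors_count`, `two_year_recid`, and `decile_score` come from that file): a logistic regression on just two features, compared against COMPAS's own decile scores thresholded at 5, the cutoff ProPublica used.

```python
# A minimal sketch of the "two features" claim: predict two-year
# recidivism from age and number of priors alone, then compare against
# COMPAS's own decile scores. Assumes ProPublica's published dataset
# compas-scores-two-years.csv; column names below come from that file.
# (This skips ProPublica's additional row filters -- see their
# methodology writeup -- so exact numbers will differ slightly.)
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("compas-scores-two-years.csv")

X = df[["age", "priors_count"]]   # just two features
y = df["two_year_recid"]          # 1 = rearrested within two years

# Ten-fold cross-validated accuracy of the two-feature model
acc = cross_val_score(LogisticRegression(), X, y, cv=10).mean()
print(f"Two-feature logistic regression accuracy: {acc:.3f}")

# COMPAS's accuracy on the same outcome, treating decile_score >= 5
# ("medium" or "high" risk) as a positive prediction
compas_pred = (df["decile_score"] >= 5).astype(int)
print(f"COMPAS decile-score accuracy: {(compas_pred == y).mean():.3f}")
```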
Summary
So far we've used these techniques to analyze how "the world works." What can the same skills tell us about how someone else's algorithm works?
Notebooks, Assignments, and Walkthroughs
Breaking down machine bias
This notebook explores the ProPublica story Machine Bias. It uses the original data that the reporters collected for the story, through FOIA requests to Broward County, Florida.
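
The heart of the story is a comparison of error rates across races: ProPublica found that Black defendants who did not reoffend were roughly twice as likely as white defendants to be labeled high risk. Below is a minimal sketch of that comparison, again assuming ProPublica's `compas-scores-two-years.csv` and skipping their extra row filters.

```python
# A sketch of ProPublica's core fairness comparison: false positive and
# false negative rates of the COMPAS "high risk" label, by race.
# Column names and race labels come from the published Broward County
# dataset; this skips ProPublica's additional row filters, so numbers
# will differ slightly from the published ones.
import pandas as pd

df = pd.read_csv("compas-scores-two-years.csv")
df = df[df["race"].isin(["African-American", "Caucasian"])]
df["high_risk"] = (df["decile_score"] >= 5).astype(int)

for race, grp in df.groupby("race"):
    did_not_reoffend = grp[grp["two_year_recid"] == 0]
    did_reoffend = grp[grp["two_year_recid"] == 1]
    # FPR: share labeled high risk among those who did not reoffend
    fpr = did_not_reoffend["high_risk"].mean()
    # FNR: share labeled low risk among those who did reoffend
    fnr = 1 - did_reoffend["high_risk"].mean()
    print(f"{race}: FPR={fpr:.2f}, FNR={fnr:.2f}")
```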
Jupyter Notebook