Beyond the Algorithm: Pretrial Reform, Risk Assessment, and Racial Fairness
by Sarah Picard, Matt Watkins, Michael Rempel, and Ashmini G. Kerodal

Risk assessments, automated formulas that estimate the “risk” that a defendant will be rearrested or fail to appear in court, are among the most controversial issues in criminal justice reform. To proponents, they offer a corrective to potentially biased decisions made by individual judges. To opponents, far from disrupting biases, risk assessments unintentionally amplify them, only this time under the guise of science.

Drawing on a case study of more than 175,000 defendants in New York City, ‘Beyond the Algorithm: Pretrial Reform, Risk Assessment, and Racial Fairness,’ produced with the support of Arnold Ventures, examines the impact of risk assessment on racial disparities in pretrial decisions. Some key conclusions:

- Business-as-usual approaches to pretrial decision-making often fall short of accurately assessing risk and reducing unnecessary pretrial detention.
- Concerns over risk assessments perpetuating racial disparities are real, even when the assessment tool itself is deemed free of bias.
- Even given these concerns, at least in the New York City example, a more targeted use of risk assessments shows the potential both to significantly reduce pretrial detention and to alleviate racial disparities.

To realize this potential, however, the authors conclude that jurisdictions must think “beyond the algorithm”: decide what they want a risk assessment to accomplish, and then put in place policies and practices in support of those aims.
If the goals are reducing incarceration and promoting racial fairness, then a more targeted use of risk assessments could be of particular benefit to the group that currently experiences the worst pretrial outcomes: defendants of color.