
Communications of the ACM

ACM TechNews

Risk Assessment Algorithms Can Unfairly Impact Court Decisions


A judge's gavel.

Pretrial risk assessment algorithms are intended to help judges make more informed decisions. Researchers have raised concerns not only about the fairness and accuracy of the tools themselves, but also about their influence on judges' thinking.

Credit: Government Technology

A study by the University of Michigan's Ben Green and Harvard University's Yiling Chen suggests pretrial risk assessment algorithms increase the likelihood that judges will change their priorities in making pretrial decisions.

The algorithms use data on previous defendants' outcomes to forecast the risk that a given arrestee will fail to appear in court or be arrested again, presenting the forecast either as a numerical score or as a high-, medium-, or low-risk designation.
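To illustrate the kind of output described above, here is a minimal sketch of mapping a numeric risk score to a coarse risk label. The score range and thresholds are purely illustrative assumptions, not the actual tool or cutoffs discussed in the article.

```python
# Illustrative sketch only: a hypothetical mapping from a numeric
# pretrial risk score to a coarse category. The 0-10 range and the
# cutoffs of 4 and 7 are assumptions for demonstration, not the
# real tool described in the study.

def risk_category(score: float) -> str:
    """Map a numeric risk score on a 0-10 scale to a risk label."""
    if score < 4:
        return "low"
    if score < 7:
        return "medium"
    return "high"

print(risk_category(2.5))  # low
print(risk_category(5.0))  # medium
print(risk_category(8.0))  # high
```

Whether a judge sees the raw score or only the categorical label is a design choice; the study's concern is that either presentation can shift which factors judges prioritize.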

The researchers found that viewing the algorithms' predictions caused participants to weigh factors differently and to give greater priority to the risk that defendants would fail to appear or be re-arrested, resulting in more inequitable decisions: Black defendants were more likely to be deemed high-risk and to receive harsher decisions than White defendants.

From Government Technology


Abstracts Copyright © 2021 SmithBucklin, Washington, DC, USA

