Local authorities in the U.S. and Europe use predictive governance algorithms to assess people's risk of criminality, and base probation, jail time, and other decisions on such evaluations.
Critics warn that this removes humans and transparency from decision-making, and that developers are not legally bound to explain how their programs work—even as gender, class, race, or geographic prejudices may be embedded in the algorithms themselves.
Predictive algorithms use statistical risk determination to calculate the likelihood of future behavior from historical data.
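To make the idea concrete, here is a minimal sketch of a statistical risk score of the kind such tools compute: a logistic model that maps case features to a probability. The feature names, weights, and bias below are invented for illustration and are not drawn from any deployed system.

```python
import math

# Hypothetical model parameters -- invented for this sketch, not taken
# from any real risk-assessment tool.
WEIGHTS = {
    "prior_offenses": 0.35,        # more priors -> higher risk score
    "age_at_first_offense": -0.04, # older first offense -> lower score
    "months_since_release": -0.02, # longer without incident -> lower score
}
BIAS = -1.0

def risk_score(features):
    """Return a probability-like risk score via a logistic model."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic (sigmoid) function

score = risk_score({
    "prior_offenses": 3,
    "age_at_first_offense": 19,
    "months_since_release": 6,
})
print(round(score, 3))
```

The weights in a real system are fitted to historical outcome data, which is precisely how the critics' concern arises: if past decisions encoded bias, the fitted model reproduces it.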
The University of Pennsylvania's Richard Berk, who created algorithms used by Philadelphia's criminal courts to predict recidivism for probation decisions, compared algorithms to automatic pilots. Said Berk, "Automatic pilot is reliable, more reliable than an individual human pilot. The same is going to happen here."
From The New York Times
Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA