Safety in Numbers?
Last month, The Atlantic profiled the work of University of Pennsylvania statistician Richard Berk.
Berk uses statistical prediction models to conduct risk assessment of criminal offenders.
Berk’s expertise is being sought at nearly every stage of the criminal-justice process. Maryland is running an algorithm like Philadelphia’s that predicts who under supervision will kill—or be killed. The state has asked Berk to develop a similar algorithm for juveniles. He is also mining data from the Occupational Safety and Health Administration to forecast which businesses nationwide are most likely to be breaking OSHA rules. Back in Philadelphia, he is introducing statistics to the district attorney’s office, helping prosecutors decide which charges to pursue and whether to ask for bail. He may also work with the Pennsylvania sentencing commission to help determine whether and how long to incarcerate those convicted of crimes.
Data-based decisions are often less flawed than decisions based on human judgment alone, even (and in some cases, especially) when the decision maker is an experienced professional. We blogged about this a while back in “Finding the Fat Catchers of Criminal Justice.” There, we argued that data collection and evaluation are essential to properly implement criminal justice reforms and avoid misleading results.
But decisions based on past behavior alone have the troubling potential to perpetuate the patterns embedded in that past. University of Chicago law professor Bernard Harcourt explains:
“When you live in a world in which juveniles are much more likely to be stopped—or, if stopped, be arrested, or, if arrested, be adjudicated—if they are black, then all of the indicators associated with prior criminal history are going to be serving effectively as a proxy for race,” said Bernard Harcourt, a law and political-science professor at the University of Chicago, who wrote Against Prediction: Profiling, Policing, and Punishing in an Actuarial Age. By using prior record to predict dangerousness, he insisted, “you just inscribe the racial discrimination you have today into the future.”
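To make the proxy mechanism concrete, here is a minimal illustrative simulation in Python. It is not Berk’s model or anyone’s actual risk tool; the offense rates, stop rates, and the “two or more prior arrests” threshold are all invented for illustration. The point it sketches is Harcourt’s: if two groups offend at the same rate but one is policed more heavily, a score built on prior arrests will rank the more heavily policed group as riskier.

```python
# Illustrative simulation (assumed numbers, not any real risk-assessment model):
# two groups offend at the same underlying rate, but group B is stopped more
# often, so arrests -- the "prior record" a risk tool sees -- accumulate
# unevenly. A naive score based on prior arrests then flags group B as riskier,
# with prior record acting as a proxy for group membership.
import random

random.seed(0)

OFFENSE_RATE = 0.10                   # same true offense rate for both groups (assumed)
STOP_RATE = {"A": 0.20, "B": 0.60}    # unequal policing intensity (assumed)
YEARS = 5
PEOPLE_PER_GROUP = 10_000

def simulate_prior_arrests(group):
    """Count arrests per person over YEARS of identical offending behavior."""
    counts = []
    for _ in range(PEOPLE_PER_GROUP):
        arrests = 0
        for _ in range(YEARS):
            offended = random.random() < OFFENSE_RATE
            stopped = random.random() < STOP_RATE[group]
            if offended and stopped:  # an offense only becomes "record" if policed
                arrests += 1
        counts.append(arrests)
    return counts

for group in ("A", "B"):
    arrests = simulate_prior_arrests(group)
    mean_priors = sum(arrests) / len(arrests)
    # A naive risk rule: flag anyone with 2+ prior arrests as "high risk".
    flagged = sum(1 for a in arrests if a >= 2) / len(arrests)
    print(f"Group {group}: mean prior arrests = {mean_priors:.2f}, "
          f"flagged high-risk = {flagged:.1%}")
```

Under these made-up numbers, both groups behave identically, yet the more heavily policed group ends up flagged as high-risk several times as often. That is the sense in which a prior-record variable can quietly stand in for race, and in which a prediction trained on it inscribes today’s enforcement patterns into tomorrow’s decisions.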
This is a risk to be reckoned with. But it is no greater a concern than the risk of bias inherent in more discretionary decision-making.