A Blog by Jonathan Low

 

Jun 8, 2016

Predictive Risk Algorithms Used In Criminal Sentencing Face Court Fairness Test

Can a program more fairly and accurately predict the potential for future behavior than an experienced human?

And what does due process mean to a machine? JL

Joe Palazzolo reports in the Wall Street Journal:

Risk-evaluation tools have gained in popularity amid efforts around the country to curb the number of repeat offenders. They help authorities sort prisoners, set bail and weigh parole decisions. Criminal justice experts say they are inherently biased, treating poor people as riskier than those who are well off. Proponents say they have elevated sentencing closer to a science.
Algorithms used by authorities to predict the likelihood of criminal conduct are facing a major legal test in Wisconsin.
The state’s highest court is set to rule on whether such algorithms, known as risk assessments, violate due process and discriminate against men when judges rely on them in sentencing. The ruling, which could come any time, would be among the first to speak to the legality of risk assessments as an aid in meting out punishments.
Criminal justice experts skeptical of such tools say they are inherently biased, treating poor people as riskier than those who are well off. Proponents of risk assessments say they have elevated sentencing to something closer to a science.
“Evidence has a better track record for assessing risks and needs than intuition alone,” wrote Christine Remington, an assistant attorney general in Wisconsin, in a legal brief filed in January defending the state’s use of the evaluations.
Risk-evaluation tools have gained in popularity amid efforts around the country to curb the number of repeat offenders. They help authorities sort prisoners, set bail and weigh parole decisions. But their use in sentencing is more controversial.
Before the sentencing of 34-year-old Eric Loomis, whose case is before the state’s high court, Wisconsin authorities evaluated his criminal risk with a widely used tool called COMPAS, or Correctional Offender Management Profiling for Alternative Sanctions, a 137-question test that covers criminal and parole history, age, employment status, social life, education level, community ties, drug use and beliefs.
The assessment includes queries like, “Did a parent figure who raised you ever have a drug or alcohol problem?” and “Do you feel that the things you do are boring or dull?” Scores are generated by comparing an offender’s characteristics to a representative criminal population of the same sex.
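The mechanics Northpointe keeps proprietary can still be sketched in outline: a questionnaire is reduced to a raw score, which is then ranked against a norm group of offenders of the same sex. The following is a purely hypothetical illustration of that ranking step; the feature names, weights, and norm data are invented, not COMPAS's.

```python
# Hypothetical sketch of percentile-style risk ranking against a
# same-sex norm group. All weights and data below are invented for
# illustration; COMPAS's actual scoring model is proprietary.
from bisect import bisect_left

def raw_score(answers, weights):
    """Weighted sum of questionnaire answers (weights are made up)."""
    return sum(weights[q] * v for q, v in answers.items())

def decile(score, norm_group_scores):
    """Rank a score against a same-sex norm group on a 1-10 scale."""
    norm = sorted(norm_group_scores)
    below = bisect_left(norm, score) / len(norm)  # fraction scoring lower
    return min(int(below * 10) + 1, 10)

# Invented example: three features, one offender, a ten-person norm group.
weights = {"prior_arrests": 2.0, "age_under_25": 1.5, "unemployed": 1.0}
answers = {"prior_arrests": 3, "age_under_25": 1, "unemployed": 1}
male_norms = [1.0, 2.5, 3.0, 4.5, 5.0, 6.5, 7.0, 8.0, 9.5, 10.0]

score = raw_score(answers, weights)
print(decile(score, male_norms))  # decile relative to the male norm group
```

The same raw score ranked against a female norm group would land in a different decile, which is the "two scales" design the state defends later in the article.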
Prosecutors said Mr. Loomis was the driver of a car involved in a drive-by shooting in La Crosse, Wis., on Feb. 11, 2013. Mr. Loomis denied any involvement, saying he drove the car only after the shooting had occurred.
He pleaded guilty in 2013 to attempting to flee police in a car and operating a vehicle without the owner’s consent and was sentenced to six years in prison and five years of supervision.
“The risk assessment tools that have been utilized suggest that you’re extremely high risk to reoffend,” Judge Scott Horne in La Crosse County said at Mr. Loomis’s sentencing.
Mr. Loomis said in his appeal that Judge Horne’s reliance on COMPAS violated his right to due process, because the company that makes the test, Northpointe, doesn’t reveal how it weighs the answers to arrive at a risk score.
Northpointe General Manager Jeffrey Harmon declined to comment on Mr. Loomis’s case but said algorithms that perform the risk assessments are proprietary. The outcome, he said, is all that is needed to validate the tools.
Northpointe says its studies have shown COMPAS’s recidivism risk score to have an accuracy rate of 68% to 70%. Independent evaluations have produced mixed findings.
Mr. Loomis also challenged COMPAS on the grounds that the evaluation treats men as higher risk than women.
COMPAS compares women only to other women because they “commit violent acts at a much lower rate than men,” wrote Ms. Remington, the state’s lawyer, in her response brief filed earlier this year in the Wisconsin Supreme Court.
Having two scales—one for men and one for women—is good science, not gender bias, she said.
The parties appeared to find common ground on at least one issue.
“A court cannot decide to place a defendant in prison solely because of his score on COMPAS,” Ms. Remington acknowledged, describing it as “one of many factors a court can consider at sentencing.”
Her comments echoed a 2010 ruling by the Indiana Supreme Court holding that risk assessments “do not replace but may inform a trial court’s sentencing determinations.”
Michael Rosenberg, a lawyer for Mr. Loomis, and a spokeswoman for Wisconsin Attorney General Brad Schimel declined to comment on the case.
