Judgment by algorithm

Eric Loomis just wants to know how, exactly, an algorithm determined he posed a high risk to the community.

The Wisconsin man was charged in connection with a February 2013 drive-by shooting in La Crosse. He has continued to maintain he was home cooking dinner at the time of the incident, but he did plead guilty to two related offenses of trying to flee police and driving a motor vehicle without the owner’s permission.

Sentencing him to six years of incarceration and five years of extended supervision, the trial judge pointed to Loomis’ history of recidivism and failure to take responsibility. The court also noted the result of an actuarial assessment tool that identified him as likely to reoffend.

Loomis filed a motion for post-conviction relief, arguing, in part, that his due process rights were violated because the court relied on the computer-based assessment program COMPAS (Correctional Offender Management Profiling for Alternative Sanctions). A central question is whether the claims by COMPAS developer Northpointe Inc. — that the calculations were proprietary — prevented Loomis from challenging the scientific validity of the assessment.

Now the Supreme Court of the United States is considering taking a closer look. Loomis filed a petition for writ of certiorari in October 2016 after the Wisconsin Supreme Court denied his motion for resentencing. In March 2017, the court invited the acting solicitor general to file a brief on the case.

Risk assessments and algorithms are not new. They have been used routinely by criminal justice systems in many states to predict how likely the defendant or offender is to commit another crime.

Indiana has a set of such tools to help determine conditions for pretrial release, community supervision, prison intake, and re-entry. The Indiana Risk Assessment System scores an individual’s responses to a series of questions that range from criminal history and substance abuse to employment and social support.

Like the Wisconsin tool, the IRAS scores are based on algorithms.

Both the Indiana Public Defender Council and the Indiana Prosecuting Attorneys Council have expressed reservations about IRAS. The organizations note the IRAS scores have not been validated for accuracy when applied to Indiana offenders. Moreover, they worry judges will rely more on the numerical score than on their own discretion.

IRAS scores are actuarial and not individualized. The assessment tools put the defendants or offenders in certain groups based on shared characteristics. The IRAS-Pretrial Assessment Tool labels an individual as low-, medium- or high-risk, depending on the typical behavior of the group.
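To make the actuarial approach concrete, the following minimal sketch (in Python) shows how a tool of this kind could turn yes-or-no interview answers into a risk band. The items, weights and cut points are invented for illustration; they are not the actual IRAS questions or scoring rules.

    # Hypothetical actuarial scoring sketch -- not the real IRAS instrument.
    # Each item is a yes/no answer with an assumed point weight.
    ITEM_WEIGHTS = {
        "prior_convictions": 2,
        "substance_abuse": 1,
        "unemployed": 1,
        "weak_social_support": 1,
    }

    # Assumed cut points: totals of 0-2 score "low", 3-4 "medium", 5+ "high".
    CUT_POINTS = [(2, "low"), (4, "medium")]

    def risk_band(answers):
        """Sum the weighted 'yes' answers, then bucket the total into a band."""
        total = sum(w for item, w in ITEM_WEIGHTS.items() if answers.get(item))
        for threshold, band in CUT_POINTS:
            if total <= threshold:
                return band
        return "high"

    # Everyone who answers the same way lands in the same band: the score
    # reflects the group's typical behavior, not the individual's.
    print(risk_band({"prior_convictions": True, "substance_abuse": True}))  # medium

The point of the sketch is the last comment: the output is a property of the group the answers place someone in, which is exactly the feature critics like Landis object to.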

Larry Landis, executive director of the Indiana Public Defender Council, compared the categorization to auto insurance companies charging higher premiums for teenage male drivers because, as a group, they are involved in more car accidents.

“Maybe for insurance it’s OK, but we’re talking about liberty when using this tool to predict human behavior,” Landis said. “Nobody’s good at that.”

Human element still required

Edward Latessa, director and professor of the School of Criminal Justice at the University of Cincinnati, led the design of the IRAS. He based the tools on the Ohio Risk Assessment System he developed in 2006.

Latessa and his research team interviewed 1,800 offenders in Indiana to determine the factors that indicated a risk of additional criminal acts. The researchers are now preparing to conduct a validation study of IRAS using a database of information collected from 350,000 offenders.

Currently, 11 counties in Indiana are testing the IRAS Pretrial Assessment Tool (IRAS-PAT) as part of a pilot program to reduce the imposition of bail. Courts in the pilot counties use the tool to help determine conditions of release without cash or bond. It asks seven questions to rank the likelihood of the individual failing to appear in court and of committing another offense.
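As a rough illustration of the structure described here, a seven-question pretrial tool could be sketched as below. The question names, point values and thresholds are hypothetical stand-ins, not the actual IRAS-PAT items.

    # Hypothetical seven-item pretrial sketch -- the actual IRAS-PAT items
    # and scoring rules are not reproduced here.
    QUESTIONS = [
        "pending_charge_at_arrest",
        "prior_failure_to_appear",
        "prior_incarceration",
        "employment_instability",
        "residential_instability",
        "recent_drug_use",
        "prior_violent_offense",
    ]

    def pretrial_risk(answers):
        """Count affirmative answers (one assumed point each) and map the
        total to a combined level for failure to appear and reoffending."""
        score = sum(1 for q in QUESTIONS if answers.get(q))
        if score <= 2:
            return score, "low"
        if score <= 4:
            return score, "moderate"
        return score, "high"

    print(pretrial_risk({"prior_failure_to_appear": True}))  # (1, 'low')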

Latessa maintained the IRAS score is based on information courts typically gather piecemeal. When using the Indiana tools, the interviewers, such as probation officers, still must use their personal expertise in evaluating the individuals.

Noting risk assessment tools are not going anywhere, David Powell, executive director of the Indiana Prosecuting Attorneys Council, said they must provide a valid, accountable and objective analysis, and their outcomes have to be measured.

Algorithms cannot replace personal judgment, he said, giving the example that IRAS does not consider the nature of the crime. A person charged with a heinous act might be deemed low-risk if it was the first crime he or she committed. Prosecutors who turn the decision-making over to the victims have been disciplined for transfer of discretion, and Powell sees the same possibility for running afoul of court rules when judges rely too much on IRAS.

“Prosecutors are not big fans of IRAS, by and large,” he said. “The fear we have is the tool will be used in lieu of judicial discretion.”

Allen Superior Judge John Surbeck echoed Latessa by emphasizing the IRAS scores are simply additional information for the bench to review and should not be given more weight than any other factor.

He acknowledged the concern, even among judges, that the assessment tools will replace judicial discretion, but he still believes they will enhance decision-making. Judges will continue to apply their expertise when considering a defendant’s criminal history and the nature of the current offense. Furthermore, they will have guidance from the state’s sentencing statute, which clearly identifies the mitigating and aggravating factors to be considered.

“It’s possible (a judgment) could be reduced to a numerical (score), but most of my colleagues, I think, are truly dedicated to doing what they’re supposed to do,” Surbeck said. “It would be tempting, but I trust my colleagues to do better than that.”

Let it percolate

News reports, including a series done by ProPublica in 2016, have focused on the potential for bias against minorities built into these kinds of assessment tools. Landis described particular questions in the IRAS, which he said ask the age of the offender’s first arrest and whether the offender has any relatives in prison, as being racially biased.

Latessa disputed the allegations. He pointed to two 2016 academic studies led by University of California researchers that rebutted claims the assessments are skewed against certain groups. He also noted that relying on intuition or personal judgment alone can result in biased decisions.

In its respondent brief, Wisconsin advised the Supreme Court to let the risk assessment algorithms percolate longer in the lower courts since, to date, the question has been addressed by only one other state — Indiana.

The case, Malenchik v. State, 928 N.E.2d 564 (Ind. 2010), considered Tippecanoe Superior Court’s use of the results from risk evaluation and assessment instruments when sentencing the defendant, Dan Malenchik.

Writing for the court, Justice Brent Dickson explained that neither assessment tool the trial court used “are intended nor recommended to substitute for the judicial function of determining the length of sentence appropriate for each offender. But such evidence-based assessment instruments can be significant sources of valuable information for judicial consideration.”•
