
Computer algorithms now being used for bail recommendations


kirblar

Member
NYTimes link: http://www.nytimes.com/2015/06/27/us/turning-the-granting-of-bail-into-a-science.html

This is a really, really good thing. Removing people from the process (as much as possible) helps clear out so many of the biases (race, class, attractiveness) that make rational decision-making difficult for judges and lawyers.

After two years of testing, the formula, developed at a cost of $1.2 million by the Laura and John Arnold Foundation, is being rolled out to 21 more jurisdictions, including states like Arizona and New Jersey and cities like Chicago and Pittsburgh, the foundation announced on Friday. The algorithm gives defendants two scores — one for their likelihood of committing a crime and one for their risk of failing to appear in court — and flags those with an elevated risk of violence.

In most of the country, there is little science behind the bail decisions made thousands of times a day by magistrates, commissioners and judges. In some places, bail is based on the charges alone; in others, court officials may weigh a host of factors like criminal record, employment status and substance-abuse history. Hidden biases against the poor and minorities can easily creep into the decision-making. And a growing body of evidence indicates that the nation’s bail system keeps many low-risk defendants incarcerated before trial, while those who may pose a higher risk are released because they have the money to make bail.

Many law enforcement groups and defense lawyers have supported the use of scientifically validated risk assessments, but fewer than 10 percent of jurisdictions use them, partly because of cost.

The Arnold Foundation eventually plans to make the tool, called the Public Safety Assessment, available to any jurisdiction.
The Arnold assessment has been met with some skepticism because it does not take into account characteristics that judges and prosecutors normally consider relevant: the defendant’s employment status, community ties or history of drug and alcohol abuse. Instead, after crunching data on one and a half million criminal cases, researchers found that fewer than 10 objective factors — basically age, the criminal record and previous failures to appear in court, with more recent offenses given greater weight — were the best predictors of a defendant’s behavior. Factoring in other considerations did not improve accuracy.

Some initial skeptics, including R. Andrew Murray, the district attorney of Mecklenburg County, N.C., which includes Charlotte, have slowly warmed to the assessment. Charlotte was one of the few jurisdictions in the country that already used a risk assessment tool, but it included a face-to-face interview. The Arnold assessment eliminates the interview.

“I’m expected to do everything I can to keep the public safe,” Mr. Murray said. “If we’re letting more people out earlier in the proceeding, based on more limited information, I’m going to be concerned.”

But, he said, after a yearlong trial, Charlotte’s jail population is down almost 20 percent. Crime has not increased, he said, and many poor defendants have been spared the damaging effects of incarceration, including unemployment and homelessness.
A defendant’s risk scores are given to the judge before bail conditions are set. At times the scores have bolstered prosecutors’ arguments that youthful, baby-faced defendants can be riskier than they appear, Mr. Murray said.

Scott Bales, the chief justice of the Arizona Supreme Court, said the state was expanding use of the assessment from four counties and one city to all 15 counties after judges had clamored for change.

“We heard from judges that defendants were held, pending the resolutions of their charges, for longer than the sentence would have been, and that seemed fundamentally unfair,” Chief Justice Bales said. “They didn’t have information to make an assessment, and were relying on rule-of-thumb or prior practices without really knowing whether those were useful guides or not” when setting bail.

Chief Justice Bales said the assessment tool could combat implicit bias, the invisible set of assumptions based on race, class and other factors that can come into play. Some studies have shown that black defendants are given higher bail amounts than similar white defendants.
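
To make the mechanics a bit more concrete: the article says the tool boils things down to fewer than ten objective factors (basically age, criminal record, and prior failures to appear, with recent offenses weighted more heavily) and produces two scores plus a flag for elevated risk of violence. Here is a rough Python sketch of what a point-based score like that could look like. The factor names come from the article; every weight, cutoff, and the 1-to-6 scale are invented for illustration and are not the Arnold Foundation's actual formula.

# Sketch of a point-based pretrial risk score, in the spirit of the
# Public Safety Assessment described in the article. Factor names come
# from the article; all weights, cutoffs, and scales below are
# hypothetical and NOT the actual PSA formula.
from dataclasses import dataclass


@dataclass
class Defendant:
    age: int
    prior_convictions: int
    prior_violent_convictions: int
    prior_failures_to_appear: int
    pending_charge: bool
    years_since_last_offense: float  # recency: smaller means more recent


def recency_weight(years_since: float) -> float:
    """More recent offenses count more, per the article's description."""
    return 1.0 if years_since < 2 else 0.5


def new_criminal_activity_score(d: Defendant) -> int:
    """Hypothetical 1-to-6 scale for risk of committing a new crime."""
    points = 0.0
    if d.age < 23:
        points += 2  # youth treated as a risk factor (illustrative only)
    if d.pending_charge:
        points += 1
    points += min(d.prior_convictions, 4) * recency_weight(d.years_since_last_offense)
    return min(6, 1 + round(points))


def failure_to_appear_score(d: Defendant) -> int:
    """Hypothetical 1-to-6 scale for risk of missing a court date."""
    points = min(d.prior_failures_to_appear, 4) * recency_weight(d.years_since_last_offense)
    if d.pending_charge:
        points += 1
    return min(6, 1 + round(points))


def violence_flag(d: Defendant) -> bool:
    """Flag an elevated risk of violence, as the article says the tool does."""
    return d.prior_violent_convictions > 0 and d.years_since_last_offense < 2


if __name__ == "__main__":
    d = Defendant(age=21, prior_convictions=1, prior_violent_convictions=0,
                  prior_failures_to_appear=2, pending_charge=False,
                  years_since_last_offense=1.0)
    print(new_criminal_activity_score(d), failure_to_appear_score(d), violence_flag(d))

The point being that the scores are computed from a handful of record-keeping facts a clerk already has, with no interview and nothing subjective for a judge or magistrate to eyeball.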
 

m3k

Member
At first this sounded horrendous, but given how unacceptably biased humans have been, well, I guess it's something.
 
Interesting article, thanks for posting. I think the use of more objective criteria and set standards is a good thing. Some of these judges are just all over the place with their bonds and are clearly thinking about things other than likelihood to reappear and likelihood to commit another crime. I would wager that just about any mid- to large-sized city is dealing with jail overcrowding, and a way to punch through that would be useful. Though all it will take is one drunk driver getting out on a low bond because of this and killing someone for citizens to question it.

As an aside, the highest bond I set as a prosecutor that I can recall was $4.8 million. Large-ish theft case involving a foreign national.
 