The Constitution prohibits excessive bail but is otherwise silent on the issue. This absence of federal guidance means that bail bonding works differently from state to state.
Recently, a growing number of jurisdictions across the country have adopted so-called “bail algorithms,” which are touted as a way to ensure everyone pays a fair bail amount but have been roundly criticized for, among other things, their lack of transparency. While they haven’t appeared in Adams County, Broomfield County, Weld County, or Denver just yet, their adoption here may not be far off, so the issue is worth exploring.
Throwing the Baby Out With the Bail Bonds Bathwater
The process of determining bail amounts had already been somewhat “automated” by the development of widely used “bail schedules”. These lists basically state that if you commit this crime, you pay that bail. While they’ve helped streamline the process a bit, they’re often seen as blunt instruments since they ignore factors like a person’s income.
In many other cases, bail amounts are set by judges. But this too has proved problematic: critics complain that judges often let their own biases color their bail decisions. The net effect, they claim, is that the poor and minorities often don’t get a fair shake.
Enter the Bail Algorithm
In theory, a bail algorithm works like this: a number of relevant factors are fed into it, including the person’s age, criminal history, prior bail record, and the charges leveled against them. It then uses historical data on those factors to estimate whether the person represents a flight risk and, if so, how significant a risk. Based on that assessment, it recommends a bail amount.
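To make that process concrete, here is a minimal sketch of what such a scoring system might look like. The factors, weights, and bail tiers below are hypothetical illustrations invented for this example, not the inner workings of any real risk-assessment tool.

```python
# A minimal sketch of a bail algorithm. Every weight and bail tier
# here is a hypothetical illustration, not taken from any real tool.

def flight_risk_score(age, prior_convictions, failures_to_appear, charge_severity):
    """Combine case factors into a single flight-risk score from 0 to 10."""
    score = 0.0
    score += 2.0 if age < 25 else 0.0         # youth weighted as higher risk
    score += min(prior_convictions, 5) * 0.8  # criminal history
    score += failures_to_appear * 1.5         # prior bail record
    score += charge_severity                  # e.g. 1 (misdemeanor) to 3 (felony)
    return min(score, 10.0)

def recommended_bail(score):
    """Map the risk score to a bail recommendation in dollars."""
    if score < 3:
        return 0        # release on personal recognizance
    if score < 6:
        return 5_000
    if score < 8:
        return 25_000
    return None         # detention recommended; no bail

score = flight_risk_score(age=22, prior_convictions=1,
                          failures_to_appear=0, charge_severity=2)
print(score, recommended_bail(score))  # 4.8 5000
```

Note what the sketch makes plain: the output is driven entirely by whichever inputs the designers chose to include, a point that matters for everything that follows.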
Why They’re Problematic
The problems inherent in bail algorithms are many and start with the issue of transparency. Even a flawed decision by a judge is a transparent one. It’s made in open court and becomes part of the public record.
A bail algorithm’s decision, on the other hand, is opaque: you need a degree in computer science and access to the source code to know exactly what went into the algorithm or how it works. Essentially, all of society has to trust that the programmers did the right thing. Functioning societies, however, aren’t built on blind trust. They’re built on transparency and accountability.
What Algorithms Miss
While even bondsmen will admit there may be compelling reasons to revisit the way bail amounts are determined, bail algorithms are not the answer. Here’s why:
- First, most bail algorithms don’t take into account crucial factors such as employment status. Psychologists have long known that unemployment puts enormous emotional and psychological stress on people. It can even drive them to commit acts they might never otherwise have considered, simply to get money to eat and keep a roof over their heads. Ignoring such mitigating factors is to ignore the person’s humanity. In addition, unemployment makes it far less likely that a person can come up with any amount of bail, which means the algorithm is essentially punishing the accused for being poor.
- Second, bail algorithms don’t take into account whether a person is wrestling with substance abuse. This is not to excuse a burglary or assault committed by an addict, but to acknowledge that addicts almost never commit such acts out of malice. If they can obtain effective treatment for their drug or alcohol problems, there is an excellent chance they can return to being productive members of society.
- Third, algorithms tend to give far too much weight to the name of the offense and ignore the extenuating circumstances and motives of the person charged. For example, two different incidents may both end with people being charged with assault. One person, however, may clearly have been an aggressor, while there may be evidence the other was merely defending himself. The algorithm isn’t capable of taking such nuances into consideration. To it, “assault” is “assault” and that’s all there is to it.
- Finally, and perhaps most crucially, the societal justification for bail algorithms often comes down to allegations of racial bias on the part of judges. Bail algorithms, it is claimed, are colorblind and draw their conclusions solely from historical data. But if the justice system is awash in racial inequality, then the historical data the algorithm bases its decisions on is deeply flawed. Even if a man was repeatedly targeted by police simply because he is black, the algorithm sees only that he has spent a lot of time in police custody, not that he has been victimized by racist policing, and it recommends higher bail or no bail at all (a toy example after this list shows how that happens).
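To see how biased inputs propagate through a scoring rule, consider the toy example below. The rule and the numbers are invented purely for illustration; the point is that a score driven by recorded arrests cannot distinguish criminal conduct from over-policing.

```python
# Toy illustration of the "garbage in, garbage out" problem.
# If the score is driven by arrests on record, two people with
# identical conduct receive very different scores when one of
# them has been over-policed. All numbers are hypothetical.

def risk_from_record(recorded_arrests):
    """Toy rule: risk grows with recorded arrests, whatever their cause."""
    return min(recorded_arrests * 1.5, 10.0)

person_a = 1  # arrested once for the conduct in question
person_b = 6  # same conduct, but repeatedly stopped and arrested

print(risk_from_record(person_a))  # 1.5 -> low bail recommended
print(risk_from_record(person_b))  # 9.0 -> high bail, or none at all
```

The algorithm faithfully reproduces whatever pattern the record contains; it has no way of knowing why the record looks the way it does.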
Defendants are Humans, Not Machines
Justice may be blind, but she is not without compassion. And compassion, roughly defined as an awareness of suffering and a desire to help, is a necessary quality in anyone who sits on the bench. Sometimes that compassion favors defendants in the form of reduced bail, and sometimes it comes down on the side of victims in the form of increased bail for a defendant. As such, someone is always going to feel wronged. But using that as an excuse to remove compassion from the exercise of justice altogether, and to reduce such an incredibly complex and nuanced pursuit to a mathematical equation, seems misguided at best.