
Oh, man. Even the computers are racist?

Recently we wrote about racial bias in plea bargains. For the same crime with no prior convictions, researchers found that white defendants, on average, had better outcomes than their black counterparts. (You get probation, but you go to jail.)

Along those lines, criminal justice gurus have tried to remove racial bias by taking humans out of the equation. They created computer algorithms to predict which defendants would re-offend. Judges could use that impartial score to give a fair sentence. But it turns out the computers may be just as racially biased – and sometimes more so.

Recidivism algorithms may be deeply flawed

Courts around the country use algorithms (computer programs) that gauge whether a particular defendant is likely to commit future crimes. Those who are predicted to re-offend typically get longer jail or prison time.

The recidivism score is not binding, but judges often rely on the algorithm because it is supposedly impartial. Numbers don’t lie. Numbers aren’t racist. Right? A recent study published in Science Advances concludes that recidivism algorithms are wrong or random much of the time. They are no more reliable than human judges and prosecutors at predicting whether an individual will stay straight or re-offend. And you see where this is going: when the algorithms guess wrong, the errors correlate strongly with the person’s race.

The research zeroed in on one widely used algorithm called COMPAS, which has been used in U.S. courts since 2000. The algorithm was somewhat accurate at predicting recidivism. But when it was off, it was way off. The program was twice as likely to mislabel black defendants as “high risk” when they did not in fact re-offend. Conversely, it was much more likely to label white defendants “low risk” when they did go on to commit new crimes.
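For readers who want to see what those error rates mean, here is a small Python sketch with made-up numbers (not the actual study data). It measures, for each group, how often people who never re-offended were flagged “high risk” (false positives), and how often people who did re-offend were scored “low risk” (false negatives). Those are the two gaps described above.

```python
# Hypothetical records, invented for illustration only (not the COMPAS data).
# Each entry is: (group, was the person scored "high risk"?, did they re-offend?)
records = [
    ("black", True,  False), ("black", True,  False), ("black", True,  True),
    ("black", False, False), ("black", False, True),  ("black", True,  False),
    ("white", True,  False), ("white", False, False), ("white", False, True),
    ("white", False, True),  ("white", True,  True),  ("white", False, False),
]

def error_rates(group):
    rows = [r for r in records if r[0] == group]
    # False positive rate: share of people who did NOT re-offend
    # but were still labeled "high risk."
    did_not_reoffend = [r for r in rows if not r[2]]
    fpr = sum(r[1] for r in did_not_reoffend) / len(did_not_reoffend)
    # False negative rate: share of people who DID re-offend
    # but were labeled "low risk."
    reoffended = [r for r in rows if r[2]]
    fnr = sum(not r[1] for r in reoffended) / len(reoffended)
    return fpr, fnr

for group in ("black", "white"):
    fpr, fnr = error_rates(group)
    print(f"{group}: false positive rate {fpr:.0%}, false negative rate {fnr:.0%}")
```

With these invented records, the false positive rate for the first group comes out roughly double the second group’s, while the false negative rates run the other way. That is the shape of the disparity the study reports, not its actual figures.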

It’s not the computers that are biased …

Despite attempts by different teams to tweak the algorithm, it spit out the same biased results. Then the watchdog researchers conducted a damning experiment. They asked a random sample of citizens to predict whether a person (described only by a short set of data) would re-offend – and the random group was just as accurate as the sophisticated algorithm.

The problem lies with systemic bias that creates a loop. The data doesn’t reflect who re-offends as much as who gets caught. If law enforcement is more likely to stop a black motorist, or more likely to let a white motorist go with a warning, that shows up in the arrest and conviction rates. Law enforcement then uses that data to decide where to patrol, which affects who they arrest – and who they don’t arrest – and so on. That’s how we end up with computer algorithms that don’t match reality.
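Here is a toy Python sketch of that loop, again with assumed numbers rather than real data. Two groups break the law at exactly the same rate, but one group is stopped twice as often. The arrest records, which are all an algorithm ever sees, end up showing a gap that reflects policing rather than behavior.

```python
import random

random.seed(0)

# Invented numbers for illustration: both groups offend at exactly the same
# rate, but group_1 is stopped by police twice as often as group_2.
OFFENSE_RATE = 0.10
STOP_RATE = {"group_1": 0.30, "group_2": 0.15}
POPULATION = 10_000

recorded_arrests = {group: 0 for group in STOP_RATE}
for group, p_stop in STOP_RATE.items():
    for _ in range(POPULATION):
        offended = random.random() < OFFENSE_RATE
        stopped = random.random() < p_stop
        if offended and stopped:
            # Only offenses that are actually caught become "data."
            recorded_arrests[group] += 1

for group, count in recorded_arrests.items():
    print(f"{group}: true offense rate {OFFENSE_RATE:.0%}, "
          f"recorded arrest rate {count / POPULATION:.1%}")
# The recorded rate for group_1 comes out roughly twice group_2's, even though
# the underlying behavior is identical. A model trained on arrest records, and
# patrols directed by that model, would both "learn" a difference that isn't there.
```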

Regardless of race, criminal history, age, sex or any other factor, hiring competent legal counsel is the best way to keep the criminal justice system honest.

Source: Popular Science magazine