Keep racist algorithms out of our courtrooms

Algorithms written by corporations are taking control of our courtrooms -- and they’re racist.

Across the nation, criminal justice professionals rely on “risk scores” to help them decide on everything from bond amounts to criminal sentencing. Details about a person are plugged into a computer, which spits out a score predicting how likely that person is to commit a crime in the future.

Now it turns out that one of the most widely used risk scoring systems, Northpointe’s COMPAS, is seriously biased against black people.

Criminal justice decisions should not be put in the hands of unaccountable private corporations.

Let’s tell Northpointe to keep its racist risk scores out of our courtrooms.

Northpointe’s software was never meant to be used in courtrooms. The company’s founder, Tim Brennan, wanted it to be used to help reduce crime, not to sentence criminals. “But as time went on,” he has said, “I gradually softened on whether this could be used in the courts or not.” And now it’s used in several states, to inform every stage of the criminal justice system, including sentencing.

Incredibly, there have been very few independent studies of risk scoring software -- and the U.S. Sentencing Commission has never studied the use of risk scores. But ProPublica has published its own study of Northpointe’s software, showing that it wrongly labels black defendants as future criminals at almost twice the rate of white defendants.

Yet Northpointe is refusing to open up the calculations it uses to public scrutiny -- so defendants with high risk scores don’t know why they’re being sentenced more harshly.

It’s totally outrageous -- and it needs to be stopped. Will you tell Northpointe to keep its racist software out of our courtrooms, as its founder originally intended?

Northpointe: don’t sell COMPAS for the sentencing of criminals.

Thanks for all that you do,

Sondhya, Bex and the rest of the SumOfUs team

More information