New Sentencing Tech May Be As Racist As It Is Impractical

  • Minority report precog
  • It was fourteen years ago that Minority Report hit theaters and captured the nation's imagination. The film posed a simple yet interesting question: What if we could predict crimes before they actually happened? It sounded like a radical concept at the time, but today it is closer to reality than ever. And one can only hope that Tom Cruise might save us.

    Recently, an algorithm has grown in popularity and is now used all over the country to help judges and parole boards assess the likelihood of a person committing additional crimes in the future. It’s called Northpointe, and its results have affected prison sentences, parole chances, and bail amounts without substantial testing of whether or not it actually works long term.

    Rather than leaving critical decisions to people who may be biased, the program aims to minimize human error as much as possible. However, the generalizations, oversights, and misgivings this algorithm is capable of seem to rival even the most incompetent judges and juries.

    There are many problems with using an algorithm to dictate sentences, but the one that sticks out is its undeniable bias against minorities. A recent ProPublica study found the program to be an absolute failure in this regard: of the roughly 7,000 people scored, only 20% of those expected to commit a violent crime had actually done so since their arrest.
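To make that 20% figure concrete, here is a small sketch of the arithmetic. The counts below are hypothetical round numbers chosen only to illustrate the ratio the study reported; they are not ProPublica's actual tallies.

```python
# Illustrative only: hypothetical counts matching the 20% ratio cited above.
flagged = 1000        # people the tool labeled likely to commit a violent crime
reoffended = 200      # of those, the share who actually did (20% per the study)

precision = reoffended / flagged          # how often a "high risk" label was right
wrongly_flagged = 1 - precision           # share flagged who did NOT reoffend

print(f"label was correct:  {precision:.0%}")        # 20%
print(f"label was wrong:    {wrongly_flagged:.0%}")  # 80%
```

In other words, if the study's figure holds, four out of every five people the tool flags as future violent offenders never go on to commit such a crime.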

    It also takes specific variables and essential details out of the picture. Instead of evaluating each situation on a case-by-case basis, it draws on aggregate data from thousands of statistics that may have nothing to do with the situation at hand.

    When we stop looking at people and start looking at statistics and demographics, we not only limit our perception, we also commit ourselves to ignorance and stunt the growth of civilization. Numbers can tell you a lot, but they cannot speak for people better than people can speak for themselves.