Across the American criminal justice system, the Northpointe corporation’s COMPAS algorithm is one of many used to estimate the likelihood that a prisoner will commit further crimes and return to prison, known as recidivism. After extensive testing and analysis of the prison statistics of a single Florida county using a custom set of tools, the journalism organization ProPublica found that COMPAS disproportionately misidentified black defendants as having higher recidivism risk and white defendants as having lower risk, affecting sentencing outcomes and treatment by the system. Though Northpointe disputed these results, ProPublica found that the dataset produced for each defendant encodes legacy presumptions about race, class, and criminality, even though race itself is never explicitly addressed in the questionnaire. Since recidivism scores feed into sentencing guidelines, the algorithm’s intrinsic biases continue to have concrete impacts on the lives of thousands, shaping everything from prison term lengths to post-release hiring and job placement. Quantification and analysis could make the criminal justice system more equitable, producing more appropriate parole conditions and sentence lengths, but not if the platform arrives with human biases already programmed in.
This entry is included in Library Stack as part of the house collection Reality Winners.