COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) is a risk-assessment algorithm used by some courts in the United States to estimate whether a criminally convicted person is likely to commit a crime again upon release. The algorithm became a source of great debate as many began to argue it was severely biased. Northpointe, the company that created the system, argued that the system was racially fair and that it did not use race to measure someone's probability of recidivism. In a narrow sense they were not wrong: the overall error rates were almost equal for Black and white defendants. However, a Black person was much more likely to be labeled high risk than a white person. Black defendants were falsely labeled high risk at a rate of 44%, compared to 23% for white defendants; conversely, white defendants were falsely labeled low risk at a rate of 47%, compared to 28% for Black defendants. The overall error rates are close to the same for both groups, but it is the type of error that matters: white defendants were more likely to receive a false negative, while Black defendants were more likely to receive a false positive.

COMPAS is still in use today, and many are still working to bar it from aiding decisions about whether to keep people in prison or release them. It is also extremely hard to measure how effective a recidivism-prediction algorithm actually is. If someone who is released does commit a crime, we only receive that data back if they are caught. And if a person remains in prison, we never receive data on whether or not they would have committed a crime. This type of algorithm is dangerous precisely because we have no exact way to measure how effective it is.
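The distinction between the two kinds of error can be made concrete with a small sketch. The confusion-matrix counts below are hypothetical, chosen only to roughly reproduce the percentages cited above; they are not the actual COMPAS data.

```python
# Illustrative sketch: overall error rates can look similar across two
# groups even while the *type* of error differs sharply between them.

def error_rates(fp, tn, fn, tp):
    """Return (false_positive_rate, false_negative_rate).

    fp: non-reoffenders wrongly labeled high risk
    tn: non-reoffenders correctly labeled low risk
    fn: reoffenders wrongly labeled low risk
    tp: reoffenders correctly labeled high risk
    """
    fpr = fp / (fp + tn)  # share of non-reoffenders flagged high risk
    fnr = fn / (fn + tp)  # share of reoffenders flagged low risk
    return fpr, fnr

# Hypothetical counts tuned to echo the cited rates:
black = error_rates(fp=440, tn=560, fn=280, tp=720)
white = error_rates(fp=230, tn=770, fn=470, tp=530)

print(f"Black defendants: FPR={black[0]:.0%}, FNR={black[1]:.0%}")
print(f"White defendants: FPR={white[0]:.0%}, FNR={white[1]:.0%}")
```

Under these assumed counts, both groups see roughly 7 errors per 10 predictions overall, yet the false positives fall heavily on one group and the false negatives on the other, which is exactly the asymmetry at the center of the debate.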

Injustice Ex Machina: Predictive Algorithms in Criminal Sentencing