Correctional Offender Management Profiling for Alternative Sanctions (COMPAS)[1] is a case management and decision support tool, developed and owned by Northpointe (now Equivant), used by U.S. courts to assess the likelihood of a defendant becoming a recidivist.
[2][3] COMPAS has been used by the U.S. states of New York, Wisconsin, and California, by Florida's Broward County, and by other jurisdictions.
[4] The COMPAS software uses an algorithm to assess potential recidivism risk.
Northpointe created risk scales for general and violent recidivism, and for pretrial misconduct.
According to the COMPAS Practitioner's Guide, the scales were designed using behavioral and psychological constructs "of very high relevance to recidivism and criminal careers." Each item's weight, according to the guide, is "determined by the strength of the item's relationship to person offense recidivism that we observed in our study data."[4] A general critique of the use of proprietary software such as COMPAS is that, since the algorithms it uses are trade secrets, they cannot be examined by the public or by affected parties, which may be a violation of due process.
[13] Specifically, COMPAS risk assessments have been argued to violate Fourteenth Amendment Equal Protection rights on the basis of race, since the algorithms are alleged to be racially discriminatory, to result in disparate treatment, and not to be narrowly tailored.
[15] The team found that "blacks are almost twice as likely as whites to be labeled a higher risk but not actually re-offend," whereas COMPAS "makes the opposite mistake among whites: They are much more likely than blacks to be labeled lower-risk but go on to commit other crimes.
"[15][10][16] They also found that only 20 percent of people predicted to commit violent crimes actually went on to do so.
[15] In a letter, Northpointe criticized ProPublica's methodology and stated: "[The company] does not agree that the results of your analysis, or the claims being made based upon that analysis, are correct or that they accurately reflect the outcomes from the application of the model."
[17] Among several objections, the CRJ rebuttal concluded that ProPublica's results "contradict several comprehensive existing studies concluding that actuarial risk can be predicted free of racial and/or gender bias."
[10] Researchers from the University of Houston found that COMPAS does not conform to group fairness criteria and produces various kinds of unfair outcomes across sex- and race-based demographic groups.
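Group fairness criteria of this kind compare a statistic of the classifier across demographic groups. A minimal sketch of one such criterion, demographic parity (equal rates of high-risk labels across groups), on hypothetical data with an assumed tolerance:

```python
# Illustrative sketch of a group fairness check; the group names, data,
# and tolerance are assumptions, not taken from the Houston study.

def positive_rate(preds):
    """Fraction of a group labeled high risk; demographic parity
    requires this to be (approximately) equal across groups."""
    return sum(preds) / len(preds)

# Hypothetical high-risk labels per demographic group
groups = {
    "group_1": [1, 1, 0, 0],
    "group_2": [1, 0, 0, 0],
}

TOLERANCE = 0.1  # assumed acceptable gap between groups
rates = [positive_rate(p) for p in groups.values()]
parity_gap = max(rates) - min(rates)
status = "violated" if parity_gap > TOLERANCE else "satisfied"
print(f"demographic parity gap: {parity_gap:.2f} ({status})")
```

Other criteria in the literature, such as equalized odds, instead compare error rates (false positive and false negative rates) across groups; a tool can satisfy one criterion while failing another.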