Pre-trial risk assessments

Algorithmic decision-making tools are only as good as their inputs: many effectively just report back correlations found in the data used to train them.

The algorithms in use nationwide vary widely in design, complexity, and inputs, and some rely on cutting-edge techniques like machine learning, the process by which rules are derived from patterns observed in training data. As a result, biases in a data set are not merely replicated in the results; they may be exacerbated.

For example, because police officers disproportionately arrest people of color, criminal justice data used to train these tools will perpetuate that correlation.
Thus, automated predictions based on such data, although they may appear objective or neutral, threaten to intensify unwarranted disparities in the justice system and to lend a misleading, undeserved imprimatur of impartiality to an institution that desperately needs fundamental change.
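The dynamic above can be sketched with a toy example. All numbers and group labels here are invented for illustration: a minimal "risk model" that learns only arrest base rates from historical records will reproduce whatever enforcement bias those records encode.

```python
# Toy sketch (hypothetical data): a "risk model" that learns base rates
# from historical arrest records reproduces the bias in those records.
from collections import Counter

# Hypothetical training records: (neighborhood, was_arrested).
# Suppose policing is heavier in neighborhood "A", so arrests there are
# over-represented relative to actual offending rates.
records = [("A", 1)] * 60 + [("A", 0)] * 40 + [("B", 1)] * 20 + [("B", 0)] * 80

def train(records):
    """Learn P(arrest | group) -- the only pattern in this toy data."""
    totals, arrests = Counter(), Counter()
    for group, arrested in records:
        totals[group] += 1
        arrests[group] += arrested
    return {g: arrests[g] / totals[g] for g in totals}

model = train(records)
print(model)  # {'A': 0.6, 'B': 0.2}
```

The model scores residents of "A" as three times riskier than residents of "B", purely because "A" was policed more heavily; the output looks like a neutral statistic but is just the enforcement pattern fed back.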

http://civilrightsdocs.info/pdf/criminal-justice/Pretrial-Risk-Assessment-Full.pdf


last updated November 2018