Bias In, Bias Out.

  • Additional Information
    • Abstract:
      Police, prosecutors, judges, and other criminal justice actors increasingly use algorithmic risk assessment to estimate the likelihood that a person will commit future crime. As many scholars have noted, these algorithms tend to have disparate racial impacts. In response, critics advocate three strategies of resistance: (1) the exclusion of input factors that correlate closely with race; (2) adjustments to algorithmic design to equalize predictions across racial lines; and (3) rejection of algorithmic methods altogether. This Article's central claim is that these strategies are at best superficial and at worst counterproductive because the source of racial inequality in risk assessment lies neither in the input data, nor in a particular algorithm, nor in algorithmic methodology per se. The deep problem is the nature of prediction itself. All prediction looks to the past to make guesses about future events. In a racially stratified world, any method of prediction will project the inequalities of the past into the future. This is as true of the subjective prediction that has long pervaded criminal justice as it is of the algorithmic tools now replacing it. Algorithmic risk assessment has revealed the inequality inherent in all prediction, forcing us to confront a problem much larger than the challenges of a new technology. Algorithms, in short, shed new light on an old problem. Ultimately, the Article contends, redressing racial disparity in prediction will require more fundamental changes in the way the criminal justice system conceives of and responds to risk. The Article argues that criminal law and policy should, first, more clearly delineate the risks that matter and, second, acknowledge that some kinds of risk may be beyond our ability to measure without racial distortion, in which case they cannot justify state coercion.
Further, to the extent that we can reliably assess risk, criminal system actors should strive whenever possible to respond to risk with support rather than restraint. Counterintuitively, algorithmic risk assessment could be a valuable tool in a system that supports the risky. [ABSTRACT FROM AUTHOR]
    • Copyright:
      Copyright of Yale Law Journal is the property of Yale Law Journal and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
    • Author Affiliations:
      Assistant Professor of Law, University of Georgia School of Law
  • Citations
    • ABNT:
      MAYSON, S. G. Bias In, Bias Out. Yale Law Journal, [s. l.], v. 128, n. 8, p. 2218–2300, 2019. Disponível em: Acesso em: 15 ago. 2020.
    • AMA:
      MAYSON SG. Bias In, Bias Out. Yale Law Journal. 2019;128(8):2218-2300. Accessed August 15, 2020.
    • APA:
      MAYSON, S. G. (2019). Bias In, Bias Out. Yale Law Journal, 128(8), 2218–2300.
    • Chicago/Turabian: Author-Date:
      MAYSON, SANDRA G. 2019. “Bias In, Bias Out.” Yale Law Journal 128 (8): 2218–2300.
    • Harvard:
      MAYSON, S. G. (2019) ‘Bias In, Bias Out’, Yale Law Journal, 128(8), pp. 2218–2300. Available at: (Accessed: 15 August 2020).
    • Harvard: Australian:
      MAYSON, SG 2019, ‘Bias In, Bias Out’, Yale Law Journal, vol. 128, no. 8, pp. 2218–2300, viewed 15 August 2020, .
    • MLA:
      MAYSON, SANDRA G. “Bias In, Bias Out.” Yale Law Journal, vol. 128, no. 8, June 2019, pp. 2218–2300. EBSCOhost,
    • Chicago/Turabian: Humanities:
      MAYSON, SANDRA G. “Bias In, Bias Out.” Yale Law Journal 128, no. 8 (June 2019): 2218–2300.
    • Vancouver/ICMJE:
      MAYSON SG. Bias In, Bias Out. Yale Law Journal [Internet]. 2019 Jun [cited 2020 Aug 15];128(8):2218–300. Available from: