Algorithms Have Built Racial Bias in Legal System-Accept or Not?
- DOI
- 10.2991/assehr.k.220105.224
- Keywords
- Algorithm; COMPAS; LSI-R; Racial Bias
- Abstract
Algorithms have been applied in various fields and have increased efficiency in most workplaces. Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) and the Level of Service Inventory-Revised (LSI-R) are two algorithms used for risk assessment in the legal system. However, both algorithms have built-in racial bias, and the risk scores they produce are affected by that bias, which in turn affects judges’ decisions because the risk score is a factor that can influence a defendant’s sentencing and bail. This paper describes the mechanisms of COMPAS and LSI-R; analyses why the algorithms carry built-in racial bias, namely that developers embed their own inherent racial bias into the algorithms and that many of the static factors the algorithms take into consideration can lead to racial bias; explains why racial bias arising from algorithms in the legal system is not acceptable; and proposes four methods for eliminating racial bias in algorithms while keeping them impartial and efficient.
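The actual scoring models behind COMPAS and LSI-R are proprietary, but the concern about static factors can be illustrated with a hypothetical weighted-sum risk score: when a score is built from static inputs that act as proxies for race (for example, neighborhood arrest rate), scores can diverge across groups even though race itself is never an explicit input. The sketch below is purely illustrative; the factor names, weights, and values are assumptions, not the real instruments.

```python
# Illustrative only: a toy weighted-sum risk score, NOT the real COMPAS or LSI-R
# model (both are proprietary). Factor names and weights are hypothetical and
# chosen to show how static inputs correlated with race can skew scores even
# when race is not an explicit input.

STATIC_WEIGHTS = {
    "prior_arrests": 2.0,             # static factor shaped by historical policing patterns
    "juvenile_arrests": 1.5,          # static factor: a juvenile record never changes
    "neighborhood_arrest_rate": 3.0,  # proxy variable often correlated with race
}

def toy_risk_score(defendant: dict) -> float:
    """Sum weighted static factors into a single 'risk' number (higher = riskier)."""
    return sum(weight * defendant.get(factor, 0.0)
               for factor, weight in STATIC_WEIGHTS.items())

# Two defendants with identical individual conduct but different neighborhoods
# receive different scores, showing how a proxy factor can encode group-level bias.
defendant_a = {"prior_arrests": 1, "juvenile_arrests": 0, "neighborhood_arrest_rate": 0.2}
defendant_b = {"prior_arrests": 1, "juvenile_arrests": 0, "neighborhood_arrest_rate": 0.9}

print(toy_risk_score(defendant_a))  # 2.6
print(toy_risk_score(defendant_b))  # 4.7
```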
- Copyright
- © 2022 The Authors. Published by Atlantis Press SARL.
- Open Access
- This is an open access article under the CC BY-NC license.
Cite this article
TY  - CONF
AU  - Junkai Zhang
AU  - Yuxuan Han
PY  - 2022
DA  - 2022/01/17
TI  - Algorithms Have Built Racial Bias in Legal System-Accept or Not?
BT  - Proceedings of the 2021 International Conference on Social Development and Media Communication (SDMC 2021)
PB  - Atlantis Press
SP  - 1217
EP  - 1221
SN  - 2352-5398
UR  - https://doi.org/10.2991/assehr.k.220105.224
DO  - 10.2991/assehr.k.220105.224
ID  - Zhang2022
ER  -