Artificial Intelligence Face Recognition Technology and Ethical Gender
- DOI
- 10.2991/assehr.k.220701.002
- Keywords
- Facial Recognition; Discrimination; Gaydar; Algorithm
- Abstract
In the development and application of artificial intelligence, outdated and biased data are often amplified within AI systems, so that hidden inequalities and forms of discrimination in human society come to threaten social security and personal privacy. As the technology continues to improve, AI is being applied in a variety of fields, such as image-based diagnosis, prisoner tracking, phone screen unlocking, and employee recruitment. In 2017, a Stanford report suggested that facial recognition could serve as a "gaydar," which caused considerable alarm at the time. Facial recognition has also begun to challenge gender equity, with Amazon as a notable example. Beyond the data-driven biases mentioned above, the execution and design logic of AI systems warrant further examination. This paper focuses on these biases and on how people can avoid them. It also demonstrates how AI technology and algorithms are used in everyday life, with the goal of informing readers about the reasoning behind AI facial recognition.
- Copyright
- © 2022 The Authors. Published by Atlantis Press SARL.
- Open Access
- This is an open access article distributed under the CC BY-NC 4.0 license.
Cite this article
TY  - CONF
AU  - Run Wang
PY  - 2022
DA  - 2022/07/04
TI  - Artificial Intelligence Face Recognition Technology and Ethical Gender
BT  - Proceedings of the 2022 International Conference on Science and Technology Ethics and Human Future (STEHF 2022)
PB  - Atlantis Press
SP  - 3
EP  - 7
SN  - 2352-5398
UR  - https://doi.org/10.2991/assehr.k.220701.002
DO  - 10.2991/assehr.k.220701.002
ID  - Wang2022
ER  -