Research on Human-Computer Interaction Technology based on Visual User Gesture Recognition
- DOI
- 10.2991/icmeit-19.2019.88
- Keywords
- Vision; User gesture recognition; Human-computer interaction; CamShift algorithm.
- Abstract
This paper discusses the background, current state, and applications of human-computer interaction and proposes a new interaction mode: one-hand gesture recognition. The main research techniques are examined in detail, the basic framework of a vision-based gesture recognition system is studied, and the principles and methods of vision-based gesture localization, gesture tracking, gesture segmentation, and gesture recognition are analyzed. For gesture tracking, the CamShift algorithm alone cannot cope with large-area motion interference under complex dynamic changes; the paper therefore augments it with a Kalman filter that estimates the next state, and shows that this achieves more effective gesture tracking. Drawing on computer vision, digital image processing, pattern recognition, and related theories and techniques, the proposed method correctly interprets gestures input through the visual channel and produces the responses required for natural human-computer interaction.
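The Kalman-filter idea in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: it models the hand centroid with a constant-velocity state [x, y, vx, vy], where in the full pipeline the measurement fed to `update()` would be the window center returned by CamShift each frame, and `predict()` supplies an estimated position when tracking is disturbed by large-area motion interference. The class name, state layout, and noise parameters are illustrative choices.

```python
import numpy as np

class Kalman2D:
    """Constant-velocity Kalman filter for a 2-D hand centroid (sketch)."""

    def __init__(self, dt=1.0, q=1e-2, r=1.0):
        # State transition: position advances by velocity * dt each frame.
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        # Measurement model: CamShift observes position only, not velocity.
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = q * np.eye(4)      # process noise (hand acceleration)
        self.R = r * np.eye(2)      # measurement noise (CamShift jitter)
        self.x = np.zeros(4)        # state estimate [x, y, vx, vy]
        self.P = np.eye(4)          # state covariance

    def predict(self):
        """Estimate the next state; use this to place the search window
        when the CamShift measurement is unreliable or lost."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]           # predicted centroid (x, y)

    def update(self, z):
        """Correct the estimate with a measured centroid z = (x, y)."""
        y = np.asarray(z, dtype=float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]           # corrected centroid (x, y)

# Illustrative use: a hand moving 5 px/frame along x; after a few
# predict/update cycles the filter extrapolates the motion on its own.
kf = Kalman2D()
for t in range(1, 6):
    kf.predict()
    kf.update((5.0 * t, 0.0))
predicted = kf.predict()            # position estimate with no new measurement
```

In the paper's setting, this prediction step is what lets tracking continue through frames where CamShift's color-based search is pulled away by large moving regions.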
- Copyright
- © 2019, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Jianfeng Liao
AU  - Qun Zhang
AU  - Jianping You
PY  - 2019/04
DA  - 2019/04
TI  - Research on Human-Computer Interaction Technology based on Visual User Gesture Recognition
BT  - Proceedings of the 3rd International Conference on Mechatronics Engineering and Information Technology (ICMEIT 2019)
PB  - Atlantis Press
SP  - 556
EP  - 561
SN  - 2352-538X
UR  - https://doi.org/10.2991/icmeit-19.2019.88
DO  - 10.2991/icmeit-19.2019.88
ID  - Liao2019/04
ER  -