A Frame Work Designing for Deep Fake Motion Detection using Deep Learning in Video Surveillance Systems
- DOI
- 10.2991/978-94-6463-314-6_18
- Keywords
- Face Motion; Deep Learning; Deep Fake detection
- Abstract
This research focuses on the continuous observation of objects in a given scene, producing a summary of those objects' behaviors and interactions. Because automated monitoring and analysis are rarely available in time, an operator must watch large volumes of video data attentively in real time to spot any anomalies or incidents; otherwise, the video can only be used as evidence after the unusual event has already occurred. To overcome these issues, we developed a deep learning model that detects faces in real time and analyzes their motion to identify deepfakes in video surveillance systems. We compared our model with existing models and achieved a high accuracy of 95.4%.
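The abstract describes a three-stage pipeline: per-frame face detection, a motion cue, and a fake/real classifier. The paper's implementation is not published here, so the following is only a minimal sketch of that pipeline. Assumptions not taken from the paper: OpenCV's Haar cascade stands in for the face detector, simple frame differencing stands in for the motion analysis, and `deepfake_cnn.h5` is a hypothetical pre-trained Keras model mapping a 128×128 RGB face crop to a fake probability.

```python
# Minimal sketch of the described pipeline: detect faces per frame, gate on a
# motion cue, then score each moving face with a fake/real classifier.
# NOTE: the Haar cascade, frame differencing, and "deepfake_cnn.h5" are
# illustrative assumptions, not the paper's actual components.
import cv2
import numpy as np
import tensorflow as tf

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
classifier = tf.keras.models.load_model("deepfake_cnn.h5")  # hypothetical weights

cap = cv2.VideoCapture(0)  # surveillance feed; 0 = default camera
prev_gray = None
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Frame differencing as a cheap motion cue: only classify faces that moved.
    motion = cv2.absdiff(gray, prev_gray) if prev_gray is not None else None
    prev_gray = gray
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
        if motion is not None and motion[y:y + h, x:x + w].mean() < 5.0:
            continue  # static face region; skip classification
        crop = cv2.resize(frame[y:y + h, x:x + w], (128, 128))
        crop = crop[np.newaxis].astype("float32") / 255.0  # add batch dim, normalize
        p_fake = float(classifier.predict(crop, verbose=0)[0][0])
        label = "FAKE" if p_fake > 0.5 else "REAL"
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
        cv2.putText(frame, f"{label} {p_fake:.2f}", (x, y - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 0, 255), 2)
    cv2.imshow("deepfake-motion", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

Gating classification on the motion score keeps per-frame cost low on mostly static surveillance feeds, which is consistent with the real-time requirement stated in the abstract.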
- Copyright
- © 2023 The Author(s)
- Open Access
- Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
- Cite this article
TY  - CONF
AU  - Srikanth Bethu
AU  - M. Ratna Sirisha
AU  - C. Kothai Andal
AU  - R. Gayathri
AU  - H. Chandramouli
AU  - R. Aruna
PY  - 2023
DA  - 2023/12/21
TI  - A Frame Work Designing for Deep Fake Motion Detection using Deep Learning in Video Surveillance Systems
BT  - Proceedings of the International e-Conference on Advances in Computer Engineering and Communication Systems (ICACECS 2023)
PB  - Atlantis Press
SP  - 179
EP  - 187
SN  - 2589-4900
UR  - https://doi.org/10.2991/978-94-6463-314-6_18
DO  - 10.2991/978-94-6463-314-6_18
ID  - Bethu2023
ER  -