On a Classification of Voiced/Unvoiced by using SNR for Speech Recognition
- DOI
- 10.2991/icacsei.2013.116
- Keywords
- Voiced, Speech production model, White noise, SNR, vocoder, LPC, VAD
- Abstract
As a medium for communicating information, speech is not only the most widely used but also the most natural. When we converse by speech, transmission of the intended information is affected by the noise level. In speech signal processing, speech enhancement is used to improve speech signals corrupted by noise. A noise estimation algorithm usually needs to be flexible across varying environments, and it can be applied only to silence regions to avoid the influence of the speech signal itself, so voiced regions must be located as a preprocessing step before noise estimation. We propose an SNR estimation method for speech signals that contain no silence region. For unvoiced speech, the noise-like excitation reflects the vocal tract characteristic, so the SNR can be estimated from the spectral distance between the spectrum of the received signal and the estimated vocal tract spectrum. The proposed estimation method for voiced speech and the method based on voiced/unvoiced region energy operate with simple logic as time-domain methods, while the estimation method for unvoiced regions can estimate the noise level of narrow-band speech signals using vocal tract properties. The method can be applied to the rate decision of a vocoder and used as pre-processing to set the threshold for noise reduction.
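The sketch below is a minimal illustration (not the authors' exact algorithm) of the unvoiced-region idea described in the abstract: a crude time-domain voiced/unvoiced decision from frame energy and zero-crossing rate, and, for unvoiced frames, a log-spectral distance between the observed spectrum and an LPC-modelled vocal-tract envelope. The sampling rate, frame length, LPC order, and decision thresholds are assumptions for illustration, and the mapping from spectral distance to an SNR value is paper-specific and not reproduced here.

```python
# Illustrative sketch only: V/UV decision plus LPC spectral-distance measure on
# unvoiced frames. Thresholds and parameters are placeholders, not the paper's values.
import numpy as np
from scipy.signal import freqz


def lpc_levinson(frame, order):
    """Autocorrelation-method LPC coefficients via the Levinson-Durbin recursion."""
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        k = -(r[i] + np.dot(a[1:i], r[i - 1:0:-1])) / err
        a[1:i + 1] = a[1:i + 1] + k * a[i - 1::-1][:i]
        err *= (1.0 - k * k)
    return a, err


def is_voiced(frame, energy_thresh=1e-3, zcr_thresh=0.25):
    """Crude time-domain V/UV decision: voiced = high energy and low zero-crossing rate."""
    energy = np.mean(frame ** 2)
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0
    return energy > energy_thresh and zcr < zcr_thresh


def spectral_distance_unvoiced(frame, order=12, nfft=512):
    """RMS log-spectral distance (dB) between the frame spectrum and its LPC envelope."""
    win = frame * np.hanning(len(frame))
    a, err = lpc_levinson(win, order)
    gain = np.sqrt(max(err, 1e-12))
    spec = np.abs(np.fft.rfft(win, nfft))
    _, h = freqz([gain], a, worN=nfft // 2 + 1)   # LPC all-pole envelope
    env = np.abs(h)
    eps = 1e-10
    return np.sqrt(np.mean((20 * np.log10(spec + eps) - 20 * np.log10(env + eps)) ** 2))


if __name__ == "__main__":
    fs, frame_len = 16000, 320                     # 20 ms frames at 16 kHz (assumed)
    rng = np.random.default_rng(0)
    frame = rng.standard_normal(frame_len) * 0.05  # stand-in for a noisy unvoiced frame
    if not is_voiced(frame):
        d = spectral_distance_unvoiced(frame)
        print(f"unvoiced frame, log-spectral distance = {d:.2f} dB")
```

In this toy setup the distance is only reported; the paper's contribution is how such a distance on unvoiced regions, together with voiced/unvoiced energy logic, is turned into an SNR estimate without requiring silence regions.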
- Copyright
- © 2013, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY - CONF
AU - Jongkuk Kim
AU - Hernsoo Hahn
PY - 2013/08
DA - 2013/08
TI - On a Classification of Voiced/Unvoiced by using SNR for Speech Recognition
BT - Proceedings of the 2013 International Conference on Advanced Computer Science and Electronics Information (ICACSEI 2013)
PB - Atlantis Press
SP - 472
EP - 476
SN - 1951-6851
UR - https://doi.org/10.2991/icacsei.2013.116
DO - 10.2991/icacsei.2013.116
ID - Kim2013/08
ER -