Blind Image Quality Assessment Based On Natural Scene Statistics And Deep Learning
- DOI
- 10.2991/iccsae-15.2016.174
- Keywords
- Blind/No-Reference; Natural Scene Statistics (NSS); Deep Belief Network (DBN); Image Quality Assessment (IQA)
- Abstract
Measurement of image and video quality is crucial for many applications, such as transmission, compression, and perception. Most traditional learning-based image quality assessment (IQA) methods build a mapping function between the distortion and the quality score. However, this mapping function is hard to build and not accurate enough to capture the relationship between the linguistic description and the numerical score. In this paper, we propose a new framework to blindly evaluate the quality of an image by learning the regular patterns of natural scene statistics (NSS). Our framework consists of two stages. First, the distorted image is represented by NSS features, and a Deep Belief Network (DBN) is used to classify these features into several distortion types. Second, a new qualitative quality pool is proposed that converts the distortion type and the degree of distortion of the image into a numerical score. The proposed distortion classification method is not only more natural than regression-based approaches but also more accurate. Experiments are conducted on the LIVE image quality assessment database, and extensive studies confirm the effectiveness and robustness of our framework.
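A minimal sketch of the two-stage pipeline described in the abstract, under several assumptions not taken from the paper: BRISQUE-style MSCN/GGD statistics stand in for the NSS features, two stacked BernoulliRBMs with a softmax output approximate the DBN classifier, and `QUALITY_POOL` and `degree_estimator` are hypothetical placeholders for the qualitative quality pool and the distortion-degree estimate.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.special import gamma
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import MinMaxScaler
from sklearn.pipeline import Pipeline


def mscn(img, sigma=7.0 / 6.0):
    """Mean-subtracted contrast-normalized (MSCN) coefficients of a grayscale image."""
    mu = gaussian_filter(img, sigma)
    var = gaussian_filter(img * img, sigma) - mu * mu
    return (img - mu) / (np.sqrt(np.abs(var)) + 1.0)


def ggd_params(x):
    """Moment-matching estimate of the generalized Gaussian shape and variance."""
    shapes = np.arange(0.2, 10.0, 0.01)
    ratio = gamma(1.0 / shapes) * gamma(3.0 / shapes) / gamma(2.0 / shapes) ** 2
    rho = np.mean(x ** 2) / (np.mean(np.abs(x)) ** 2 + 1e-12)
    return shapes[np.argmin(np.abs(ratio - rho))], np.mean(x ** 2)


def nss_features(img):
    """NSS feature vector: GGD parameters of MSCN coefficients at two scales."""
    feats = []
    for scale in (img, img[::2, ::2]):          # original and downsampled scale
        feats.extend(ggd_params(mscn(scale.astype(np.float64))))
    return np.array(feats)


# Stage 1: DBN-like classifier (stacked RBMs + softmax) over NSS features.
# It must first be fitted on NSS feature vectors labeled with distortion types.
classifier = Pipeline([
    ("scale", MinMaxScaler()),                  # RBMs expect inputs in [0, 1]
    ("rbm1", BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=20)),
    ("rbm2", BernoulliRBM(n_components=32, learning_rate=0.05, n_iter=20)),
    ("softmax", LogisticRegression(max_iter=1000)),
])

# Stage 2: hypothetical quality pool, one scoring rule per distortion type,
# assuming the distortion degree is normalized to [0, 1].
QUALITY_POOL = {
    "jpeg":  lambda degree: 100.0 - 60.0 * degree,
    "blur":  lambda degree: 100.0 - 70.0 * degree,
    "noise": lambda degree: 100.0 - 50.0 * degree,
}


def predict_quality(img, degree_estimator):
    """Classify the distortion type, then pool type and degree into a score."""
    feats = nss_features(img).reshape(1, -1)
    distortion = classifier.predict(feats)[0]
    return QUALITY_POOL[distortion](degree_estimator(feats))
```

This only illustrates the data flow; a closer reproduction of the paper would pre-train the RBM layers greedily as in a true DBN and derive the pooling rules from subjective (DMOS) scores rather than the fixed linear rules used here.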
- Copyright
- © 2016, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY - CONF
AU - De Ge
AU - Jianxin Song
PY - 2016/02
DA - 2016/02
TI - Blind Image Quality Assessment Based On Natural Scene Statistics And Deep Learning
BT - Proceedings of the 2015 5th International Conference on Computer Sciences and Automation Engineering
PB - Atlantis Press
SP - 939
EP - 945
SN - 2352-538X
UR - https://doi.org/10.2991/iccsae-15.2016.174
DO - 10.2991/iccsae-15.2016.174
ID - Ge2016/02
ER -