A hybrid system for distortion classification and image quality evaluation

Aladine Chetouani, Azeddine Beghdadi, Mohamed Deriche*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

37 Scopus citations

Abstract

Numerous Image Quality Measures (IQMs) have been proposed in the literature with varying degrees of success. While some IQMs handle particular artifacts well, they perform poorly on others. Researchers in the field agree that no universal IQM can efficiently estimate image quality across all degradations. In this paper, we overcome this limitation by proposing a new approach based on a degradation classification scheme that selects the most appropriate IQM for each type of degradation. To achieve this, each degradation type is treated as a distinct class, and the problem is formulated as a pattern recognition task. The classification of the different degradations is performed using simple Linear Discriminant Analysis (LDA). The proposed system is designed to cover a very large set of degradations commonly found in practical applications. The method is evaluated in terms of both recognition accuracy of the degradation type and overall image quality assessment, with excellent results compared to traditional approaches. An improvement of around 15% (in terms of correlation with subjective scores) is achieved across different databases.
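The two-stage pipeline the abstract describes — classify the degradation type with LDA, then dispatch to a per-class IQM — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 4-dimensional feature vectors, the three degradation classes, and the placeholder metrics in the dispatch table are all hypothetical stand-ins for the features and metrics actually used in the paper.

```python
# Hypothetical sketch of the hybrid scheme: (1) an LDA classifier predicts the
# degradation class from a feature vector, (2) a dispatch table applies the IQM
# deemed most appropriate for that class. Features and metrics are stand-ins.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic training set: three degradation classes (blur, noise, blocking),
# each clustered around a distinct mean in a 4-D feature space.
class_names = ["blur", "noise", "blocking"]
means = np.array([[2.0, 0.1, 0.5, 1.0],
                  [0.2, 2.5, 0.4, 0.3],
                  [0.5, 0.3, 2.2, 0.8]])
X = np.vstack([rng.normal(m, 0.3, size=(50, 4)) for m in means])
y = np.repeat(np.arange(3), 50)

# Stage 1: train the LDA degradation classifier.
lda = LinearDiscriminantAnalysis().fit(X, y)

# Stage 2: one IQM per class (placeholder callables; a real system would plug
# in metrics known to correlate well with subjective scores for that artifact).
iqm_table = {
    "blur":     lambda ref, dist: -float(np.mean((ref - dist) ** 2)),
    "noise":    lambda ref, dist: -float(np.mean(np.abs(ref - dist))),
    "blocking": lambda ref, dist: -float(np.max(np.abs(ref - dist))),
}

def assess(features, ref, dist):
    """Classify the degradation, then score quality with the class's IQM."""
    cls = class_names[int(lda.predict(features.reshape(1, -1))[0])]
    return cls, iqm_table[cls](ref, dist)
```

With well-separated clusters, a feature vector near the "noise" mean is routed to the noise-specific metric, e.g. `assess(np.array([0.2, 2.5, 0.4, 0.3]), ref, dist)` returns `("noise", …)`. The design point is that the classifier only has to get the artifact *type* right; quality estimation is delegated to whichever metric is strongest for that artifact.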

Original language: English
Pages (from-to): 948-960
Number of pages: 13
Journal: Signal Processing: Image Communication
Volume: 27
Issue number: 9
DOIs
State: Published - Oct 2012

Keywords

  • Classification
  • Degradations
  • Image quality metrics
  • Linear Discriminant Analysis

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering

