Volume: 6 | Article ID: art00039
Feature Based No-Reference Continuous Video Quality Prediction Model for Coded Stereo Video
DOI: 10.2352/CGIV.2012.6.1.art00039  |  Published Online: January 2012
Abstract

In this paper, we propose a continuous no-reference video quality evaluation model for MPEG-2 MP@ML coded stereoscopic video based on spatial, temporal, and disparity features that incorporate human visual system characteristics. We consider edge distortion to be the dominant cue for perceived spatial distortion in an image frame, and its visibility depends strongly on the smooth and non-smooth areas of the frame. We further argue that perceived depth in an image or video is governed mainly by its central objects and structures, so depth visibility depends on object distance, categorized as near, far, and very far. Temporal perception, in turn, is driven chiefly by video jerkiness, which depends on both motion and scene content. Accordingly, the method evaluates segmented local features, namely edge distortion measured separately over smooth and non-smooth areas and depth measures based on object distance, and estimates jerkiness from segmented temporal information. Different weighting factors are then applied to the edge-distortion and depth features to obtain the overall features of a temporal segment; all features are computed separately for each temporal segment. A subjective stereo video database containing both symmetrically and asymmetrically coded videos is used to verify the model, and the results indicate that the proposed model achieves sufficient prediction performance.
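The abstract describes the pooling step only in general terms. The sketch below illustrates, under stated assumptions, how weighted per-segment features could be combined into a continuous quality trace and an overall score; the feature names, weights, and the jerkiness penalty are illustrative placeholders, not the paper's actual formulation.

```python
import numpy as np

def segment_score(edge_smooth, edge_nonsmooth,
                  depth_near, depth_far, depth_very_far,
                  jerkiness,
                  w_edge=(0.6, 0.4), w_depth=(0.5, 0.3, 0.2), alpha=1.0):
    """Hypothetical per-segment quality: weighted edge-distortion and
    distance-based depth features, penalized by temporal jerkiness.
    Weights and functional form are assumptions for illustration only."""
    edge = w_edge[0] * edge_smooth + w_edge[1] * edge_nonsmooth
    depth = (w_depth[0] * depth_near
             + w_depth[1] * depth_far
             + w_depth[2] * depth_very_far)
    return (edge + depth) / (1.0 + alpha * jerkiness)

def continuous_quality(per_segment_features):
    """Score each temporal segment independently and return the
    continuous (per-segment) trace plus a pooled overall score."""
    trace = np.array([segment_score(**f) for f in per_segment_features])
    return trace, float(trace.mean())

# Example usage with made-up feature values for two temporal segments.
segments = [
    dict(edge_smooth=0.8, edge_nonsmooth=0.7,
         depth_near=0.9, depth_far=0.6, depth_very_far=0.4, jerkiness=0.1),
    dict(edge_smooth=0.6, edge_nonsmooth=0.5,
         depth_near=0.7, depth_far=0.5, depth_very_far=0.3, jerkiness=0.3),
]
trace, overall = continuous_quality(segments)
print(trace, overall)
```

The per-segment scoring mirrors the abstract's claim that all features are computed separately for each temporal segment; only the averaging used to pool segments into an overall score is an added assumption here.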

  Cite this article 

Z. M. Parvez Sazzad, Rafik Bensalma, Mohamed Chaker Larabi, "Feature Based No-Reference Continuous Video Quality Prediction Model for Coded Stereo Video," in Proc. IS&T CGIV 2012: 6th European Conf. on Colour in Graphics, Imaging, and Vision, 2012, pp. 217-225, https://doi.org/10.2352/CGIV.2012.6.1.art00039

  Copyright statement 
Copyright © Society for Imaging Science and Technology 2012
Conference on Colour in Graphics, Imaging, and Vision
ISSN: 2158-6330
Society for Imaging Science and Technology
7003 Kilworth Lane, Springfield, VA 22151, USA