Title: Display-dependent preprocessing of depth maps based on just-noticeable depth difference modeling
Authors: De Silva, Demuni V.S.X.; Worrall, Stewart T.
Keywords: 3D video; Just-noticeable depth difference (JNDD)
Issue Date: 2011
Publisher: © IEEE
Citation: DE SILVA, V. ... et al., 2011. Display-dependent preprocessing of depth maps based on just-noticeable depth difference modeling. IEEE Journal of Selected Topics in Signal Processing, 5 (2), pp. 335-351.
Abstract: This paper addresses the sensitivity of human vision to spatial depth variations in a 3-D video scene, seen on a stereoscopic display, based on an experimental derivation of a just-noticeable depth difference (JNDD) model. The main aim is to exploit the depth perception sensitivity of humans to suppress unnecessary spatial depth details, hence reducing the transmission overhead allocated to depth maps. Based on the derived JNDD model, depth map sequences are preprocessed to suppress the depth details that are not perceivable by viewers and to minimize the rendering artefacts that arise from optical noise, where the optical noise is triggered by inaccuracies in the depth estimation process. Theoretical and experimental evidence is provided to illustrate that the proposed depth-adaptive preprocessing filter does not alter the 3-D visual quality or the view synthesis quality for free-viewpoint video applications. Experimental results suggest that the bit rate for depth map coding can be reduced by up to 78% for depth maps captured with depth-range cameras, and by up to 24% for depth maps estimated with computer vision algorithms, without affecting the 3-D visual quality or the arbitrary view synthesis quality.
Description: Closed access.
Publisher Link: http://dx.doi.org/10.1109/JSTSP.2011.2108113
Appears in Collections: Closed Access (Loughborough University London)