Required Number of Quantization Bits for CIE XYZ Signals Applied to Various Transforms in Digital Cinema Systems


IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences   Vol.E90-A    No.5    pp.1072-1084
Publication Date: 2007/05/01
Online ISSN: 1745-1337
DOI: 10.1093/ietfec/e90-a.5.1072
Print ISSN: 0916-8508
Type of Manuscript: PAPER
Category: Image
Keywords: digital cinema, quantization, color difference, XYZ signal, CIE 1976 L*a*b*


To keep pace with the rapid progress of high-quality imaging systems, Digital Cinema Initiatives (DCI) has issued digital cinema standards that cover all processes from production through distribution and display. Various measures are used to assess image quality, and among these, the required number of quantization bits is one of the most important factors in realizing the very high image quality needed for cinema. DCI specified a bit depth of 12 bits by applying Barten's model to the luminance signal alone; however, actual cinema applications use color signals, so this value lacks a sufficient theoretical basis. This paper first investigates the required number of quantization bits by computer simulation in discrete 3-D space for color images defined by CIE XYZ signals. Next, the required number of quantization bits is formulated by applying a Taylor expansion in the continuous-value region. As a result, we show that 13.04 bits, 11.38 bits, and 10.16 bits are necessary for intensity, density, and gamma-corrected signal quantization, respectively, in digital cinema applications. Since these results coincide with the calculations in the discrete-value region, the proposed analysis method drastically reduces the computer simulation time needed to obtain the required number of quantization bits for color signals.
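The simulation approach described above can be illustrated with a minimal sketch: quantize a normalized XYZ triplet at a given bit depth (uniform "intensity" quantization, one of the three schemes the paper compares) and measure the resulting CIE 1976 L*a*b* color difference ΔE*ab. This is not the authors' code; the equal-energy white point (Xn = Yn = Zn = 1.0) and the coarse search grid are assumptions for illustration only.

```python
import math

WHITE = (1.0, 1.0, 1.0)  # assumed equal-energy white point

def _f(t):
    # CIE L*a*b* nonlinearity (cube root with linear segment near zero)
    d = 6 / 29
    return t ** (1 / 3) if t > d ** 3 else t / (3 * d ** 2) + 4 / 29

def xyz_to_lab(xyz, white=WHITE):
    # Standard CIE 1976 L*a*b* conversion
    fx, fy, fz = (_f(v / w) for v, w in zip(xyz, white))
    return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

def quantize(xyz, bits):
    # Uniform (intensity) quantization of each normalized component
    levels = 2 ** bits - 1
    return tuple(round(v * levels) / levels for v in xyz)

def delta_e(xyz1, xyz2):
    # CIE 1976 color difference between two XYZ triplets
    l1, l2 = xyz_to_lab(xyz1), xyz_to_lab(xyz2)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(l1, l2)))

def max_delta_e(bits, steps=11):
    # Worst-case quantization error over a coarse grid of XYZ colors
    worst = 0.0
    for i in range(1, steps + 1):
        for j in range(1, steps + 1):
            for k in range(1, steps + 1):
                xyz = (i / steps, j / steps, k / steps)
                worst = max(worst, delta_e(xyz, quantize(xyz, bits)))
    return worst
```

In the paper's framework, the required bit depth is the smallest one for which the worst-case color difference stays below a perceptibility threshold; sweeping `bits` in a loop like this over a sufficiently fine grid is the kind of discrete-space search whose cost the proposed Taylor-expansion formulation avoids.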