Four Limits in Probability and Their Roles in Source Coding

Hiroki KOGA  

IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences   Vol.E94-A   No.11   pp.2073-2082
Publication Date: 2011/11/01
Online ISSN: 1745-1337
DOI: 10.1587/transfun.E94.A.2073
Print ISSN: 0916-8508
Type of Manuscript: Special Section PAPER (Special Section on Information Theory and Its Applications)
Category: Source Coding
Keywords: coding theorem, information-spectrum methods, optimistic coding, strong converse property, smooth Rényi entropy


In the information-spectrum methods proposed by Han and Verdú, quantities defined by using the limit superior (or inferior) in probability play crucial roles in many problems in information theory. In this paper, we introduce two nonconventional quantities defined in probabilistic ways. After clarifying the basic properties of these quantities, we show that they have operational meanings in the ε-coding problem of a general source in both the ordinary and optimistic senses. The two quantities can be used not only for obtaining variations of the strong converse theorem but also for establishing upper and lower bounds on the width of the entropy spectrum. We also show that the two quantities can be expressed in terms of the smooth Rényi entropy of order zero.
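As a hedged illustration of the objects the abstract refers to (not code from the paper itself): in information-spectrum methods one studies the distribution of the normalized self-information (1/n) log2(1/P(X^n)), the entropy spectrum, and the limit superior/inferior in probability of this sequence. The sketch below, with all function names and parameters chosen for illustration, samples the empirical entropy spectrum of an i.i.d. Bernoulli(p) source; for such a memoryless source the spectrum concentrates at the entropy H(p) (the AEP), so all four limits coincide, whereas for a general source the spectrum may have positive width.

```python
import math
import random

def entropy(p):
    # Binary entropy H(p) in bits.
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def spectrum_samples(p, n, trials, rng):
    # Empirical samples of the entropy spectrum (1/n) log2 (1/P(X^n))
    # for an i.i.d. Bernoulli(p) source of block length n.
    samples = []
    for _ in range(trials):
        ones = sum(1 for _ in range(n) if rng.random() < p)
        log_prob = ones * math.log2(p) + (n - ones) * math.log2(1 - p)
        samples.append(-log_prob / n)
    return samples

if __name__ == "__main__":
    p = 0.3
    vals = spectrum_samples(p, n=2000, trials=500, rng=random.Random(0))
    # For this memoryless source the empirical spectrum clusters tightly
    # around H(p) ~ 0.881 bits; a nonzero spread (max - min) that persists
    # as n grows is what the paper's width bounds quantify for general sources.
    print(entropy(p), min(vals), max(vals))
```

For a source with memory or nonstationarity, `min(vals)` and `max(vals)` would approximate the inf- and sup-entropy rates, and the gap between them is the entropy-spectrum width that the paper bounds via its two new quantities.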