On a Relationship between the Correct Probability of Estimation from Correlated Data and Mutual Information

Yasutada OOHAMA  

Publication
IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences   Vol.E101-A   No.12   pp.2205-2209
Publication Date: 2018/12/01
Online ISSN: 1745-1337
DOI: 10.1587/transfun.E101.A.2205
Type of Manuscript: Special Section LETTER (Special Section on Information Theory and Its Applications)
Category: Shannon theory
Keyword: 
mutual information,  correct probability of decoding,  one helper source coding problem,  



Summary: 
Let X and Y be two correlated discrete random variables. We consider the estimation of X from the encoded data φ(Y) of Y, where φ is an encoder function. We derive an inequality describing the relationship between the correct probability of estimation and the mutual information between X and φ(Y). This inequality may be useful for the security analysis of cryptographic systems when the success probability of estimating secret data is used as a security criterion. It also provides an intuitive meaning for the secrecy exponent in the strong secrecy criterion.