Constant term of one-shot channel simulation

Description of constant

The constant term of one-shot channel simulation [HJMR07], [BG14], [LEG18], [Li25] is given as (we use the definition in [Li25])

\[C_{32}=\limsup_{t\to\infty}\Big(\sup_{p_{X,Y}:\,I(X;Y)=t}\ \inf_{p_{S|X,Y}:\,I(X;S)=0}H(Y|S)-t-\log_{2}t\Big),\]

where $H(Y|S)=H(Y,S)-H(S)$ is the conditional entropy (in bits), and $I(X;Y)=H(X)+H(Y)-H(X,Y)$ is the mutual information (in bits). The supremum and infimum are over arbitrary finite discrete joint probability distributions $p_{X,Y}$ with $I(X;Y)=t$ and finite discrete conditional distributions $p_{S|X,Y}$ with $I(X;S)=0$, respectively.
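Both quantities are easy to evaluate numerically for finite alphabets. The sketch below (the joint distribution is an arbitrary illustrative choice, not taken from the cited papers) computes a mutual information and a conditional entropy directly from these identities.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector; zero entries are skipped."""
    return -sum(v * math.log2(v) for v in p if v > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y); joint is a 2-D list with rows indexed by x."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return entropy(px) + entropy(py) - entropy([v for row in joint for v in row])

def conditional_entropy(joint):
    """H(Y|S) = H(Y,S) - H(S); joint is over (y, s) with columns indexed by s."""
    ps = [sum(col) for col in zip(*joint)]
    return entropy([v for row in joint for v in row]) - entropy(ps)

# Illustrative joint pmf: X is a uniform bit and Y agrees with X with probability 0.9.
joint = [[0.45, 0.05],
         [0.05, 0.45]]
print(mutual_information(joint))   # ≈ 0.531 bits
print(conditional_entropy(joint))  # ≈ 0.469 bits
```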

Equivalently, it is the infimum of all $\alpha\in\mathbb{R}$ such that there exists $\beta>0$ with the following property: for every pair of jointly distributed random variables $X,Y$ with finite support, one can construct a random variable $S$ with finite support (jointly distributed with $X,Y$, possibly after extending the probability space) satisfying $I(X;S)=0$ and $H(Y|S)\le I(X;Y)+\log_{2}(I(X;Y)+\beta)+\alpha$ [LEG18], [Li25].
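As a toy illustration of this characterization (this is not the construction from the cited papers, which requires stronger tools to handle general $p_{X,Y}$): when $Y=X$ is uniform on $n$ points, the dithered variable $S=(X+U)\bmod n$, with $U$ uniform and independent of $X$, is itself independent of $X$, yet given $S$ the value of $Y$ is uniform over $n$ possibilities, so $I(X;S)=0$ and $H(Y|S)=\log_{2}n=I(X;Y)$, satisfying the inequality for any $\alpha\ge 0$ and $\beta>0$. A numerical check:

```python
import math

def entropy(p):
    """Shannon entropy in bits; p is an iterable of probabilities."""
    return -sum(v * math.log2(v) for v in p if v > 0)

n = 8  # illustrative alphabet size; a power of two keeps the arithmetic exact

# X uniform on {0..n-1}, Y = X, so I(X;Y) = log2(n) = 3 bits.
# Take U uniform and independent of X, and set S = (X + U) mod n.
p_xs = {(x, (x + u) % n): 1 / n**2 for x in range(n) for u in range(n)}
p_x = [1 / n] * n
p_s = [1 / n] * n  # S is uniform because U is

i_xs = entropy(p_x) + entropy(p_s) - entropy(p_xs.values())
h_y_given_s = entropy(p_xs.values()) - entropy(p_s)  # Y = X, so (Y,S) ~ (X,S)

print(i_xs)         # 0.0: S carries no information about X
print(h_y_given_s)  # 3.0 bits, equal to I(X;Y) = log2(n)
```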

Known upper bounds

| Bound | Reference | Comments |
| --- | --- | --- |
| $<\infty$ | [HJMR07], [HJMR10], [BG14] | |
| $3.871$ | [LEG18] | Usually reported as $4$. |
| $3.732$ | [LA21] | |
| $3.45$ | [FT23] | |
| $2$ | [Li24b], [Li24a] | |
| $\sum_{k=1}^{\infty}2^{-k-1}k\log_{2}k-\log_{2}\log_{2}e\approx 0.76$ | [Li25] | |
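The numerical value of the [Li25] bound is easy to verify: the terms of the series decay roughly like $k\log_{2}k/2^{k}$, so truncating after a few hundred terms leaves a negligible error. A quick check:

```python
import math

# Truncated series from the [Li25] bound; the k = 1 term vanishes since log2(1) = 0,
# and the terms decay geometrically, so truncation at k = 200 is far more than enough.
series = sum(2 ** (-k - 1) * k * math.log2(k) for k in range(2, 200))
const = series - math.log2(math.log2(math.e))
print(round(const, 2))  # 0.76, matching the table entry
```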

Known lower bounds

| Bound | Reference | Comments |
| --- | --- | --- |
| $>-\infty$ | [BG14] | |
| $-1$ | [LEG18] | |
| $-\log_{2}\log_{2}e\approx-0.53$ | [Li25] | |

Additional comments

References