Constant term of one-shot channel simulation
Description of constant
The constant term of one-shot channel simulation [HJMR07], [BG14], [LEG18], [Li25] is given as (we use the definition in [Li25])
\[C_{32}=\limsup_{t\to\infty}\left(\sup_{p_{X,Y}:\,I(X;Y)=t}\;\inf_{p_{S|X,Y}:\,I(X;S)=0}H(Y\lvert S)-t-\log_{2}t\right),\]where $H(Y\lvert S)=H(Y,S)-H(S)$ is the conditional entropy (in bits), and $I(X;Y)=H(X)+H(Y)-H(X,Y)$ is the mutual information (in bits). The supremum and infimum are over arbitrary finite discrete joint probability distributions $p_{X,Y}$ with $I(X;Y)=t$ and finite discrete conditional distributions $p_{S\lvert X,Y}$ with $I(X;S)=0$, respectively.
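To make the quantities in the definition concrete, here is a minimal numerical sketch. The doubly symmetric toy distribution below is our own illustration, not taken from the references: $X$ is a uniform bit, $Y=X\oplus N$ with $N\sim\mathrm{Bern}(0.11)$, and we take $S=N$, which is independent of $X$ and hence is one admissible choice in the inner infimum.

```python
import itertools
import math

def entropy(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

# Toy joint pmf p(x, y, s): X is a uniform bit, N ~ Bern(0.11),
# Y = X xor N, and we pick S = N (independent of X, so I(X;S) = 0).
p_noise = 0.11
pmf = {}
for x, n in itertools.product((0, 1), repeat=2):
    pmf[(x, x ^ n, n)] = 0.5 * (p_noise if n else 1 - p_noise)

def marginal(coords):
    """Marginal pmf over the given coordinate indices (0=X, 1=Y, 2=S)."""
    m = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[i] for i in coords)
        m[key] = m.get(key, 0.0) + p
    return list(m.values())

H_X, H_Y, H_S = (entropy(marginal([i])) for i in range(3))
H_XY = entropy(marginal([0, 1]))
H_XS = entropy(marginal([0, 2]))
H_YS = entropy(marginal([1, 2]))

I_XY = H_X + H_Y - H_XY        # mutual information I(X;Y)
I_XS = H_X + H_S - H_XS        # equals 0 here, since S is independent of X
H_Y_given_S = H_YS - H_S       # conditional entropy H(Y|S)
```

For this particular $S$ one gets $H(Y\lvert S)=1$ bit and $I(X;Y)=1-h(0.11)\approx 0.50$, so the gap $H(Y\lvert S)-I(X;Y)$ equals the binary entropy $h(0.11)$; the infimum in the definition is over all admissible $S$, of which this is just one.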
Equivalently, it is the infimum of all $\alpha\in\mathbb{R}$ for which there exists $\beta>0$ such that, for every pair of jointly distributed random variables $X,Y$ with finite support, we can construct a random variable $S$ with finite support (jointly distributed with $X,Y$, possibly after extending the probability space) satisfying $I(X;S)=0$ and $H(Y\lvert S)\le I(X;Y)+\log_{2}(I(X;Y)+\beta)+\alpha$ [LEG18], [Li25].
Known upper bounds
| Bound | Reference | Comments |
|---|---|---|
| $<\infty$ | [HJMR07], [HJMR10], [BG14] | |
| $3.871$ | [LEG18] | Usually reported as 4. |
| $3.732$ | [LA21] | |
| $3.45$ | [FT23] | |
| $2$ | [Li24b], [Li24a] | |
| $\sum_{k=1}^{\infty}2^{-k-1}k\log_{2}k-\log_{2}\log_{2}e \approx 0.76$ | [Li25] | |
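The series in the last upper bound converges geometrically, so its numerical value is easy to verify; the following sketch (the truncation point is our own choice) evaluates the constant:

```python
import math

# Upper bound from [Li25]: sum_{k>=1} 2^{-k-1} k log2(k) - log2(log2(e)).
# The summand decays geometrically, so truncating at k = 200 is far more
# than enough for double precision.
series = sum(2.0 ** (-k - 1) * k * math.log2(k) for k in range(1, 201))
constant = series - math.log2(math.log2(math.e))
print(round(constant, 4))  # about 0.7598
```

Note that the same constant $\log_{2}\log_{2}e \approx 0.5288$ appears negated in the best known lower bound and in the conjectured value of $C_{32}$.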
Known lower bounds
| Bound | Reference | Comments |
|---|---|---|
| $>-\infty$ | [BG14] | |
| $-1$ | [LEG18] | |
| $-\log_{2}\log_{2}e\approx-0.53$ | [Li25] | |
Additional comments
- The bound $-1\le C_{32} <3.871$ was given in [LEG18], where the following result (called the “strong functional representation lemma”) was shown: for every pair of (not necessarily discrete) random variables $X,Y$, there exists a (not necessarily discrete) random variable $S$ such that $I(X;S)=H(Y\lvert X,S)=0$ and $H(Y\lvert S)\le I(X;Y)+\log_{2}(I(X;Y)+1)+3.871$.
- It is conjectured that $C_{32}=-\log_{2}\log_{2}e$ [Li25].
- Bounds on $C_{32}$ have applications in communication complexity [HJMR07], [BG14], distributed channel simulation [FT23], [Li24a], lossy compression [LEG18], [LHB22], the rate-distortion-perception trade-off [TW21], and the privacy-utility trade-off [ZOS23].
References
- [BG14] Mark Braverman and Ankit Garg, Public vs private coin in bounded-round information, International Colloquium on Automata, Languages, and Programming, Springer, 2014, pp. 502-513.
- [FT23] Gergely Flamich and Lucas Theis, Adaptive greedy rejection sampling, 2023 IEEE International Symposium on Information Theory (ISIT), IEEE, 2023, pp. 454-459.
- [HJMR07] Prahladh Harsha, Rahul Jain, David McAllester, and Jaikumar Radhakrishnan, The communication complexity of correlation, Twenty-Second Annual IEEE Conference on Computational Complexity (CCC’07), IEEE, 2007, pp. 10-23.
- [HJMR10] Prahladh Harsha, Rahul Jain, David McAllester, and Jaikumar Radhakrishnan, The communication complexity of correlation, IEEE Transactions on Information Theory 56 (2010), no. 1, 438-449.
- [LA21] Cheuk Ting Li and Venkat Anantharam, A unified framework for one-shot achievability via the Poisson matching lemma, IEEE Transactions on Information Theory 67 (2021), no. 5, 2624-2651.
- [LEG18] Cheuk Ting Li and Abbas El Gamal, Strong functional representation lemma and applications to coding theorems, IEEE Transactions on Information Theory 64 (2018), no. 11, 6967-6978.
- [LHB22] Eric Lei, Hamed Hassani, and Shirin Saeedi Bidokhti, Neural estimation of the rate-distortion function with applications to operational source coding, IEEE Journal on Selected Areas in Information Theory 3 (2022), no. 4, 674-686.
- [Li24a] Cheuk Ting Li, Channel simulation: Theory and applications to lossy compression and differential privacy, Foundations and Trends in Communications and Information Theory 21 (2024), no. 6, 847-1106.
- [Li24b] Cheuk Ting Li, Pointwise redundancy in one-shot lossy compression via Poisson functional representation, International Zurich Seminar on Information and Communication (IZS 2024), 2024.
- [Li25] Cheuk Ting Li, Discrete layered entropy, conditional compression and a tighter strong functional representation lemma, 2025 IEEE International Symposium on Information Theory (ISIT), 2025. Full version: arXiv preprint arXiv:2501.13736.
- [TW21] Lucas Theis and Aaron B. Wagner, A coding theorem for the rate-distortion-perception function, Neural Compression: From Information Theory to Applications Workshop at ICLR 2021, 2021.
- [ZOS23] Amirreza Zamani, Tobias J Oechtering, and Mikael Skoglund, On the privacy-utility trade-off with and without direct access to the private data, IEEE Transactions on Information Theory 70 (2023), no. 3, 2177-2200.