Fourier Entropy-Influence constant
Description of constant
Let $f:\{-1,1\}^n\to\{-1,1\}$ be a Boolean function with Fourier expansion $f(x)=\sum_{S\subseteq[n]}\hat f(S)\chi_S(x)$. Its spectral entropy is
\[H(\hat f^2)\ :=\ \sum_{S\subseteq[n]}\hat f(S)^2\log_2\frac{1}{\hat f(S)^2},\]and its total influence is
\[\mathrm{Inf}(f)\ :=\ \sum_{S\subseteq[n]}\hat f(S)^2\,\lvert S\rvert.\]Friedgut and Kalai conjectured that there is a universal constant $C>0$ such that
\[H(\hat f^2)\ \le\ C\,\mathrm{Inf}(f)\]for every Boolean $f$.
We define
\[C_{71}\ :=\ \inf\Bigl\{C>0:\ H(\hat f^2)\le C\,\mathrm{Inf}(f)\ \text{for all Boolean }f\Bigr\}.\]The conjecture is equivalent to $C_{71}<\infty$, and this remains open. [ODWZ2011-open-problem]
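Both quantities in the conjecture can be computed exactly by brute force for small $n$; a minimal Python sketch (all helper names are illustrative, not taken from the cited papers) evaluates the entropy/influence ratio for 3-bit majority:

```python
# Brute-force spectral entropy and total influence of a small Boolean
# function f : {-1,1}^n -> {-1,1}. Helper names are illustrative only.
import math
from itertools import product

def fourier_coefficients(f, n):
    """f_hat(S) = E_x[f(x) * chi_S(x)], averaged over all 2^n inputs."""
    points = list(product([-1, 1], repeat=n))
    coeffs = {}
    for mask in range(1 << n):
        S = [i for i in range(n) if mask >> i & 1]
        total = sum(f(x) * math.prod(x[i] for i in S) for x in points)
        coeffs[frozenset(S)] = total / len(points)
    return coeffs

def spectral_entropy(coeffs):
    # H(f_hat^2) = sum_S f_hat(S)^2 * log2(1 / f_hat(S)^2), zero terms skipped
    return sum(c * c * math.log2(1 / (c * c)) for c in coeffs.values() if c != 0)

def total_influence(coeffs):
    # Inf(f) = sum_S f_hat(S)^2 * |S|
    return sum(c * c * len(S) for S, c in coeffs.items())

# Majority on 3 bits: f_hat(S) = 1/2 on each singleton, -1/2 on {1,2,3},
# so H = 4 * (1/4) * log2(4) = 2 and Inf = 3 * (1/4) + (1/4) * 3 = 3/2.
maj3 = lambda x: 1 if sum(x) > 0 else -1
coeffs = fourier_coefficients(maj3, 3)
H = spectral_entropy(coeffs)
I = total_influence(coeffs)
print(H, I, H / I)  # 2.0 1.5 1.333...
```

The ratio $4/3$ for 3-bit majority is far below the $6.278$ lower bound; for the parity $\chi_{[n]}$ the spectrum is concentrated on one set, so $H = 0$ and the ratio attains the trivial lower bound $0$.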
An explicit construction of O’Donnell and Tan [OT2013] gives
\[C_{71}\ \ge\ 6.278.\]Hence the best established range is
\[6.278\ \le\ C_{71}\ \le\ \infty.\]
Known upper bounds
| Bound | Reference | Comments |
|---|---|---|
| $\infty$ | [ODWZ2011-open-problem] | No finite universal constant is currently known. |
Known lower bounds
| Bound | Reference | Comments |
|---|---|---|
| $0$ | | Trivial bound from nonnegativity of the spectral entropy. |
| $6.278$ | [OT2013] | Explicit example with ratio at least $6.278$. [OT2013-lb-6-278] |
Additional comments and links
References
- [ODWZ2011] O’Donnell, Ryan; Wright, John; Zhou, Yuan. The Fourier Entropy-Influence Conjecture for Certain Classes of Boolean Functions. In: Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques (APPROX/RANDOM 2011), Lecture Notes in Computer Science, pp. 330-341 (2011). DOI: https://doi.org/10.1007/978-3-642-22006-7_28. Author PDF: https://www.cs.cmu.edu/~odonnell/papers/fei.pdf. Google Scholar
- [ODWZ2011-conj-attr] loc: Author PDF p.1, Abstract quote: “In 1996, Friedgut and Kalai made the Fourier Entropy-Influence Conjecture: For every Boolean function $f : \{-1, 1\}^n \to \{-1, 1\}$ it holds that $H[\hat f^2] \le C \cdot I[f]$, where $H[\hat f^2]$ is the spectral entropy of $f$, $I[f]$ is the total influence of $f$, and $C$ is a universal constant.”
- [ODWZ2011-defs] loc: Author PDF p.2, Section 1, paragraph after the conjecture display quote: “The quantity $H[\hat{f}^2] = \sum \hat f(S)^2 \log \frac{1}{\hat f(S)^2}$ on the left is the spectral entropy or Fourier entropy of $f$. It ranges between $0$ and $n$ and measures how ‘spread out’ $f$’s Fourier spectrum is. The quantity $I[f] = \sum \hat f(S)^2\lvert S\rvert$ appearing on the right is the total influence or average sensitivity of $f$.”
- [ODWZ2011-open-problem] loc: Author PDF p.2, Section 1, paragraph beginning “One of the most longstanding…” quote: “One of the most longstanding and important open problems in the field is the Fourier Entropy-Influence (FEI) Conjecture made by Friedgut and Kalai in 1996 [6]:”
- [OT2013] O’Donnell, Ryan; Tan, Li-Yang. A Composition Theorem for the Fourier Entropy-Influence Conjecture. In: Automata, Languages, and Programming (ICALP 2013), Lecture Notes in Computer Science, pp. 780-791 (2013). DOI: https://doi.org/10.1007/978-3-642-39206-1_66. arXiv PDF: https://arxiv.org/pdf/1304.1347.pdf. Google Scholar
Contribution notes
Prepared with assistance from ChatGPT 5.2 Pro.