Ising perceptron capacity threshold
Description of constant
Let $G = (g_{ij})$ be an $M \times N$ random matrix with independent standard Gaussian entries, and let \(Z(G) := \left| \left\{ \sigma \in \{-1,1\}^N : G \sigma \geq 0 \text{ coordinatewise} \right\} \right|.\) This is the zero-margin binary (or Ising) perceptron. Write $M = \lfloor \alpha N \rfloor$.
Define $C_{80}$ to be the infimum of all $\alpha > 0$ such that \(\mathbb{P}(Z(G) > 0) \to 0 \qquad \text{as } N \to \infty.\) Equivalently, $C_{80}$ is the asymptotic storage-capacity / satisfiability threshold of the zero-margin Ising perceptron. Sharp-threshold results [X2021], [NS2023] show that this formulation captures the threshold location up to $o(1)$.
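For concreteness, at small sizes both \(Z(G)\) and the satisfiability probability \(\mathbb{P}(Z(G) > 0)\) can be estimated by brute force. The sketch below is illustrative only (the names `count_solutions` and `satisfiable_fraction` are not from the literature), and finite-size effects at small \(N\) are substantial, so it should not be read as locating the threshold.

```python
import numpy as np

def count_solutions(G):
    """Brute-force Z(G): the number of sign vectors sigma in {-1,1}^N
    with G @ sigma >= 0 coordinatewise."""
    M, N = G.shape
    count = 0
    for bits in range(2 ** N):
        sigma = np.array([1 if (bits >> k) & 1 else -1 for k in range(N)])
        if np.all(G @ sigma >= 0):
            count += 1
    return count

def satisfiable_fraction(alpha, N, trials=200, seed=0):
    """Monte Carlo estimate of P(Z(G) > 0) at M = floor(alpha * N),
    with G an M x N matrix of i.i.d. standard Gaussians."""
    rng = np.random.default_rng(seed)
    M = int(alpha * N)
    hits = sum(count_solutions(rng.standard_normal((M, N))) > 0
               for _ in range(trials))
    return hits / trials
```

For example, `satisfiable_fraction(0.5, 10)` is typically close to \(1\) while `satisfiable_fraction(2.0, 10)` is close to \(0\), consistent with a transition at some intermediate \(\alpha\); the enumeration cost grows as \(2^N\), so this is only feasible for very small \(N\).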
Known upper bounds
| Bound | Reference | Comments |
|---|---|---|
| $<1$ | [KR1998] | Kim–Roche show that $Z(G) = 0$ with high probability once $\alpha$ is sufficiently close to $1$, i.e. the capacity is bounded away from $1$. |
| $0.9963$ | [KR1998] | Explicit combinatorial upper bound. |
| $<1$ | [T1999] | Independent non-explicit upper bound of the form $1-\varepsilon$ for some $\varepsilon>0$. |
| $0.847$ | [AT2024] | Complete proof of the upper bound outlined earlier by Krauth–Mézard. |
| $\alpha_{\star} \approx 0.833$ | [H2024] | Conditional on an explicit numerical maximization hypothesis for a two-variable function $\mathscr{S}_{\star}(\lambda_1,\lambda_2)$. |
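For orientation, the elementary first-moment (annealed) computation places the trivial upper bound at $\alpha = 1$: for any fixed $\sigma$, the $M$ coordinates of $G\sigma$ are i.i.d. centered Gaussians, each nonnegative with probability $\tfrac12$, so

```latex
\mathbb{E}\, Z(G)
  \;=\; \sum_{\sigma \in \{-1,1\}^N} \mathbb{P}(G\sigma \geq 0)
  \;=\; 2^N \cdot 2^{-M}
  \;=\; 2^{(1-\alpha)N + O(1)},
```

and Markov's inequality gives $\mathbb{P}(Z(G) > 0) \leq \mathbb{E}\, Z(G) \to 0$ whenever $\alpha > 1$. The bounds in the table above are therefore strictly stronger than the first-moment bound.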
Known lower bounds
| Bound | Reference | Comments |
|---|---|---|
| $0$ | Trivial | |
| $>0$ | [KR1998] | Kim–Roche show that $Z(G) > 0$ with high probability for all sufficiently small $\alpha > 0$, i.e. the capacity is bounded away from zero. |
| $\alpha_{\star} \approx 0.833$ | [DS2025], [NS2023] | Conditional on an explicit numerical maximization hypothesis for a one-variable function $\mathscr{S}_{\star}(\lambda)$; Ding–Sun prove the lower bound with positive probability, and the sharp-threshold theory of Nakajima–Sun upgrades it to hold with high probability for every $\alpha < \alpha_{\star}$. |
Additional comments
- The physics prediction of Krauth–Mézard [KM1989] is that $C_{80} = \alpha_{\star} \approx 0.833$.
- The literature uses both the names binary perceptron and Ising perceptron for this model.
- Huang [H2024], together with Ding–Sun [DS2025], gives a conditional proof of the Krauth–Mézard prediction.
- The explicit upper bound $0.9963$ comes from Kim–Roche [KR1998]; Talagrand [T1999] obtained an independent non-explicit upper bound.
- There are several nearby variants, including the symmetric Ising perceptron, the spherical perceptron, and nonzero-margin perceptrons.
References
- [KM1989] Krauth, Werner; Mézard, Marc. “Storage capacity of memory networks with binary couplings.” Journal de Physique 50, no. 20 (1989): 3057–3066. DOI: 10.1051/jphys:0198900500200305700
- [KR1998] Kim, Jeong Han; Roche, James R. “Covering Cubes by Random Half Cubes, with Applications to Binary Neural Networks.” Journal of Computer and System Sciences 56, no. 2 (1998): 223–252. DOI: 10.1006/jcss.1997.1560
- [T1999] Talagrand, Michel. “Intersecting random half cubes.” Random Structures & Algorithms 15, no. 3-4 (1999): 436–449. DOI: 10.1002/(SICI)1098-2418(199910/12)15:3/4<436::AID-RSA11>3.0.CO;2-5
- [X2021] Xu, Changji. “Sharp threshold for the Ising perceptron model.” The Annals of Probability 49, no. 5 (2021): 2399–2415. DOI: 10.1214/21-AOP1511
- [NS2023] Nakajima, Shuta; Sun, Nike. “Sharp threshold sequence and universality for Ising perceptron models.” In Proceedings of the 2023 Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), 638–674 (2023). DOI: 10.1137/1.9781611977554.ch28
- [AT2024] Altschuler, Dylan J.; Tikhomirov, Konstantin. “A note on the capacity of the binary perceptron.” arXiv (2024), arXiv:2401.15092
- [H2024] Huang, Brice. “Capacity Threshold for the Ising Perceptron.” In 2024 IEEE 65th Annual Symposium on Foundations of Computer Science (FOCS), 1126–1136 (2024). DOI: 10.1109/FOCS61266.2024.00074; preprint at arXiv:2404.18902
- [DS2025] Ding, Jian; Sun, Nike. “Capacity lower bound for the Ising perceptron.” Probability Theory and Related Fields 193, no. 3-4 (2025): 627–715. DOI: 10.1007/s00440-025-01364-x
Contribution notes
Prepared with assistance from GPT5.4 Pro for a more in-depth literature search.