Dual matrix multiplication exponent
Description of constant
In algebraic complexity theory, for each real $k \ge 0$, let $\omega(k)$ denote the exponent for multiplying an $n \times n^k$ matrix by an $n^k \times n$ matrix. We define \(\alpha := \sup\{k \ge 0 : \omega(k) = 2\}.\) Thus $\alpha$ is the largest aspect-ratio exponent for which rectangular matrix multiplication can still be performed in essentially quadratic arithmetic time. [LGU2018-def-omega] [LGU2018-def-alpha]
We define \(C_{15b} := \alpha.\)
The current best established range is \(0.321334 < C_{15b} \le 1.\) [WXXZ2024-alpha-0321334] [CLLZ2025-ub-1]
Known upper bounds
| Bound | Reference | Comments |
|---|---|---|
| $1$ | [CLLZ2025] | Current best general upper bound. [CLLZ2025-ub-1] |
Known lower bounds
| Bound | Reference | Comments |
|---|---|---|
| $0$ | — | Trivial: $\omega(0)=2$, since multiplying an $n\times 1$ matrix by a $1\times n$ matrix is an outer product, computable with $n^2$ scalar multiplications. |
| $0.172$ | [LGU2018] | Coppersmith’s 1982 theorem yields $\omega(0.172)=2$. [LGU2018-alpha-0172] |
| $0.29462$ | [LG2012] | Previous record due to Coppersmith (1997), as recorded in Le Gall’s 2012 paper. [LG2012-alpha-029462-030298] |
| $0.30298$ | [LG2012] | Le Gall’s 2012 improvement. [LG2012-alpha-029462-030298] |
| $0.31389$ | [LGU2018] | Improvement from the fourth-power Coppersmith–Winograd analysis. [LGU2018-alpha-031389] |
| $0.321334$ | [WXXZ2024] | Current best published lower bound. [WXXZ2024-alpha-0321334] |
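For intuition on the trivial bound in the table above, here is a minimal Python sketch (illustrative only, not drawn from the references) of the outer-product argument: multiplying an $n\times 1$ matrix by a $1\times n$ matrix takes exactly $n^2$ scalar multiplications, so $\omega(0)=2$.

```python
def outer_product(a, b):
    """Multiply the n x 1 column a by the 1 x n row b.

    Returns the n x n product C with C[i][j] = a[i] * b[j],
    together with the number of scalar multiplications used
    (exactly n * n, hence omega(0) = 2).
    """
    n = len(a)
    mults = 0
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            C[i][j] = a[i] * b[j]
            mults += 1
    return C, mults

C, mults = outer_product([1, 2, 3], [4, 5, 6])
print(mults)    # 9 = 3^2 scalar multiplications
print(C[2][1])  # 15 = 3 * 5
```

This is the $k=0$ case of the definition: the product has $n^2$ output entries and each is a single product of scalars, so quadratic arithmetic cost is both necessary and sufficient.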
Additional comments and links
- Algorithmic relevance. Better bounds for rectangular matrix multiplication improve algorithms for all-pairs shortest paths and sparse matrix multiplication. [LG2012-applications]
- Wikipedia page on computational complexity of matrix multiplication
References
- [CLLZ2025] Christandl, Matthias; Le Gall, François; Lysikov, Vladimir; Zuiddam, Jeroen. Barriers for rectangular matrix multiplication. Computational Complexity 34 (2025), article 4. DOI: 10.1007/s00037-025-00264-9. Preprint: arXiv:2003.03019.
- [LG2012] Le Gall, François. Faster Algorithms for Rectangular Matrix Multiplication. In: 2012 IEEE 53rd Annual Symposium on Foundations of Computer Science (FOCS 2012), 514–523. DOI: 10.1109/FOCS.2012.80. Preprint: arXiv:1204.1111.
- [LG2012-alpha-029462-030298] loc: arXiv PDF p.1, Abstract quote: “In this paper we show that $\alpha>0.30298$, which improves the previous record $\alpha>0.29462$ by Coppersmith (Journal of Complexity, 1997).”
- [LG2012-applications] loc: arXiv PDF p.1, Abstract quote: “For example, we directly obtain a $O(n^{2.5302})$-time algorithm for the all-pairs shortest paths problem over directed graphs with small integer weights, improving over the $O(n^{2.575})$-time algorithm by Zwick (JACM 2002), and also improve the time complexity of sparse square matrix multiplication.”
- [LGU2018] Le Gall, François; Urrutia, Florent. Improved Rectangular Matrix Multiplication using Powers of the Coppersmith-Winograd Tensor. In: Proceedings of the Twenty-Ninth Annual ACM-SIAM Symposium on Discrete Algorithms (SODA 2018), 1029–1046. DOI: 10.1137/1.9781611975031.67. Preprint: arXiv:1708.05622.
- [LGU2018-def-omega] loc: arXiv PDF p.3, §1.1 “Background” quote: “In analogy to the square case, the exponent of rectangular matrix multiplication, denoted $\omega(k)$, is defined as the minimum value such that this product can be computed using $O(n^{\omega(k)+\epsilon})$ arithmetic operations for any constant $\epsilon>0$.”
- [LGU2018-def-alpha] loc: arXiv PDF p.3, §1.1 “Background” quote: “$\alpha = \sup\{k \mid \omega(k)=2\}$.”
- [LGU2018-alpha-0172] loc: arXiv PDF p.3, §1.1 “Background” quote: “Coppersmith [8] showed in 1982 that $\omega(0.172)=2$.”
- [LGU2018-alpha-031389] loc: arXiv PDF p.4, §1.2 “Our results” quote: “$\alpha \ge 0.31389$”
- [WXXZ2024] Vassilevska Williams, Virginia; Xu, Yinzhan; Xu, Zixuan; Zhou, Renfei. New Bounds for Matrix Multiplication: from Alpha to Omega. In: Proceedings of the 2024 Annual ACM-SIAM Symposium on Discrete Algorithms (SODA 2024), 3792–3835. DOI: 10.1137/1.9781611977912.134. Preprint: arXiv:2307.07970.
Contribution notes
Prepared with assistance from ChatGPT 5.2 Pro.