12 The \(m\)-torsion case
12.1 Data processing inequality
Let \(X\) be a random variable. Then for any function \(f\) on the range of \(X\), one has \(\mathbb {H}[f(X)] \leq \mathbb {H}[X]\).
We have
thanks to Lemma 2.2 and Lemma 2.13, giving the claim.
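As a quick sanity check of Lemma 12.1: if \(X\) is uniform on \(\{0,1,2,3\}\) and \(f(x) := x \bmod 2\), then \(f(X)\) is uniform on \(\{0,1\}\), and indeed
\[
\mathbb{H}[f(X)] = \log 2 \leq \log 4 = \mathbb{H}[X],
\]
with equality precisely when \(f\) is injective on the support of \(X\).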
Let \(X,Y\) be random variables. Then for any function \(f\) on the range of \(X\), we have \(\mathbb {I}[f(X) : Y] \leq \mathbb {I}[X:Y]\).
By Lemma 2.16 it suffices to show that \(\mathbb {H}[Y|X] \leq \mathbb {H}[Y|f(X)]\). But this follows from Corollary 2.20 (and Lemma 2.2).
Let \(X,Y\) be random variables. For any functions \(f, g\) on the ranges of \(X, Y\) respectively, we have \(\mathbb {I}[f(X) : g(Y )] \leq \mathbb {I}[X : Y]\).
From Lemma 12.2, Lemma 2.9 we have \(\mathbb {I}[f(X) : Y] \leq \mathbb {I}[X:Y]\) and \(\mathbb {I}[f(X): g(Y)] \leq \mathbb {I}[f(X):Y]\), giving the claim.
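Unwinding the two steps (a sketch, assuming Lemma 2.9 is the symmetry \(\mathbb{I}[X:Y] = \mathbb{I}[Y:X]\)):
\[
\mathbb{I}[f(X) : g(Y)] = \mathbb{I}[g(Y) : f(X)] \leq \mathbb{I}[Y : f(X)] = \mathbb{I}[f(X) : Y] \leq \mathbb{I}[X : Y],
\]
where both inequalities are instances of Lemma 12.2.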
Let \(X,Y,Z\) be random variables. For any functions \(f, g\) on the ranges of \(X, Y\) respectively, we have \(\mathbb {I}[f(X) : g(Y)|Z] \leq \mathbb {I}[X :Y |Z]\).
Apply Lemma 12.3 to \(X,Y\) conditioned on the event \(Z=z\), multiply by \({\bf P}[Z=z]\), and sum over \(z\) using Definition 2.25.
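Spelled out (a sketch, assuming Definition 2.25 defines conditional mutual information as the \({\bf P}[Z=z]\)-weighted average of the conditioned quantities):
\[
\mathbb{I}[f(X) : g(Y) | Z] = \sum_z {\bf P}[Z=z]\, \mathbb{I}[f(X) : g(Y) | Z=z] \leq \sum_z {\bf P}[Z=z]\, \mathbb{I}[X : Y | Z=z] = \mathbb{I}[X : Y | Z].
\]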
12.2 More Ruzsa distance estimates
Let \(G\) be an additive group.
If \(X,Y\) are \(G\)-valued, then
Without loss of generality (using Lemma 3.10 and Lemma 3.7) we may take \(X,Y\) to be independent. Let \((X_1,Y_1)\), \((X_2,Y_2)\) be copies of \((X,Y)\) that are conditionally independent over \(X_1-Y_1=X_2-Y_2\) (these exist thanks to Lemma 3.22). By Lemma 3.7, we can also find another copy \((X_3,Y_3)\) of \((X,Y)\) that is independent of \(X_1,Y_1,X_2,Y_2\). From Corollary 2.21, one has
From Lemma 3.11, Lemma 3.1, Lemma 3.10 we have
Since \(X_3+Y_3\) is a function of \(X_3,Y_3\), we see from Lemma 2.2 and Corollary 2.24 that
Because \(X_1-Y_1=X_2-Y_2\), we have
and thus by Lemma 2.2
and hence by Corollary 2.18
Since \(X_3,Y_2\) are independent, we see from Lemma 3.11, Lemma 3.10 that
Similarly
We conclude that
Finally, from Lemma 12.1 we have
From Corollary 2.24 followed by Corollary 2.30, we have
and thus by Lemma 3.11, Lemma 3.10, Lemma 2.2, Corollary 2.24
Applying all of these estimates, the claim now follows from linear arithmetic.
If \(n \geq 0\) and \(X, Y_1, \dots , Y_n\) are jointly independent \(G\)-valued random variables, then
\[
\mathbb{H}\Big[X + \sum_{i=1}^n Y_i\Big] - \mathbb{H}[X] \leq \sum_{i=1}^n \Big(\mathbb{H}[X+Y_i] - \mathbb{H}[X]\Big).
\]
This is trivial for \(n=0,1\), while the \(n=2\) case is Lemma 3.21. Now suppose inductively that \(n {\gt} 2\), and that the claim has already been proven for \(n-1\). By a further application of Lemma 3.21 one has
By induction hypothesis one has
Summing the two inequalities, we obtain the claim.
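Explicitly, assuming Lemma 3.21 is the Kaimanovich–Vershik inequality \(\mathbb{H}[X+Y+Z] \leq \mathbb{H}[X+Y] + \mathbb{H}[Y+Z] - \mathbb{H}[Y]\) for independent \(X,Y,Z\), the inductive step takes the middle variable to be our \(X\) and the outer variables to be \(\sum_{i=1}^{n-1} Y_i\) and \(Y_n\), giving
\[
\mathbb{H}\Big[X + \sum_{i=1}^n Y_i\Big] \leq \mathbb{H}\Big[X + \sum_{i=1}^{n-1} Y_i\Big] + \mathbb{H}[X+Y_n] - \mathbb{H}[X];
\]
subtracting \(\mathbb{H}[X]\) and applying the induction hypothesis to the first term on the right yields the two inequalities being summed.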
If \(n \geq 1\) and \(X, Y_1, \dots , Y_n\) are jointly independent \(G\)-valued random variables, then
Applying Lemma 12.6 with all the \(Y_i\) replaced by \(-Y_i\), and using Lemma 3.1 and Lemma 3.11, we obtain after some rearranging
From Corollary 3.5 we have
for all \(i\); subtracting \(\mathbb {H}[X]\) and averaging, we conclude that
and thus
From Lemma 3.13 we have
Since \(0 \leq \frac{n-1}{2n} \leq \frac{1}{2}\), the claim follows.
If \(n \geq 1\) and \(X, Y_1, \dots , Y_n\) are jointly independent \(G\)-valued random variables, then
From Lemma 3.21 one has
The claim then follows from Lemma 3.11 and some elementary algebra.
Let \((X_i)_{1 \leq i \leq m}\) and \((Y_j)_{1 \leq j \leq l}\) be tuples of jointly independent random variables (so the \(X\)’s and \(Y\)’s are also independent of each other), and let \(f: \{ 1,\dots ,l\} \to \{ 1,\dots ,m\} \) be a function. Then
Write \(W := \sum _{i=1}^m X_i\). From Corollary 3.5 we have
while from Lemma 12.6 one has
From Lemma 3.21 one has
The claim now follows from Lemma 3.1 and some elementary algebra.
Let \(X,Y,X'\) be independent \(G\)-valued random variables, with \(X'\) a copy of \(X\), and let \(a\) be an integer. Then
and
From Lemma 3.17 we have
which gives the first inequality. Similarly from Lemma 3.17 we have
which (when combined with Lemma 3.1) gives the second inequality.
Let \(X,Y\) be independent \(G\)-valued random variables, and let \(a\) be an integer. Then
From Lemma 3.21 one has
which by Lemma 3.11 gives
and hence by Lemma 12.5
From Lemma 12.10 we then have
and the claim now follows by an induction on \(|a|\).
We remark that in the paper [GGMT2024] the variant estimate
is also proven by a similar method. This variant is superior for \(|a| \geq 9\) (or \(|a|=7\)); but we will not need this estimate here.
12.3 Multidistance
We continue to let \(G\) be an abelian group.
Let \(m\) be a positive integer, and let \(X_{[m]} = (X_i)_{1 \leq i \leq m}\) be an \(m\)-tuple of \(G\)-valued random variables \(X_i\). Then we define
\[
D[X_{[m]}] := \mathbb{H}\Big[\sum_{i=1}^m \tilde X_i\Big] - \frac{1}{m} \sum_{i=1}^m \mathbb{H}[\tilde X_i],
\]
where the \(\tilde X_i\) are independent copies of the \(X_i\).
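For example, if each \(X_i\) is uniform on a finite subgroup \(H \leq G\), then the independent copies \(\tilde X_i\) are uniform on \(H\) and so is their sum, whence
\[
D[X_{[m]}] = \log |H| - \frac{1}{m} \sum_{i=1}^m \log |H| = 0,
\]
matching the characterization of vanishing multidistance in Proposition 12.21 below.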
If \(X_{[m]} = (X_i)_{1 \leq i \leq m}\) and \(Y_{[m]} = (Y_i)_{1 \leq i \leq m}\) are such that \(X_i\) and \(Y_i\) have the same distribution for each \(i\), then \(D[X_{[m]}] = D[Y_{[m]}]\).
Clear from Lemma 3.6.
If \(X_{[m]} = (X_i)_{1 \leq i \leq m}\) are jointly independent, then \(D[X_{[m]}] = \mathbb {H}[\sum _{i=1}^m X_i] - \frac{1}{m} \sum _{i=1}^m \mathbb {H}[X_i]\).
Clear from definition.
For any tuple \(X_{[m]}\) of \(G\)-valued random variables, we have \(D[X_{[m]}] \geq 0\).
From Corollary 3.5 one has
for each \(1 \leq i \leq m\). Averaging over \(i\), we obtain the claim.
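In detail (assuming Corollary 3.5 asserts that the entropy of a sum of independent variables is at least the entropy of each summand): for the independent copies \(\tilde X_i\) one has \(\mathbb{H}[\sum_{j=1}^m \tilde X_j] \geq \mathbb{H}[\tilde X_i]\) for each \(i\), and averaging these \(m\) bounds gives
\[
\mathbb{H}\Big[\sum_{j=1}^m \tilde X_j\Big] \geq \frac{1}{m} \sum_{i=1}^m \mathbb{H}[\tilde X_i],
\]
which is precisely \(D[X_{[m]}] \geq 0\).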
If \(\phi : \{ 1,\dots ,m\} \to \{ 1,\dots ,m\} \) is a bijection, then \(D[X_{[m]}] = D[(X_{\phi (j)})_{1 \leq j \leq m}]\).
Trivial.
Let \(m \ge 2\), and let \(X_{[m]}\) be a tuple of \(G\)-valued random variables. Then
By Lemma 3.10, Lemma 12.13 we may take the \(X_i\) to be jointly independent. From Corollary 3.5, we see that for any distinct \(1 \leq j,k \leq m\), we have
and hence by Lemma 3.11
Summing this over all pairs \((j,k)\), \(j \neq k\) and using Lemma 12.14, we obtain the claim.
Let \(m \ge 2\), and let \(X_{[m]}\) be a tuple of \(G\)-valued random variables. Then
From Lemma 3.18 we have \(d[X_j;X_j] \leq 2 d[X_j; -X_k]\), and applying this to every summand in Lemma 12.17, we obtain the claim.
Let \(m \ge 2\), and let \(X_{[m]}\) be a tuple of \(G\)-valued random variables. If the \(X_i\) all have the same distribution, then \(D[X_{[m]}] \leq m d[X_i;X_i]\) for any \(1 \leq i \leq m\).
By Lemma 3.10, Lemma 12.13 we may take the \(X_i\) to be jointly independent. Let \(X_0\) be a further independent copy of the \(X_i\). From Lemma 12.6, we have
and hence by Lemma 3.1 and Lemma 3.11
On the other hand, by Corollary 3.5 we have
and the claim follows.
Let \(m \ge 2\), and let \(X_{[m]}\) be a tuple of independent \(G\)-valued random variables. Let \(W := \sum _{i=1}^m X_i\). Then
Take \((X'_i)_{1 \leq i \leq m}\) to be further independent copies of \((X_i)_{1 \leq i \leq m}\) (which exist by Lemma 3.7), and write \(W' := \sum _{i=1}^m X'_i\). Fix any distinct \(a,b \in \{ 1,\dots ,m\} \).
From Lemma 3.21 one has
and also
Combining this with (1) and then applying Corollary 3.5 we have
Averaging this over all choices of \((a,b)\) gives \(\mathbb {H}[W] + 2 D[X_{[m]}]\), and the claim follows from Lemma 3.11.
If \(D[X_{[m]}]=0\), then for each \(1 \leq i \leq m\) there is a finite subgroup \(H_i \leq G\) such that \(d[X_i; U_{H_i}] = 0\).
From Lemma 12.17 and Lemma 3.15 we have \(d[X_j; -X_k]=0\) for all \(1 \leq j,k \leq m\). The claim now follows from Corollary 4.6.
With more effort one can show that \(H_i\) is independent of \(i\), but we will not need to do so here.
12.4 The tau functional
Fix \(m \geq 2\), and a reference \(G\)-valued random variable \(X^0\).
We set \(\eta := \frac{1}{32m^3}\).
If \((X_i)_{1 \leq i \leq m}\) is a tuple, we define its \(\tau \)-functional
\[
\tau[ (X_i)_{1 \leq i \leq m} ] := D[ (X_i)_{1 \leq i \leq m} ] + \eta \sum_{i=1}^m d[X_i; X^0].
\]
A \(\tau \)-minimizer is a tuple \((X_i)_{1 \leq i \leq m}\) that minimizes the \(\tau \)-functional among all tuples of \(G\)-valued random variables.
If \(G\) is finite, then a \(\tau \)-minimizer exists.
This is similar to the proof of Proposition 6.5.
If \((X_i)_{1 \leq i \leq m}\) is a \(\tau \)-minimizer, then \(\sum _{i=1}^m d[X_i; X^0] \leq \frac{2m}{\eta } d[X^0; X^0]\).
By Definition 12.24 we have
and hence by Definition 12.23 and Lemma 12.15
The claim now follows from Lemma 12.19.
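Unpacking the proof (a sketch; here \((X^0)_{[m]}\) denotes the tuple with every entry a copy of \(X^0\)):
\[
\eta \sum_{i=1}^m d[X_i;X^0] \leq \tau[X_{[m]}] \leq \tau[(X^0)_{[m]}] \leq m\, d[X^0;X^0] + \eta m\, d[X^0;X^0] \leq 2m\, d[X^0;X^0],
\]
using \(D[X_{[m]}] \geq 0\) (Lemma 12.15) in the first step, the minimizing property in the second, Lemma 12.19 in the third, and \(\eta \leq 1\) in the last; dividing by \(\eta\) gives the claim.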
If \((X_i)_{1 \leq i \leq m}\) is a \(\tau \)-minimizer, and \(k := D[(X_i)_{1 \leq i \leq m}]\), then for any other tuple \((X'_i)_{1 \leq i \leq m}\), one has
By Definition 12.24 we have
and hence by Definition 12.23
On the other hand, by Lemma 3.18 we have
The claim follows.
If \(X_{[m]} = (X_i)_{1 \leq i \leq m}\) and \(Y_{[m]} = (Y_i)_{1 \leq i \leq m}\) are tuples of random variables, with the \(X_i\) being \(G\)-valued (but the \(Y_i\) need not be), then we define
\[
D[ X_{[m]} | Y_{[m]} ] := \sum_{y_1,\dots,y_m} \Big(\prod_{i=1}^m p_{Y_i}(y_i)\Big)\, D[ (X_i \,|\, Y_i = y_i)_{1 \leq i \leq m} ],
\]
where each \(y_i\) ranges over the support of \(p_{Y_i}\) for \(1 \leq i \leq m\).
If the pairs \((X_i,Y_i)\), \(1 \leq i \leq m\), are jointly independent, then
\[
D[ X_{[m]} | Y_{[m]} ] = \mathbb{H}\Big[\sum_{i=1}^m X_i \,\Big|\, Y_{[m]}\Big] - \frac{1}{m} \sum_{i=1}^m \mathbb{H}[X_i | Y_i].
\]
This is routine from Definition 2.11 and Definitions 12.12 and 12.28.
If \(X_{[m]} = (X_i)_{1 \leq i \leq m}\) and \(Y_{[m]} = (Y_i)_{1 \leq i \leq m}\) are tuples of random variables, then \(D[ X_{[m]} | Y_{[m]} ] \geq 0\).
Clear from Lemma 12.15 and Definition 12.28, except that some care may need to be taken to deal with those \(y_i\) where \(p_{Y_i}\) vanishes.
If \((X_i)_{1 \leq i \leq m}\) is a \(\tau \)-minimizer, and \(k := D[(X_i)_{1 \leq i \leq m}]\), then for any other tuples \((X'_i)_{1 \leq i \leq m}\) and \((Y_i)_{1 \leq i \leq m}\) with the \(X'_i\) \(G\)-valued, one has
Immediate from Lemma 12.27, Lemma 12.29, and Definition 3.19.
With the notation of the previous lemma, we have
for any permutation \(\sigma : \{ 1,\dots ,m\} \rightarrow \{ 1,\dots ,m\} \).
This follows from Lemma 12.31 and Lemma 12.16.
12.5 The multidistance chain rule
Let \(\pi \colon G \to H\) be a homomorphism of abelian groups and let \(X_{[m]}\) be a tuple of jointly independent \(G\)-valued random variables. Then \(D[X_{[m]}]\) is equal to
\[
D[\pi(X_{[m]})] + D[X_{[m]} \,|\, \pi(X_{[m]})] + \mathbb{I}\Big[ \sum_{i=1}^m X_i : \pi(X_{[m]}) \ \Big|\ \pi\Big(\sum_{i=1}^m X_i\Big) \Big],
\]
where \(\pi (X_{[m]}) := (\pi (X_i))_{1 \leq i \leq m}\).
For notational brevity during this proof, write \(S := \sum _{i=1}^m X_i\).
From Lemma 2.26 and Lemma 2.2, noting that \(\pi (S)\) is determined both by \(S\) and by \(\pi (X_{[m]})\), we have
and by Lemma 2.13 the right-hand side is equal to
Therefore,
From a further application of Lemma 2.13 and Lemma 2.2 we have
for all \(1 \leq i \leq m\). Averaging (7) in \(i\) and subtracting this from (6), we obtain the claim from Definition 12.12.
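As a consistency check of the identity in Lemma 12.33: if \(\pi\) is injective, then \(\pi(X_{[m]})\) determines \(X_{[m]}\) and \(\pi(\sum_i X_i)\) determines \(\sum_i X_i\), so the conditional multidistance and the mutual information term both vanish, and the identity collapses to
\[
D[X_{[m]}] = D[\pi(X_{[m]})],
\]
as expected, since an injective homomorphism preserves all the entropies involved. At the other extreme \(\pi = 0\), the first and third terms vanish and the conditioning in the second term is trivial, so again both sides agree.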
We will need to iterate the multidistance chain rule, so it is convenient to observe a conditional version of this rule, as follows.
Let \(\pi \colon G \to H\) be a homomorphism of abelian groups, and let \(X_{[m]}\) be a tuple of \(G\)-valued random variables. Let \(Y_{[m]}\) be another tuple of random variables (not necessarily \(G\)-valued). Suppose that the pairs \((X_i, Y_i)\) are jointly independent of one another (but \(X_i\) need not be independent of \(Y_i\)). Then
For each \(y_i\) in the support of \(p_{Y_i}\), apply Lemma 12.33 with \(X_i\) replaced by the conditioned random variable \((X_i|Y_i=y_i)\), and the claim (8) follows by averaging (5) in the \(y_i\) using the weights \(p_{Y_i}\).
We can iterate the above lemma as follows.
Let \(m\) be a positive integer. Suppose one has a sequence
of homomorphisms between abelian groups \(G_0,\dots ,G_m\), and for each \(d=0,\dots ,m\), let \(\pi _d : G_m \to G_d\) be the homomorphism from \(G_m\) to \(G_d\) arising from this sequence by composition (so for instance \(\pi _m\) is the identity homomorphism and \(\pi _0\) is the zero homomorphism). Let \(X_{[m]} = (X_i)_{1 \leq i \leq m}\) be a jointly independent tuple of \(G_m\)-valued random variables. Then
In particular, by Lemma 2.27,
From Lemma 12.34 (taking \(Y_{[m]} = \pi _{d-1}(X_{[m]})\) and \(\pi = \pi _d\) there, and noting that \(\pi _d(X_{[m]})\) determines \(Y_{[m]}\)) we have
for \(d=1,\dots ,m\). The claim follows by telescoping, noting that \(D[X_{[m]} | \pi _0(X_{[m]})] = D[X_{[m]}]\) and that \(\pi _m(X_{[m]})=X_{[m]}\) (and also \(\pi _m( \sum _i X_i ) = \sum _i X_i\)).
In our application we will need the following special case of the above lemma.
Let \(G\) be an abelian group and let \(m \geq 2\). Suppose that \(X_{i,j}\), \(1 \leq i, j \leq m\), are independent \(G\)-valued random variables. Then
where all the multidistances here involve the indexing set \(\{ 1,\dots , m\} \).
In Lemma 12.35 we take \(G_d := G^d\) with the maps \(\pi _d \colon G^m \to G^d\) for \(d=1,\dots ,m\) defined by
with \(\pi _0=0\). Since \(\pi _{d-1}(x)\) can be obtained from \(\pi _{d}(x)\) by applying a homomorphism, we obtain a sequence of the form (9).
Now we apply Lemma 12.35 with \(X_i := (X_{i,j})_{j = 1}^m\). Using joint independence and Corollary 2.24, we find that
On the other hand, for \(1 \leq j \leq m-1\), we see that once \(\pi _{j}(X_i)\) is fixed, \(\pi _{j+1}(X_i)\) is determined by \(X_{i, j}\) and vice versa, so
Since the \(X_{i,j}\) are jointly independent, we may further simplify:
Putting all this into the conclusion of Lemma 12.35, we obtain
and the claim follows by rearranging.
12.6 Bounding the mutual information
As before, \(G\) is an abelian group, and \(m\geq 2\). We let \(X_{[m]} = (X_i)_{i =1}^m\) be a \(\tau \)-minimizer.
Suppose that \(X_{i,j}\), \(1 \leq i,j \leq m\), are jointly independent \(G\)-valued random variables, such that for each \(j = 1,\dots ,m\), the random variables \((X_{i,j})_{i = 1}^m\) coincide in distribution with some permutation of \(X_{[m]}\). Write
Then
For each \(j \in \{ 1,\dots ,m\} \) we call the tuple \((X_{i,j})_{i = 1}^m\) a column and for each \(i \in \{ 1,\dots , m\} \) we call the tuple \((X_{i,j})_{j = 1}^m\) a row. Hence, by hypothesis, each column is a permutation of \(X_{[m]} = (X_i)_{i=1}^m\).
From Corollary 12.36 we have
where
and
We first consider the \(A_j\), for fixed \(j \in \{ 1,\dots , m-1\} \). By Lemma 12.16 and our hypothesis on columns, we have
Let \(\sigma = \sigma _j \colon \{ 1,\dots ,m\} \to \{ 1,\dots ,m\} \) be a permutation such that \(X_{i,j} = X_{\sigma (i)}\), and write \(X'_i := X_{i,j}\) and \(Y_i := X_{i,j} + \cdots + X_{i,m}\). By Corollary 12.32, we have
We similarly consider \(B\). By Lemma 12.16 applied to the \(m\)-th column,
For \(1 \leq i \leq m\), denote the sum of row \(i\) by
if we apply Corollary 12.32 again, now with \(X_{\sigma (i)} = X_{i,m}\), \(X'_i := V_i\), and with the variable \(Y_i\) being trivial, we obtain
It remains to bound the distances appearing in (16) and (17) further using Ruzsa calculus. For \(1 \leq j \leq m-1\) and \(1 \leq i \leq m\), by Lemma 3.25 we have
For each \(i\), summing over \(j = 1,\dots , m-1\) gives
On the other hand, by Lemma 12.8 (since \(X_{i,m}\) appears in the sum \(V_i\)) we have
Combining (15), (16) and (17) with (18) and (19) (the latter two summed over \(i\)), we get
By Lemma 12.9 (with \(f\) taking each \(j\) to the index \(j'\) such that \(X_{i,j}\) is a copy of \(X_{j'}\)) we obtain the bound
Finally, summing over \(i\) and using \(D[X_{[m]}] = k\) gives
where in the second step we used the permutation hypothesis. Combining this with (20) gives the bound
The claim (14) is now immediate from Lemma 12.18.
12.7 Endgame
Now let \(m \geq 2\), let \(G\) be an \(m\)-torsion abelian group, and let \((X_i)_{1 \leq i \leq m}\) be a \(\tau \)-minimizer.
By a slight abuse of notation, we identify \(\mathbb {Z}/m\mathbb {Z}\) with \(\{ 1,\dots ,m\} \) in the obvious way, and let \(Y_{i,j}\), for \(i,j \in \mathbb {Z}/m\mathbb {Z}\), be jointly independent random variables with each \(Y_{i,j}\) a copy of \(X_i\). Then also define:
and
The expression \(-i-j\) is computed in \(\mathbb {Z}/m\mathbb {Z}\). Note that, because we are assuming \(G\) is \(m\)-torsion, it is well-defined to multiply elements of \(G\) by elements of \(\mathbb {Z}/m\mathbb {Z}\). We will also define for \(i,j,r \in \mathbb {Z}/m\mathbb {Z}\) the variables
We have
Clear from definition.
We have
where
We analyze these variables by Proposition 12.37 in several different ways. In the first application, take \(X_{i,j}=Y_{i,j}\). Note that each column \((X_{i,j})_{i=1}^m\) is indeed a permutation of \(X_1,\dots ,X_m\); in fact, the trivial permutation. Note also that for each \(i \in \mathbb {Z}/m\mathbb {Z}\), the row sum is
and for each \(j \in \mathbb {Z}/m\mathbb {Z}\), the column sum is
Finally note that \(\sum _{i,j=1}^m X_{i,j} = W\). From Proposition 12.37 we then have
with \(t\) as in (23). Since \(Z_1\) is a function of \((P_i)_{i \in \mathbb {Z}/m\mathbb {Z}}\) by (21), and similarly \(Z_2\) is a function of \((Q_j)_{j \in \mathbb {Z}/m\mathbb {Z}}\), it follows immediately from Lemma 12.4 that
In the second application of Proposition 12.37, we instead consider \(X'_{i,j} = Y_{i-j,j}\). Again, for each fixed \(j\), the tuple \((X'_{i,j})_{i=1}^m\) is a permutation of \(X_1,\dots ,X_m\). This time the row sums for \(i \in \{ 1,\dots , m\} \) are
Similarly, the column sums for \(j \in \{ 1,\dots , m\} \) are
As before, \(\sum _{i,j=1}^m X'_{i,j} = W\). Hence, using (21) and Lemma 12.4 again, Proposition 12.37 tells us
In the third application of Proposition 12.37, take \(X''_{i,j} = Y_{i,j-i}\). The column and row sums are respectively
and
Hence, Proposition 12.37 and Lemma 12.4 give
which completes the proof.
We have \(\mathbb {H}[W] \leq (2m-1)k + \frac1m \sum _{i=1}^m \mathbb {H}[X_i]\).
Without loss of generality, we may take \(X_1,\dots ,X_m\) to be independent. Write \(S = \sum _{i=1}^m X_i\). Note that for each \(j \in \mathbb {Z}/m\mathbb {Z}\), the sum \(Q_j\) from (21) above has the same distribution as \(S\). By Lemma 12.6 we have
By Lemma 12.20, we have
and hence
From Definition 12.12 we have
and the claim follows.
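Equivalently, since Lemma 12.14 gives \(\mathbb{H}[S] = k + \frac{1}{m}\sum_{i=1}^m \mathbb{H}[X_i]\) for the independent tuple (with \(k = D[X_{[m]}]\) as above), the bound can be rewritten as
\[
\mathbb{H}[W] \leq \mathbb{H}[S] + (2m-2) k,
\]
so passing from the single sum \(S\) to the full double sum \(W\) costs at most \(2m-2\) multidistances.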
We have \(\mathbb {H}[Z_2] \leq (8m^2-16m+1) k + \frac{1}{m} \sum _{i=1}^m \mathbb {H}[X_i]\).
We observe
Applying Lemma 12.6 one has
Using Lemma 12.11 and (24) we get
Applying (25) gives the claim.
We have \(\mathbb {I}[W : Z_2] \leq 2 (m-1) k\).
From Lemma 2.16 we have \(\mathbb {I}[W : Z_2] = \mathbb {H}[W] - \mathbb {H}[W | Z_2]\), and since \(Z_2 = \sum _{j=1}^{m-1} j Q_j\) and \(W = \sum _{j=1}^m Q_j\),
Hence, by Lemma 12.41,
as claimed.
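Spelling the two steps out (a sketch): \(Z_2\) is a function of \(Q_1,\dots,Q_{m-1}\), while \(Q_m\) is independent of these and distributed as \(S\); hence \(\mathbb{H}[W | Z_2] \geq \mathbb{H}[W | Q_1,\dots,Q_{m-1}] = \mathbb{H}[Q_m] = \mathbb{H}[S]\), and therefore
\[
\mathbb{I}[W : Z_2] \leq \mathbb{H}[W] - \mathbb{H}[S] \leq (2m-2)k,
\]
where the last inequality combines Lemma 12.41 with Lemma 12.14.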
We have \(\sum _{i=1}^m d[X_i;Z_2|W] \leq 8(m^3-m^2) k\).
For each \(i \in \{ 1,\dots , m\} \), using Lemma 12.8 (noting the sum \(Z_2\) contains \(X_i\) as a summand) we have
and using Lemma 3.24 we have
Combining with (26) and Lemma 12.43 gives
Summing over \(i\) and applying Lemma 12.42 gives
Finally, applying Lemma 12.18 (and dropping some lower order terms) gives the claim.
Let \(G\) be an abelian group, let \((T_1,T_2,T_3)\) be a \(G^3\)-valued random variable such that \(T_1+T_2+T_3=0\) holds identically, and write
Let \(Y_1,\dots ,Y_n\) be some further \(G\)-valued random variables and let \(\alpha {\gt}0\) be a constant. Then there exists a random variable \(U\) such that
We apply Lemma 3.23 with \(X=T_1\) and \(Y=T_2\). Since \(T_1+T_2=-T_3\), we find that
where the last line follows from Lemma 2.2 by observing
since any two of \(T_1,T_2,T_3\) determine the third.
By (28) and the triangle inequality,
and by Lemma 3.25, for each \(Y_i\),
Hence,
and the result follows by setting \(U = (T_2 \,|\, T_3 = z)\) for some \(z\) such that the quantity in parentheses on the left-hand side is at most the weighted average value.
We have \(k = 0\).
For each value \(W=w\), apply Lemma 12.45 (and Lemma 12.39) to
with \(Y_i=X_i\) and \(\alpha =\eta /m\). Write
for this choice, and note that
by Proposition 12.40. Write \(U_w\) for the random variable guaranteed to exist by Lemma 12.45, so that 27 gives
Let \((U_w)_{[m]}\) denote the tuple consisting of the variable \(U_w\) repeated \(m\) times. By Lemma 12.19
On the other hand, from Lemma 12.27 one has
Combining (30), (31) and (32) and averaging over \(w\) (with weight \(p_W(w)\)), and recalling the value \(\alpha =\eta /m\), gives
since the terms \(d[X_i; U_w]\) cancel by our choice of \(\alpha \). Substituting in Lemma 12.44 and (29), and using the fact that \(2 + \frac{\eta }{2} {\lt} 3\), we have
From Definition 12.22 we have
and hence \(k \leq 0\). The claim now follows from Lemma 12.15.
12.8 Wrapping up
Suppose that \(G\) is a finite abelian group of torsion \(m\). Suppose that \(X\) is a \(G\)-valued random variable. Then there exists a subgroup \(H \leq G\) such that
Set \(X^0 := X\). By Proposition 12.25, there exists a \(\tau \)-minimizer \(X_{[m]} = (X_i)_{1 \leq i \leq m}\). By Proposition 12.46, we have \(D[X_{[m]}]=0\). By Proposition 12.26 and the pigeonhole principle, there exists \(1 \leq i \leq m\) such that \(d[X_i; X] \leq \frac{2}{\eta } d[X;X]\). By Proposition 12.21, we have \(d[X_i;U_H]=0\) for some subgroup \(H \leq G\), hence by Lemma 3.18 we have \(d[U_H; X] \leq \frac{2}{\eta } d[X;X]\). The claim then follows from Definition 12.22.
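To make the constant explicit: with \(\eta = \frac{1}{32m^3}\) from Definition 12.22, the conclusion reads
\[
d[U_H; X] \leq \frac{2}{\eta}\, d[X;X] = 64 m^3\, d[X;X],
\]
which is the source of the constant \(64m^3+2\) appearing in the exponent in Lemma 12.48 below.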
Suppose that \(G\) is a finite abelian group of torsion \(m\). If \(A \subset G\) is non-empty and \(|A+A| \leq K|A|\), then \(A\) can be covered by at most \(K ^{(64m^3+2)/2}|A|^{1/2}/|H|^{1/2}\) translates of a subspace \(H\) of \(G\) with
Repeat the proof of Lemma 7.2, but with Theorem 12.47 in place of Theorem 6.24.
Suppose that \(G\) is a finite abelian group of torsion \(m\). If \(A \subset G\) is non-empty and \(|A+A| \leq K|A|\), then \(A\) can be covered by at most \(mK^{96m^3+2}\) translates of a subspace \(H\) of \(G\) with \(|H| \leq |A|\).
Repeat the proof of Theorem 7.3, but with Lemma 12.48 in place of Lemma 7.2.