
Let $a_1,\ldots,a_n$ be real numbers such that $\sum_i a_i^2 =1$ and let $X_1,\ldots,X_n$ be independent, uniformly distributed, Bernoulli $\pm 1$ random variables. Define the random variable

$$S := \sum_i a_i X_i.$$

Are there absolute constants $\epsilon >0$ and $\delta<1$ such that for every $a_1,\ldots,a_n$

${\mathbb P} [|S| \leq \epsilon ] \leq \delta$ ?

  • I don't see how the $c_2n^{-c_3}$ term can help at all. You can always increase $n$ by adding extra $a_i$ which are set to zero, so this term can be made as small as you like. – George Lowther Jan 29 '11 at 02:36
    So the answer to your question is no. Take $a_1=a_2=1/\sqrt{2}$ and all other $a_i$ set to zero. Then $P(|S|=0)=1/2$. – George Lowther Jan 29 '11 at 02:39
  • I presume this is cheaper than the poser intended. What if all $a_i$ are non-zero? – Anthony Quas Jan 29 '11 at 03:01
  • Luca - can you help with the quantifiers here? George's example seems to show that there's no way to find universal $c_1$, $c_2$ and $c_3$ for which this is true for all $(a_i)$ of norm 1 - even if all the terms are non-zero. Obviously if the $c$'s are allowed to depend on $n$ it's trivial as you can just make $c_2\cdot n^{-c_3}$ exceed 1. – Anthony Quas Jan 29 '11 at 03:09
  • @Anthony: You can still have $a_3$ through $a_n$ as close to zero as you like, so there's no bound vanishing uniformly in $\epsilon$. – George Lowther Jan 29 '11 at 03:10
  • It is true that, for any $\epsilon < 1$, there is a bound $\delta < 1$ with $\mathbb{P}(|S| < \epsilon) < \delta$ (independently of $n$ and the $a_i$). This is one half of the $L^0$ version of the Khintchine inequality. However, $\delta$ does not go to zero as $\epsilon$ goes to zero. My example shows that $\delta\ge1/2$, for instance. – George Lowther Jan 29 '11 at 03:15
  • Oops I didn't think of the obvious counterexample! But I am actually happy with the probability being uniformly bounded away from 1, and with George's answer above. Are there explicit bounds for $\delta$? – Luca Trevisan Jan 29 '11 at 03:42

3 Answers


The answer to your amended question is yes. In fact, for any $\epsilon\in[0,1)$ we have $$ \mathbb{P}(\vert S\vert > \epsilon)\ge (1-\epsilon^2)^2/3. $$ So, we can take $\delta = 1-(1-\epsilon^2)^2/3$. This is the $L^0$ version of the Khintchine inequality.

To prove it, you can use $\mathbb{E}[X_iX_j^3]=0$ for $i\not=j$ and $X_i^4=X_i^2X_j^2=1$ to get $$ \begin{align} \mathbb{E}[S^4]&=\sum_ia_i^4+3\sum_{i\not=j}a_i^2a_j^2=3\left(\sum_ia_i^2\right)^2-2\sum_ia_i^4\\\\ &\le 3. \end{align} $$ Since $\mathbb{E}[S^2]=\sum_ia_i^2=1$, the Paley-Zygmund inequality gives $$ \begin{align} \mathbb{P}(\vert S\vert >\epsilon)&\ge(1-\epsilon^2)^2\frac{\mathbb{E}[S^2]^2}{\mathbb{E}[S^4]}\\\\ &\ge(1-\epsilon^2)^2/3. \end{align} $$
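Both steps are easy to check by brute force. Here is a small Python sketch (the helper names are mine) that enumerates all $2^n$ sign patterns exactly and verifies $\mathbb{E}[S^4]\le 3$ together with the Paley-Zygmund lower bound for a few unit vectors:

```python
import itertools
import math

def dist_S(a):
    """All 2^n values of S = sum_i x_i * a_i over sign patterns x in {-1, +1}^n."""
    return [sum(x * ai for x, ai in zip(signs, a))
            for signs in itertools.product((-1, 1), repeat=len(a))]

def check_bound(a, eps):
    """Verify E[S^4] <= 3 and P(|S| > eps) >= (1 - eps^2)^2 / 3 for a unit vector a."""
    vals = dist_S(a)
    e_s4 = sum(v**4 for v in vals) / len(vals)
    p_big = sum(1 for v in vals if abs(v) > eps) / len(vals)
    lower = (1 - eps**2)**2 / 3
    assert e_s4 <= 3 + 1e-12 and p_big >= lower - 1e-12
    return e_s4, p_big, lower

# a few unit vectors (sum of squares = 1), including the extremal example
for a in ([1/math.sqrt(2)]*2, [1/math.sqrt(5)]*5, [0.8, 0.6], [0.6, 0.48, 0.64]):
    for eps in (0.0, 0.3, 0.7):
        check_bound(a, eps)
```

For the extremal vector $a=(1/\sqrt2,1/\sqrt2)$ the enumeration gives $\mathbb{E}[S^4]=2$ and $\mathbb{P}(|S|>0)=1/2$, comfortably above the $1/3$ lower bound.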

This bound gives $\delta=2/3$ for $\epsilon=0$. By considering the example with $a_1=a_2=1/\sqrt{2}$ and $a_i=0$ for $i > 2$, which satisfies $\mathbb{P}(S=0)=1/2$, we see that $\delta\ge1/2$ is necessary. In fact, a simple argument noting that the distribution is symmetric under a sign change of $X_1$ (as mentioned by Luca in the comments) shows that $\mathbb{P}(S=0)\le1/2$.
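Both the extremal example and the $\mathbb{P}(S=0)\le1/2$ claim can be confirmed by exhaustive enumeration; in this sketch (the function name is mine), random unit vectors with nonzero entries are also sampled:

```python
import itertools
import math
import random

def prob_S_zero(a, tol=1e-12):
    """Exact P(S = 0) by enumerating all 2^n sign patterns."""
    vals = [sum(x * ai for x, ai in zip(s, a))
            for s in itertools.product((-1, 1), repeat=len(a))]
    return sum(1 for v in vals if abs(v) < tol) / len(vals)

# the extremal example: P(S = 0) = 1/2 exactly
assert prob_S_zero([1/math.sqrt(2), 1/math.sqrt(2)]) == 0.5

# random unit vectors with all entries nonzero: P(S = 0) never exceeds 1/2
rng = random.Random(0)
for _ in range(50):
    a = [rng.uniform(0.1, 1.0) for _ in range(8)]
    norm = math.sqrt(sum(x * x for x in a))
    assert prob_S_zero([x / norm for x in a]) <= 0.5
```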

See also the paper On Khintchine inequalities with a weight, where they prove the same bound as I just did above. Also, using the optimal constants for the $L^p$ Khintchine inequality, as in Lemma 3 of that paper, gives an improved bound for $\mathbb{P}(\vert S\vert\le\epsilon)$ tending to $1-2e^{-2+\gamma}\approx0.517$ as $\epsilon$ goes to zero, which is close to optimal.

George Lowther
  • Can't you prove ${\mathbb P} (|S|=0 ) \leq 1/2$ just by observing that if, wlog, $a_1\neq 0$, then for every $x_2,\ldots,x_n$ the vectors $(+1,x_2,\ldots,x_n)$ and $(-1,x_2,\ldots,x_n)$ give different values of $S$, so $S$ can be zero in at most one of the two cases? – Luca Trevisan Jan 30 '11 at 21:29
  • Yes, that's true. I was a bit unsure about that statement, and noticed that rather simple argument (after posting). I edited it now. – George Lowther Jan 31 '11 at 00:57
  • Three remarks regarding George's answer for arbitrary $\epsilon$: (a.) The same anticoncentration holds not just around 0 but around any number. (b.) The bound holds with a constant worse than 3 if the $X_i$'s are any independent, not necessarily identical random variables with mean zero and $|X_i|_4/|X_i|_2 \leq C$ for all $i$. (The replacement for constant 3 will depend polynomially on $C$.) See, e.g., Proposition 3.7 here: http://www.cs.cmu.edu/~odonnell/papers/prgs-ltfs.pdf (c.) The $X_i$'s in fact only need to be 4-wise independent, not fully independent. – Ryan O'Donnell Jun 25 '13 at 18:51

Your question is part of what's called Littlewood-Offord theory, which has seen a lot of progress lately in work of Tao and Vu and of Rudelson and Vershynin. Take a look at Section 1.2 and especially Theorem 1.5 of this paper by Rudelson and Vershynin for more precise results than in George's answer. (Incidentally, that paper also contains arguments based on the central limit theorem, in the Berry-Esseen form, along the lines of what Igor suggests.)
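One concrete instance of the Littlewood-Offord theory is Erdős's classical point-mass bound: if all $a_i\neq 0$, then $\sup_x\mathbb{P}(S=x)\le\binom{n}{\lfloor n/2\rfloor}2^{-n}$, with equality when all the $a_i$ are equal. A small enumeration sketch illustrating this (the helper name is mine):

```python
import itertools
from collections import Counter
from math import comb

def max_atom(a):
    """Largest point mass sup_x P(S = x), by enumerating all 2^n sign patterns.

    Sums are rounded to 9 decimals so that equal sums computed in different
    orders are grouped together despite floating-point noise.
    """
    counts = Counter(round(sum(x * ai for x, ai in zip(s, a)), 9)
                     for s in itertools.product((-1, 1), repeat=len(a)))
    return max(counts.values()) / 2**len(a)

n = 10
bound = comb(n, n // 2) / 2**n        # Erdos bound, = 252/1024 for n = 10

# worst case: all coefficients equal -- the bound is attained
assert max_atom([1.0] * n) == bound

# generic nonzero coefficients stay at or below the bound
assert max_atom([1.0 + 0.01 * i for i in range(n)]) <= bound
```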

Mark Meckes
  • One further question which comes to mind (not related to the linked paper though) is the following: Is there a $\delta < 1$ (independent of $a_i$) with $\mathbb{P}(\vert S\vert < 1)\le\delta$? The form of the Khintchine inequality that I know only gives a nontrivial bound for $\mathbb{P}(\vert S\vert < \epsilon)$ with $\epsilon < 1$. – George Lowther Jan 29 '11 at 15:35
  • Your link (to the Rudelson-Vershynin paper) is broken. – dohmatob Aug 30 '20 at 18:00
  • @dohmatob Fixed. – Mark Meckes Aug 31 '20 at 13:02

I think the point is to observe that when the $a_i$ are all equal to $1/\sqrt{n}$, the statement follows from the central limit theorem (since $1/\sqrt{n}$ is precisely the needed normalizing factor). When the $a_i$ vary sufficiently slowly, you again get a central limit theorem, and hence a similar result (if you look at Feller, vol. 2, section XVI.5, you will see that a condition like $$\lim_{n\rightarrow \infty} \sum_i |a_i|^3 = 0$$ is sufficient, though it is probably stronger than necessary).
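The equal-coefficient case is easy to check numerically: with $a_i=1/\sqrt{n}$, $S=(2k-n)/\sqrt{n}$ where $k$ is Binomial$(n,1/2)$, so $\mathbb{P}(|S|\le\epsilon)$ can be computed exactly and compared with the Gaussian limit $2\Phi(\epsilon)-1$. A quick Python sketch (function names are mine):

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def p_small(n, eps):
    """Exact P(|S| <= eps) when a_i = 1/sqrt(n): S = (2k - n)/sqrt(n), k ~ Bin(n, 1/2)."""
    total = 0
    for k in range(n + 1):
        if abs(2 * k - n) <= eps * math.sqrt(n):
            total += math.comb(n, k)
    return total / 2**n

eps = 0.5
limit = 2 * phi(eps) - 1              # P(|Z| <= eps) for standard normal Z
for n in (101, 1001, 10001):          # odd n avoids an atom at S = 0
    print(n, p_small(n, eps), abs(p_small(n, eps) - limit))
```

The printed gap to the Gaussian limit shrinks as $n$ grows, consistent with the Berry-Esseen rate $O(1/\sqrt{n})$ in this case.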

Igor Rivin