
Let $X_1,\dots,X_N$ be IID, mean-zero random variables whose tail is dominated by that of the fourth power of a subgaussian variable, i.e., for all $t \ge t_0 > 0$, $$ P(|X_i| > t) \le C\exp\left( -c t^{1/2} \right). $$ What is the best upper bound one can get for $$ P\left( \left|\frac{1}{\sqrt{N}} \sum_{i=1}^N X_i \right| > t\right)? $$
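For concreteness, one variable with such a tail is a random sign times the fourth power of a standard Gaussian: if $X = \varepsilon g^4$, then $P(|X| > t) = P(|g| > t^{1/4}) \le 2\exp(-t^{1/2}/2)$. A minimal Monte Carlo sketch of this bound (the sample size and test points below are arbitrary illustrative choices):

```python
# Illustrative sketch: X = eps * g^4 with g standard Gaussian and eps a
# random sign is mean-zero and satisfies
#   P(|X| > t) = P(|g| > t^{1/4}) <= 2*exp(-t^{1/2}/2),
# i.e. a tail of the stated form C*exp(-c*t^{1/2}).
import numpy as np

rng = np.random.default_rng(0)
g = rng.standard_normal(1_000_000)
eps = rng.choice([-1.0, 1.0], size=g.shape)
X = eps * g**4

for t in [1.0, 4.0, 16.0, 64.0]:
    empirical = np.mean(np.abs(X) > t)          # empirical tail probability
    bound = 2 * np.exp(-np.sqrt(t) / 2)         # the claimed upper bound
    print(f"t={t:5.1f}  P(|X|>t)={empirical:.5f}  bound={bound:.5f}")
```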

  • See whether part 2 of my answer to this post helps: http://mathoverflow.net/questions/102185/how-fast-can-extreme-eigenvalues-of-the-average-of-random-matrices-converge-to-t/263689#263689 – Henry.L Mar 10 '17 at 15:56

1 Answer


For $\gamma\gt 0$ and a random variable $X$, define $$\lVert X\rVert_{\Phi_\gamma}:=\inf\left\{c\gt 0;\ \mathbb E\left[\exp\left(\left|X/c\right|^\gamma\right)\right] \leqslant 2\right\}.$$ For $\gamma\geqslant 1$, this defines a norm, and for $0\lt \gamma\lt 1$, it is equivalent to a norm. Observe that Markov's inequality and the monotone convergence theorem entail $$\mathbb P \left\{\left|X\right|\geqslant x\right\}\leqslant 2\exp\left(-\frac{x^\gamma}{\lVert X\rVert^\gamma_{\Phi_\gamma}}\right),$$ and, writing the expectation of the exponential as an integral of the tail function (layer-cake formula), we derive that if $Y$ satisfies $$\mathbb P \left\{\left|Y\right|\geqslant t\right\}\leqslant K\exp\left(-\frac{t^\gamma}{\lambda^\gamma}\right)$$ for each positive $t$, then $\lVert Y\rVert_{\Phi_\gamma}\leqslant \left(1+K\right)^{1/\gamma}\lambda$.

Therefore, the problem reduces to finding a good upper bound for $\lVert \sum_{i=1}^nX_i\rVert_{\Phi_\gamma}$. This is precisely the purpose of Theorems 3 and 4 in *Isoperimetry and integrability of the sum of independent Banach-space valued random variables* by Michel Talagrand (Ann. Probab. 17 (1989)). In particular, these theorems give that if $X_1$ satisfies $$\mathbb P \left\{\left|X_1\right|\geqslant t\right\}\leqslant c_1\exp\left(-c_2t^\gamma\right)$$ for all $t>0$, then the quantity $\lVert \sum_{i=1}^nX_i\rVert_{\Phi_\gamma}/\sqrt n$ can be bounded independently of $n$. Applied with $\gamma = 1/2$ as in the question, this yields, via the first display, $$\mathbb P\left\{\left|\frac{1}{\sqrt N}\sum_{i=1}^N X_i\right|\geqslant t\right\}\leqslant 2\exp\left(-ct^{1/2}\right)$$ for every $t>0$, with $c>0$ independent of $N$.
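As a numerical sanity check on the last claim (a sketch, not part of the proof): the snippet below estimates the empirical $\Phi_{1/2}$-norm of $n^{-1/2}\sum_{i=1}^n X_i$ by bisection, taking $X_i = \varepsilon_i E_i^2$ (a random sign times a squared standard exponential, so $\mathbb P\{|X_i|\geqslant t\} = e^{-t^{1/2}}$); the construction, sample sizes, and tolerances are my own illustrative choices. The estimated norm should stabilize rather than grow with $n$.

```python
# Numerical sketch (illustrative choices throughout): estimate the empirical
# Phi_gamma norm of S_n / sqrt(n) for gamma = 1/2 and check that it stays
# bounded as n grows, as the Talagrand-based argument above guarantees.
import numpy as np

rng = np.random.default_rng(0)
GAMMA = 0.5

def orlicz_norm(samples, gamma, lo=0.1, hi=1e4, tol=1e-3):
    """Smallest c (up to tol) with mean(exp(|x/c|**gamma)) <= 2, by bisection.
    The criterion is monotone decreasing in c; we assume the condition fails
    at c = lo and holds at c = hi, which is the case for the data below."""
    while hi / lo > 1 + tol:
        c = np.sqrt(lo * hi)  # geometric midpoint
        if np.mean(np.exp(np.abs(samples / c) ** gamma)) <= 2:
            hi = c
        else:
            lo = c
    return hi

for n in [1, 10, 100, 1000]:
    m = 10_000  # Monte Carlo repetitions per n
    E = rng.exponential(size=(m, n))
    eps = rng.choice([-1.0, 1.0], size=(m, n))
    # X_i = eps_i * E_i^2 is mean-zero with P(|X_i| > t) = exp(-t^{1/2}).
    S = (eps * E**2).sum(axis=1) / np.sqrt(n)
    print(f"n={n:5d}  empirical Phi_1/2 norm of S_n/sqrt(n) ~ "
          f"{orlicz_norm(S, GAMMA):.2f}")
```

Combined with the Markov-inequality step above, a bounded value of this norm translates directly into the tail bound $2\exp(-ct^{1/2})$ for the normalized sum.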

Davide Giraudo