# Anti-concentration of Gaussian quadratic form

MathOverflow Asked by Mitch on January 1, 2021

Let $X_1,\dots,X_n$ denote i.i.d. standard Gaussian random variables. My question is: Do there exist absolute constants $C,c>0$ such that for every $\epsilon>0$ and positive real numbers $a_1,\dots,a_n>0$, we have the following bound

$\quad \Pr\left(\sum_{i=1}^n a_i X_i^2 \le \epsilon \sum_{i=1}^n a_i\right)\le C\epsilon^c\quad ?$

Background: Results of Carbery and Wright (2001) give powerful anti-concentration inequalities for polynomials of Gaussian variables. If we replace the term $\epsilon \sum_{i=1}^n a_i$ by $\epsilon\sqrt{\sum_{i=1}^n a_i^2}$, the bound follows directly from Carbery-Wright with $c=1/2$. Here, however, we need a stronger bound in terms of the $\ell_1$-norm rather than the $\ell_2$-norm of the polynomial.

The reason I believe the claim is still true is that all the variables $X_i$ appear squared, so there are no cancellations. Also, for $n=1$ the claim follows from simple Gaussian anti-concentration bounds with $c=1/2$.

Note: A simple application of Paley-Zygmund gives the bound $\Pr\left(\sum_i a_i X_i^2 \ge \epsilon\sum_i a_i\right)\ge (1-\epsilon)^2/3$. This, unfortunately, does not imply the statement above.
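This lower bound is easy to probe empirically. Below is a quick Monte Carlo sketch (the weights and the $\epsilon$ values are arbitrary choices made for illustration, not part of the question):

```python
# Monte Carlo check of the Paley-Zygmund lower bound
# P(sum_i a_i X_i^2 >= eps * sum_i a_i) >= (1 - eps)^2 / 3.
import numpy as np

rng = np.random.default_rng(2)
n, trials = 5, 100_000
a = rng.uniform(0.5, 2.0, size=n)        # arbitrary positive weights
X = rng.standard_normal((trials, n))
Y = (a * X**2).sum(axis=1) / a.sum()     # normalized so E[Y] = 1

for eps in [0.1, 0.3, 0.7]:
    empirical = (Y >= eps).mean()
    lower = (1 - eps) ** 2 / 3
    print(f"eps={eps:.1f}  empirical={empirical:.4f}  PZ lower bound={lower:.4f}")
    assert empirical >= lower
```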

We can show that $$\mathbb{P}\left(\sum_ia_iX_i^2\le\epsilon\sum_ia_i\right)\le\sqrt{e\epsilon},$$ so that the inequality holds with $c=1/2$ and $C=\sqrt{e}$.

For $\epsilon\ge1$ the right-hand side is greater than $1$, so the inequality is trivial. I'll prove the case $\epsilon < 1$ now. Without loss of generality, we can suppose that $\sum_ia_i=1$ (just to simplify the expressions a bit). Then, for any $\lambda\ge0$,
\begin{align}
\mathbb{P}\left(\sum_ia_iX_i^2\le\epsilon\right)&\le\mathbb{E}\left[e^{\lambda\left(\epsilon-\sum_ia_iX_i^2\right)}\right]\\
&=e^{\lambda\epsilon}\prod_i\mathbb{E}\left[e^{-\lambda a_iX_i^2}\right]\\
&=e^{\lambda\epsilon}\prod_i\left(1+2\lambda a_i\right)^{-1/2}\\
&\le e^{\lambda\epsilon}\left(1+2\lambda\right)^{-1/2},
\end{align}
where the last inequality uses $\prod_i(1+2\lambda a_i)\ge1+2\lambda\sum_ia_i=1+2\lambda$. Take $\lambda=(\epsilon^{-1}-1)/2$ to obtain $$\mathbb{P}\left(\sum_ia_iX_i^2\le\epsilon\right)\le e^{(1-\epsilon)/2}\sqrt{\epsilon}\le\sqrt{e\epsilon}.$$
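As a numerical sanity check of the final bound, here is a Monte Carlo sketch (the weights and $\epsilon$ grid are arbitrary choices for illustration):

```python
# Monte Carlo check that P(sum_i a_i X_i^2 <= eps * sum_i a_i) <= sqrt(e * eps).
import numpy as np

rng = np.random.default_rng(0)
n, trials = 10, 200_000
a = rng.uniform(0.1, 1.0, size=n)        # arbitrary positive weights
X = rng.standard_normal((trials, n))
Y = (a * X**2).sum(axis=1) / a.sum()     # normalized so the threshold is eps

for eps in [0.01, 0.05, 0.2, 0.5]:
    empirical = (Y <= eps).mean()
    bound = np.sqrt(np.e * eps)
    print(f"eps={eps:.2f}  empirical={empirical:.4f}  bound={bound:.4f}")
    assert empirical <= bound
```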

Correct answer by George Lowther on January 1, 2021

You can directly use Chernoff's inequality to get $\mathrm{LHS} \le e^{-\Lambda^*(\epsilon)}$, where $\Lambda^*(\epsilon) := (\epsilon-1-\log\epsilon)/2$ is the Fenchel-Legendre transform of the log-MGF $\Gamma$ of $X_1^2$ (this has been computed in this SE post). Simplifying then gives

$$\mathrm{LHS} \le e^{-\Lambda^*(\epsilon)} = e^{-(\epsilon-1-\log\epsilon)/2} = e^{(1-\epsilon)/2}\sqrt{\epsilon} \le \sqrt{e\epsilon}.$$
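For what it's worth, the closed form of $\Lambda^*$ can be checked numerically: with $\Gamma(t) = -\tfrac{1}{2}\log(1-2t)$ for $t<1/2$, maximize $t\epsilon-\Gamma(t)$ over a grid of $t\le0$ (the grid range is an arbitrary choice; for the lower tail with $\epsilon<1$ the maximizer is negative):

```python
# Numerical check that sup_{t<=0} [t*eps - Gamma(t)], with Gamma(t) = -0.5*log(1-2t)
# the log-MGF of X_1^2, equals Lambda*(eps) = (eps - 1 - log(eps))/2 for eps < 1.
import numpy as np

def legendre(eps, grid=np.linspace(-200.0, 0.0, 2_000_001)):
    # t*eps - Gamma(t) = t*eps + 0.5*log(1 - 2t); maximize over the grid
    return np.max(grid * eps + 0.5 * np.log1p(-2.0 * grid))

for eps in [0.05, 0.2, 0.5, 0.9]:
    closed = (eps - 1.0 - np.log(eps)) / 2.0
    print(f"eps={eps:.2f}  grid max={legendre(eps):.5f}  closed form={closed:.5f}")
    assert abs(legendre(eps) - closed) < 1e-3
```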

Answered by dohmatob on January 1, 2021

I also would strongly suspect it holds with $c = 1/2$. Here's an argument which, if I haven't made a mistake, gives the upper bound $O(\epsilon^{1/2} \ln^{1/2}(1/\epsilon))$.

Assume without loss of generality that $a_1 \geq a_2 \geq \cdots \geq a_n$ and that $a_1 + \cdots + a_n = 1$. Divide into two cases:

Case 1: $a_1 \geq \frac{1}{100\log(1/\epsilon)}$. In this case it suffices to show that we already have $\Pr[a_1 X_1^2 \leq \epsilon] \leq O(\epsilon^{1/2} \ln^{1/2}(1/\epsilon))$. But this holds, as you said, because of the anticoncentration of a single Gaussian.
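The single-Gaussian step can be made concrete: since the standard normal density is bounded by $1/\sqrt{2\pi}$, we get $\Pr[|X|\le u]\le\sqrt{2/\pi}\,u$, and with $u=\sqrt{\epsilon/a_1}$ and $a_1$ at the Case 1 threshold this is $O(\epsilon^{1/2}\ln^{1/2}(1/\epsilon))$. A small sketch checking this (the $\epsilon$ values are arbitrary):

```python
# Check P[a1*X^2 <= eps] = P[|X| <= sqrt(eps/a1)] = erf(sqrt(eps/(2*a1)))
# against the density bound sqrt(2/pi) * sqrt(eps/a1).
import math

def p_small(eps, a1):
    # exact probability that a1*X^2 <= eps for standard normal X
    return math.erf(math.sqrt(eps / (2.0 * a1)))

for eps in [1e-4, 1e-3, 1e-2]:
    a1 = 1.0 / (100.0 * math.log(1.0 / eps))   # the Case 1 threshold value of a1
    exact = p_small(eps, a1)
    bound = math.sqrt(2.0 / math.pi) * math.sqrt(eps / a1)
    print(f"eps={eps:.0e}  exact={exact:.4f}  density bound={bound:.4f}")
    assert exact <= bound
```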

Case 2: $a_1 \leq \frac{1}{100\log(1/\epsilon)}$. In this case, note that $\sigma^2 := \sum_i a_i^2 \leq a_1 \sum_i a_i \leq \frac{1}{100\ln(1/\epsilon)}$. Now the random variables $X_i^2$ are nice enough that one should be able to apply a Chernoff bound to them. (I think I could provide a source for this if necessary.) Since $Y := \sum_i a_i X_i^2$ has mean $1$ and standard deviation $\sqrt{2}\sigma$ (I think; maybe the constant $\sqrt{2}$ is wrong), we should have a statement like $\Pr[|Y - 1| \geq t \cdot \sqrt{2}\sigma] \leq \exp(-t^2/c)$, where $c$ is a quite modestly small universal constant. So $\Pr[Y \leq \epsilon] \leq \Pr[|Y-1| \geq 1/2] \leq \exp\left(-\frac{1}{8c\sigma^2}\right) \leq \exp\left(-\frac{12.5}{c} \ln(1/\epsilon)\right) = \epsilon^{12.5/c}$, which is smaller than $\epsilon^{1/2}$ if $c$ is not too large, and even if $c$ is too large, one can make the $100$ larger.
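A small simulation illustrating the Case 2 mechanism (equal weights are an arbitrary choice that makes $\sigma^2 = 1/n$ small, so $Y$ concentrates tightly around its mean $1$ and the event $Y\le\epsilon$ becomes very rare):

```python
# When sigma^2 = sum_i a_i^2 is small, Y = sum_i a_i X_i^2 concentrates near 1,
# so P[Y <= eps] <= P[|Y - 1| >= 1/2] is tiny.
import numpy as np

rng = np.random.default_rng(1)
n, trials = 500, 10_000
a = np.full(n, 1.0 / n)                  # equal weights: sigma^2 = 1/n
X = rng.standard_normal((trials, n))
Y = (a * X**2).sum(axis=1)

print("sigma^2 =", (a**2).sum())                       # 1/500 = 0.002
print("empirical sd of Y =", Y.std())                  # close to sqrt(2)*sigma
print("P[|Y-1| >= 1/2] ~", (np.abs(Y - 1) >= 0.5).mean())
```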

Answered by Ryan O'Donnell on January 1, 2021