MathOverflow Asked by B Merlot on January 7, 2021
I am trying to compute at least the directional component of the following expectation, where $M$ is a symmetric, invertible, positive-definite (PD) matrix:
$$\mathbb{E}_{v \sim N(0, I)}\left[\frac{vv^T}{\|Mv\|_2}\right]$$
(Note that the norm is not squared). My approach so far has been to write this as
\begin{align}
&\mathbb{E}_{v \sim N(0, M)}\left[\frac{M^{-1/2}vv^TM^{-1/2}}{\|M^{1/2}v\|_2^2}\,\|M^{1/2}v\|_2\right] \\
&= M^{-1/2}\cdot \mathbb{E}_{v \sim N(0, M)}\left[\frac{vv^T}{\|v\|_M^2}\,\|v\|_M\right]\cdot M^{-1/2} \\
&= M^{-1/2}\cdot \mathbb{E}_{v \sim N(0, M)}\left[
\left(\frac{v}{\|v\|_M}\right)\left(\frac{v}{\|v\|_M}\right)^\top \|v\|_M\right]\cdot M^{-1/2}
\end{align}
Then it seems as though (supported by this answer) the first two factors in the expectation should be independent of the last one; since the norm is a scalar and the expectation of the normalized outer product is $I$, the directional component of this expectation would simply be $M^{-1}$.
---- Edit to make my approach clearer ----
Going by this answer, we can write the aforementioned expectation as
\begin{align}
&= M^{-1/2}\cdot \mathbb{E}_{v \sim N(0, M)}\left[
\left(\frac{v}{\|v\|_M}\right)\left(\frac{v}{\|v\|_M}\right)^\top\right]
\mathbb{E}_{v \sim N(0, M)}\left[ \|v\|_M\right]\cdot M^{-1/2}
\end{align}
Since I don’t care about the scaling factor, only the “direction” of the resulting matrix, I am willing to omit the $\mathbb{E}_{v \sim N(0, M)}\left[ \|v\|_M\right]$ term. The other expectation term should then just be the identity (up to a constant)? That leaves us with $M^{-1}\cdot C$ for some scalar $C$.
Is this correct? And if so, is there a better/more elegant way to solve this problem?
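The conjecture is easy to probe numerically. A minimal Monte Carlo sketch in Python/NumPy (the matrix $M$, sample size, and seed are arbitrary choices for illustration):

```python
import numpy as np

# Monte Carlo estimate of E[v v^T / ||Mv||_2] for v ~ N(0, I),
# compared against the conjectured proportionality to M^{-1}.
rng = np.random.default_rng(0)
M = np.diag([2.0, 1.0])                    # symmetric PD, diagonal for simplicity
n = 200_000

v = rng.standard_normal((n, 2))            # rows are samples of v ~ N(0, I)
w = 1.0 / np.linalg.norm(v @ M.T, axis=1)  # weights 1 / ||Mv||_2
E = (v.T * w) @ v / n                      # sum_i v_i v_i^T / ||M v_i||  /  n

# If E were proportional to M^{-1} = diag(1/2, 1), the diagonal ratio
# E[0,0]/E[1,1] would be 0.5; the estimate comes out near 0.71 instead.
print(E[0, 0] / E[1, 1])
```

So at least for this $M$, the direction of the result is not that of $M^{-1}$, consistent with the analysis in the answer below.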
I will limit myself initially to the trace of the matrix in the OP.
First of all, because of the isotropy of the distribution of the vector $v$, you may work in a basis where $M$ is diagonal, $M=\text{diag}\,(\mu_1,\mu_2,\ldots,\mu_n)$. Then the expectation value you seek is $$I(\{\mu_i\})=\sum_{k=1}^n \mathbb{E}\left[\frac{v_{k}^2}{(\mu_k^2 v_k^2+Q_k)^{1/2}}\right],\;\;\text{with}\;\;Q_k=\sum_{j\neq k}\mu_{j}^2 v_{j}^2.$$ The new variable $Q_k$ is independent of $v_k$, but its distribution is cumbersome if the $\mu_j$'s are all different. An approximation by a chi-squared distribution using the Welch–Satterthwaite formula may be useful.
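For any diagonal $M$ this trace is straightforward to estimate by Monte Carlo; a short sketch (sample size and the test values of $\mu_i$ are arbitrary choices):

```python
import numpy as np

# Monte Carlo estimate of I({mu_i}) = sum_k E[ v_k^2 / (mu_k^2 v_k^2 + Q_k)^(1/2) ]
# with v ~ N(0, I) and M = diag(mu_1, ..., mu_n).
def I_mc(mu, n_samples=200_000, seed=0):
    mu = np.asarray(mu, dtype=float)
    rng = np.random.default_rng(seed)
    v = rng.standard_normal((n_samples, mu.size))
    denom = np.sqrt((mu**2 * v**2).sum(axis=1))       # (sum_j mu_j^2 v_j^2)^(1/2)
    # sum_k v_k^2 / denom equals ||v||^2 / ||Mv||, whose mean is the trace
    return (v**2 / denom[:, None]).sum(axis=1).mean()

print(I_mc([2.0, 1.0]))   # ~0.86 for this n = 2 case
print(I_mc([1.0, 1.0]))   # equal-mu check: E[||v||] = sqrt(pi/2) ~ 1.2533
```

For $\mu_1=\mu_2=\mu$ the exact value is $\sqrt{\pi/2}/\mu$, which the second line reproduces.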
Even for $n=2$ an exact answer involves special functions (elliptic integrals $K$ and $E$ of the first and second kind): $$I(\mu_1,\mu_2)=\sqrt{\frac{2}{\pi }}\,\frac{1}{\mu_1 \mu_2 \left(\mu_1^2-\mu_2^2\right)}\left[\mu_1^3 K\left(1-\frac{\mu_1^2}{\mu_2^2}\right)-\mu_2^3 K\left(1-\frac{\mu_2^2}{\mu_1^2}\right)+\mu_1^2 \mu_2 E\left(1-\frac{\mu_2^2}{\mu_1^2}\right)-\mu_1 \mu_2^2 E\left(1-\frac{\mu_1^2}{\mu_2^2}\right)\right]$$
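This formula can be evaluated with SciPy, assuming `scipy.special.ellipk`/`ellipe` use the same parameter convention $K(m)$, $E(m)$ as here (they accept negative $m$, which occurs whenever $\mu_1 \neq \mu_2$):

```python
import numpy as np
from scipy.special import ellipk, ellipe   # complete elliptic integrals K(m), E(m)

# Exact n = 2 trace I(mu1, mu2) from the elliptic-integral formula above.
def I_exact(mu1, mu2):
    K, E = ellipk, ellipe
    pre = np.sqrt(2 / np.pi) / (mu1 * mu2 * (mu1**2 - mu2**2))
    return pre * (mu1**3 * K(1 - mu1**2 / mu2**2)
                  - mu2**3 * K(1 - mu2**2 / mu1**2)
                  + mu1**2 * mu2 * E(1 - mu2**2 / mu1**2)
                  - mu1 * mu2**2 * E(1 - mu1**2 / mu2**2))

print(I_exact(2.0, 1.0))   # ~0.8603, in agreement with a Monte Carlo estimate
```

The expression is symmetric under $\mu_1 \leftrightarrow \mu_2$ (both the prefactor and the bracket change sign), and it approaches the equal-$\mu$ value $\sqrt{\pi/2}/\mu$ as $\mu_1 \to \mu_2$.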
Instead of the trace, I might consider the individual components
$$I_k(\{\mu_i\})= \mathbb{E}\left[\frac{v_{k}^2}{(\mu_k^2 v_k^2+Q_k)^{1/2}}\right]$$
For $n=2$ I find $$\frac{I_1}{I_2}=\frac{\mu_1^2 \mu_2 E\left(1-\frac{\mu_2^2}{\mu_1^2}\right)-\mu_2^3 K\left(1-\frac{\mu_2^2}{\mu_1^2}\right)}{\mu_1^3 K\left(1-\frac{\mu_1^2}{\mu_2^2}\right)-\mu_1 \mu_2^2 E\left(1-\frac{\mu_1^2}{\mu_2^2}\right)}.$$ The direction $\arctan(I_2/I_1)$ of the vector $\mathbf{I}=(I_1,I_2)$ remains a complicated function of the ratio $\mu_1/\mu_2$; none of the simplifications suggested in the OP seems to apply.
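The same SciPy conventions give the ratio directly, and comparing it with $\mu_2/\mu_1$ (which the OP's conjecture $\mathbb{E}\propto M^{-1}$ would require) makes the discrepancy concrete:

```python
import numpy as np
from scipy.special import ellipk, ellipe   # complete elliptic integrals K(m), E(m)

# Ratio I_1/I_2 of the diagonal components for n = 2, from the formula above.
def ratio(mu1, mu2):
    num = mu1**2 * mu2 * ellipe(1 - mu2**2 / mu1**2) \
          - mu2**3 * ellipk(1 - mu2**2 / mu1**2)
    den = mu1**3 * ellipk(1 - mu1**2 / mu2**2) \
          - mu1 * mu2**2 * ellipe(1 - mu1**2 / mu2**2)
    return num / den

# If E[vv^T/||Mv||] were proportional to M^{-1}, I_1/I_2 would equal mu2/mu1.
print(ratio(2.0, 1.0))    # ~0.711, versus the conjectured mu2/mu1 = 0.5
```

So already at $\mu_1/\mu_2 = 2$ the direction deviates substantially from that of $M^{-1}$.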
Answered by Carlo Beenakker on January 7, 2021