Cross Validated. Asked by mcgurck on February 25, 2021
I’m working through the book Introductory Econometrics and stumbled across a statement regarding the variance of the error term, $u$, of a linear regression model, $y = \beta_0 + \beta_1 x + u$.
To give some context, two assumptions were introduced beforehand: the zero conditional mean assumption, $E(u|x) = 0$, and the homoskedasticity assumption, $Var(u|x) = \sigma^2$.
Then, the argument goes on as follows:
Because $Var(u|x) = E(u^2|x) - [E(u|x)]^2$ and $E(u|x) = 0$, $\sigma^2 = E(u^2|x)$, **which means $\sigma^2$ is also the unconditional expectation of $u^2$**.
While I understand the first part of the sentence, I have no idea where the bolded part comes from. It seems to say that because $E(u^2|x) = \sigma^2$ (i.e. the conditional expectation of $u^2$), it follows that $E(u^2) = \sigma^2$ (i.e. the unconditional expectation of $u^2$).
I might be missing something very basic here, but I can’t figure it out.
It follows from the law of iterated expectations: the expected value of the conditional expected value of $u^2$ given $X$ is the same as the unconditional expected value of $u^2$.
$$E[u^2] = E[E[u^2|X]] = E[\sigma^2] = \sigma^2$$
Correct answer by Ale on February 25, 2021
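To see this numerically, here is a minimal simulation sketch (not part of the original answer; the uniform distribution chosen for $x$, the value $\sigma = 2$, and the sample size are illustrative assumptions). It draws $u$ so that $E(u|x) = 0$ and $Var(u|x) = \sigma^2$ for every $x$, and checks that the sample mean of $u^2$ is close to $\sigma^2$, as the law of iterated expectations predicts.

```python
# Numerical sketch: under homoskedasticity, E[u^2] should match sigma^2.
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0                       # conditional standard deviation of u (illustrative)
n = 1_000_000

x = rng.uniform(-5, 5, size=n)    # any distribution for x works; uniform is an assumption
u = rng.normal(0.0, sigma, n)     # E[u | x] = 0 and Var(u | x) = sigma^2 by construction

print(np.mean(u**2))              # sample analogue of E[u^2], close to 4.0
print(sigma**2)                   # sigma^2 = 4.0
```

The sample average of $u^2$ lands very near $\sigma^2$ regardless of how $x$ is distributed, which is exactly the content of the iterated-expectations argument above.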