Cross Validated Asked by chasmani on November 6, 2021
If I have a linear model of the form:
$$x_i = \beta y_i + \alpha + \epsilon_i$$
where the $\epsilon_i$ are independent and identically distributed samples from a noise random variable $\epsilon$, I can find estimates $\hat{\beta}, \hat{\alpha}$ using ordinary least squares.
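For concreteness, here is a minimal sketch of the OLS fit in NumPy; the simulated data, true parameter values, and noise scale are illustrative assumptions, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate from x_i = beta*y_i + alpha + eps_i with assumed illustrative values
beta_true, alpha_true = 2.0, 1.0
y = rng.uniform(-3, 3, size=500)
eps = rng.normal(0.0, 0.5, size=500)        # i.i.d. noise samples
x = beta_true * y + alpha_true + eps

# Ordinary least squares: regress x on y with an intercept column
Y = np.column_stack([y, np.ones_like(y)])   # design matrix [y, 1]
(beta_hat, alpha_hat), *_ = np.linalg.lstsq(Y, x, rcond=None)
print(beta_hat, alpha_hat)                  # should be close to 2.0 and 1.0
```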
Given a value of $y=y_0$, can I write
$$p(x \mid y_0) = \hat{\beta} y_0 + \hat{\alpha} + p(\epsilon)$$
Answering my own question:
No, you cannot write that. You can, however, consider $x \mid y_0$ as a translation of the random variable $\epsilon$. So
$$x \mid y_0 = \hat{\beta} y_0 + \hat{\alpha} + \epsilon$$
In expectation:
$$\langle x \mid y_0 \rangle = \hat{\beta} y_0 + \hat{\alpha} + \langle \epsilon \rangle$$
And the higher central moments of $x \mid y_0$ will be the same as the central moments of the noise distribution, since a translation only shifts the mean.
So $p(x \mid y_0)$ will have the same shape as $p(\epsilon)$, but shifted along the $x$-axis by $\hat{\beta} y_0 + \hat{\alpha}$.
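A quick numerical check of this translation argument, as a sketch that assumes illustrative fitted values $\hat{\beta}=2$, $\hat{\alpha}=1$, Gaussian noise, and an arbitrary $y_0$ (none of these are given in the question):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed illustrative values: fitted coefficients, noise scale, and a fixed y0
beta_hat, alpha_hat = 2.0, 1.0
y0 = 1.5
shift = beta_hat * y0 + alpha_hat           # the constant translation

# Draws of x | y0 are draws of eps translated by the constant `shift`
eps = rng.normal(0.0, 0.5, size=100_000)
x_given_y0 = shift + eps

print(np.mean(x_given_y0), shift + np.mean(eps))   # mean moves by the constant shift
print(np.var(x_given_y0), np.var(eps))             # central moments are unchanged
```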
Answered by chasmani on November 6, 2021