Mathematics Asked on January 16, 2021
Let $X_{1}, \ldots, X_{n}$ be independent and identically distributed random variables with uniform distribution on $(0,1)$. Consider $Y_{1}, \ldots, Y_{n}$ the associated order statistics and define $$V_{i} = \frac{Y_{i}}{Y_{i + 1}}, \quad i = 1, \ldots, n - 1, \qquad \text{and} \qquad V_{n} = Y_{n}.$$ Show that $V_{1}, \ldots, V_{n}$ are independent and $V_{i} \sim \textbf{Beta}(i, 1)$, $i = 1, \ldots, n$.
How could you prove this result?
My approach: We need to prove two things. To prove that the $V_{i}$ are independent, we need to show that $$f_{V_{1},V_{2},\ldots,V_{n}}(v_{1},v_{2},\ldots,v_{n})=f_{V_1}(v_{1})f_{V_2}(v_{2})\cdots f_{V_n}(v_{n}),$$
and for the second part we need to show that $V_{i}\sim \textbf{Beta}(i,1)$, i.e., that each marginal density is the beta PDF $$f_{V_i}(v_{i})=\frac{v_{i}^{i-1}(1-v_{i})^{1-1}}{B(i,1)}.$$
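Since $B(i,1)=\frac{\Gamma(i)\,\Gamma(1)}{\Gamma(i+1)}=\frac{1}{i}$, this target density reduces to the simple form $$f_{V_i}(v_{i})=i\,v_{i}^{i-1},\qquad v_{i}\in(0,1),$$ or equivalently $F_{V_i}(x)=x^{i}$ for $x\in(0,1)$.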
Also, I think that I can prove that the joint distribution of the order statistics of uniform random variables is uniform on the simplex $\{0 < y_{1} < \cdots < y_{n} < 1\}$, and maybe use that result in the solution of the problem.
I think I can use the fundamental transformation theorem, but I'm not sure how to approach this problem.
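As a sanity check (not a proof), the claim is easy to test numerically. Below is a minimal simulation sketch with NumPy/SciPy; the values of $n$, the number of replications, and the seed are arbitrary choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)   # arbitrary seed
n, reps = 5, 100_000             # arbitrary choices

# Draw iid Uniform(0,1) samples; sorting each row gives the order statistics.
X = rng.uniform(size=(reps, n))
Y = np.sort(X, axis=1)

# V_i = Y_i / Y_{i+1} for i = 1, ..., n-1, and V_n = Y_n.
V = np.empty_like(Y)
V[:, :-1] = Y[:, :-1] / Y[:, 1:]
V[:, -1] = Y[:, -1]

# Each V_i should look like Beta(i, 1); a Kolmogorov-Smirnov test should not reject.
for i in range(1, n + 1):
    pval = stats.kstest(V[:, i - 1], stats.beta(i, 1).cdf).pvalue
    print(f"V_{i} vs Beta({i},1): KS p-value = {pval:.3f}")

# Independence (necessary condition only): pairwise correlations should be near 0.
print(np.round(np.corrcoef(V, rowvar=False), 3))
```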
Here is the idea: You can condition on the ordering of the random variables! Due to symmetry and the fact that $X_1,\dots,X_n$ are continuous iid RVs, it follows that $$\textbf{P}(X_1\leq X_2\leq\dots\leq X_n)=\frac{1}{n!}.$$ Therefore for any $(x_1,\dots,x_n)\in (0,1)^n$ it follows that \begin{align*}\textbf{P}(V_1\leq x_1,\dots,V_n\leq x_n)&=\textbf{P}(Y_1\leq x_1Y_2,\,Y_2\leq x_2Y_3,\dots,Y_n\leq x_n)\\&=n!\,\textbf{P}(Y_1\leq x_1 Y_2,\dots,Y_n\leq x_n\mid X_1\leq\dots\leq X_n)\,\textbf{P}(X_1\leq\dots\leq X_n)\\&=n!\,\textbf{P}(X_1\leq x_1 X_2,\dots,X_n\leq x_n\mid X_1\leq\dots\leq X_n)\,\textbf{P}(X_1\leq\dots\leq X_n)\\&=n!\,\textbf{P}(X_1\leq x_1 X_2,\dots,X_n\leq x_n)\\&=n!\cdot\frac{1}{n!}\,x_1x_2^2\cdots x_n^n.\end{align*}
In the first equality we have used the fact that $X_1,\dots,X_n$ are positive with probability 1. The second follows from symmetry and the law of total probability. The third uses the fact that $Y_i=X_i$ under this specific ordering, and the fourth is really interesting: since $x_1\leq 1$, knowing $X_1\leq x_1X_2$ makes $X_1\leq X_2$ redundant information! The same goes for the rest of the conditioning. Thus, you have to do some integration to get to the final result, which is exactly the formula you were looking for, since $F_{V_i}(x)=\int^x_0 \frac{t^{i-1}}{B(i,1)}\,dt=\int^x_0 i\,t^{i-1}\,dt=x^i$ for any $1\leq i\leq n$ (recall $B(i,1)=1/i$). You can get the marginal CDFs with the same trick. Hope this helped!
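P.S.: Since the question asked how to carry out that integration, here is a sketch, peeling off one variable at a time: \begin{align*}\textbf{P}(X_1\leq x_1 X_2,\dots,X_n\leq x_n)&=\int_0^{x_n}\int_0^{x_{n-1}t_n}\cdots\int_0^{x_1 t_2}\,dt_1\,dt_2\cdots dt_n\\&=\int_0^{x_n}\cdots\int_0^{x_2 t_3}x_1 t_2\,dt_2\cdots dt_n\\&=\int_0^{x_n}\cdots\int_0^{x_3 t_4}\frac{x_1 x_2^2\,t_3^2}{2}\,dt_3\cdots dt_n=\dots=\frac{x_1 x_2^2\cdots x_n^n}{n!},\end{align*} since integrating out $t_1,\dots,t_k$ leaves the factor $\bigl(\prod_{j=1}^{k}x_j^{\,j}\bigr)t_{k+1}^{k}/k!$, and the final integration over $t_n\in(0,x_n)$ produces the $x_n^n/n!$.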
Answered by MatheMartin on January 16, 2021
I have a very unrigorous, but simple answer: To show pairwise independence, it should be sufficient to show that knowing that the ratio $Y_i/Y_{i+1}\leq x_i$ for some $i=1,\dots,n$ and $x_i\in(0,1)$ (here we let $Y_{n+1}\equiv 1$ for simplicity) does not tell you whether the ratio $Y_l/Y_{l+1}\leq x_l$ for some other $l\neq i$, $l=1,\dots,n$ and $x_l\in(0,1)$.
First of all, we can rewrite the two inequalities as $Y_l\leq x_l Y_{l+1}$ and $Y_i\leq x_i Y_{i+1}$, since all the RVs are positive with probability one. Now if $i<l$, the only thing this tells us is that there is some lower bound for $Y_l$. If $i>l$, we have an upper bound for $Y_{l+1}$. Either way the information is useless. This implies pairwise independence. The same goes if we have several of these inequalities, so we even have mutual independence.
EDIT: I'm pretty sure the argument above does not work at all. I'll leave it here in case it helps someone find an argument that does work.
Once you have independence, the rest follows easily from $\textbf{P}(Y_n\leq x_n)=x_n^n$ and $\textbf{P}(Y_{i}\leq x_i Y_{i+1})=\textbf{E}[\textbf{P}(Y_i\leq x_i Y_{i+1}\mid Y_{i+1})]=\textbf{E}[F_{Y_i}(x_iY_{i+1})]$ (the last equality follows from independence), where $F_{Y_i}$ is the CDF of $Y_i$.
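For the first identity, note that $Y_n=\max(X_1,\dots,X_n)$ and the $X_k$ are iid Uniform$(0,1)$, so $$\textbf{P}(Y_n\leq x_n)=\textbf{P}(X_1\leq x_n,\dots,X_n\leq x_n)=\prod_{k=1}^{n}\textbf{P}(X_k\leq x_n)=x_n^{n},$$ which is the $\textbf{Beta}(n,1)$ CDF, so $V_n=Y_n\sim\textbf{Beta}(n,1)$.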
Disclaimer: I'm really unsure about this solution, but it makes a lot of sense to me. It would be great if someone could double check.
Answered by MatheMartin on January 16, 2021