
Convergence in distribution and independence

Asked by AHK on January 19, 2021

Consider two sequences of random variables $\{X_n\}_{n \in \mathbb{N}}$ and $\{Y_n\}_{n \in \mathbb{N}}$ such that $X_n \rightarrow_{d} X$, $Y_n \rightarrow_{d} Y$, and $X_n$ is independent of $Y_n$ for every $n \in \mathbb{N}$. Does $(X_n, Y_n) \rightarrow_d (X, Y)$?

If not, can someone suggest a counterexample?

One Answer

As @WoolierThanThou mentioned in the comments, the statement can fail without the additional assumption that $X$ and $Y$ are independent. Here is a counterexample.

Let $X \equiv Y \sim N(0, 1)$, let $X_n$ be a sequence of $N(0, 1)$ random variables independent of $X$, and let $Y_n$ be a sequence of $N(0, 1)$ random variables independent of $\{X_n\}$ and $X$. Then trivially $X_n \to_d X$ and $Y_n \to_d Y$.

Consider the set $A = \{(x, y): x \leq 0, y \leq 0\}$, whose boundary is $\partial A = \{(x, y): x = 0, y \leq 0\} \cup \{(x, y): x \leq 0, y = 0\}$. Since $P[(X, X) \in \partial A] = 2P[X = 0, X \leq 0] - P[X = 0, X = 0] = 0$, $A$ is a continuity set of the distribution of $(X, X)$. On the other hand, since $(X_n, Y_n) \sim N_2(0, I_2)$, we have
\begin{align*}
P[(X_n, Y_n) \in A] = P[X_n \leq 0]P[Y_n \leq 0] = \frac{1}{2} \times \frac{1}{2} = \frac{1}{4} \not\to P[(X, X) \in A] = P[X \leq 0] = \frac{1}{2},
\end{align*}
showing that $(X_n, Y_n) \not\to_d (X, X) = (X, Y)$.
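As a quick numerical sanity check (not part of the original argument), here is a minimal Monte Carlo sketch in Python of the two probabilities above; the sample size and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 1_000_000  # Monte Carlo sample size (arbitrary choice)

# (X_n, Y_n): independent standard normals, i.e. N_2(0, I_2)
x_n = rng.standard_normal(n_samples)
y_n = rng.standard_normal(n_samples)
p_indep = np.mean((x_n <= 0) & (y_n <= 0))  # estimates P[X_n <= 0, Y_n <= 0] ~ 1/4

# Candidate limit (X, Y) = (X, X), since Y is taken identical to X
x = rng.standard_normal(n_samples)
p_limit = np.mean(x <= 0)                   # estimates P[X <= 0, X <= 0] = P[X <= 0] ~ 1/2

print(f"P[(X_n, Y_n) in A] = {p_indep:.3f}")  # ~ 0.25
print(f"P[(X, X)   in A] = {p_limit:.3f}")    # ~ 0.50
```

The two estimates settle near $1/4$ and $1/2$ respectively, matching the computation above.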

Answered by Zhanxiong on January 19, 2021
