
Variance of a stationary AR(2) model

Cross Validated Asked by user369210 on January 26, 2021

I have two questions:

1) When one says an ARMA process is ‘stationary,’ do they mean strongly stationary or weakly stationary?

2) Is there a quick way to find the variance of a stationary AR(2) model $$y_t = \beta_1 y_{t-1} + \beta_2 y_{t-2} + \epsilon_t?$$ The only way I can think of doing this is by multiplying by $y_t$, $y_{t-1}$ and $y_{t-2}$, taking expectations, and solving the resulting Yule-Walker system of 3 equations in 3 unknowns. The trick for AR(1) models, where one takes the variance of both sides, doesn't work here because you get a $\mathrm{Cov}(y_{t-1}, y_{t-2})$ term.

2 Answers

Stationarity

Considering your AR(2) process with mean-zero i.i.d. noise $\epsilon_t$ of variance $\sigma_\epsilon^2$,

$$ y_t = \beta_1 y_{t-1} + \beta_2 y_{t-2} + \epsilon_t \qquad (*) $$

we can rewrite it in terms of the lag operator $L$

$$ (1 - \beta_1 L - \beta_2 L^2)y_t = \epsilon_t $$

so that we have a new operator

$$ 1 - \beta_1 L - \beta_2 L^2 $$

Treating this as a polynomial in $L$, let its roots be $z_1^{-1}, z_2^{-1}$. We call it the characteristic polynomial, and $z_1, z_2$ its factors, since the polynomial factors as $(1 - z_1 L)(1 - z_2 L)$. (Note the factors and roots are inverses of each other, and may be complex.)

It can be shown that the AR(2) is stationary when $|z_1| < 1$ and $|z_2| < 1$, i.e. when all of the following are met: $$ |\beta_2| < 1 \\ \beta_2 + \beta_1 < 1 \\ \beta_2 - \beta_1 < 1 $$

For details, see this answer. Or in terms of the more general ARMA(p,q) process, see Introduction to Time Series and Forecasting, Brockwell and Davis, 2016, p. 74.
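As an illustration (not part of the original answer), the root condition and the three inequalities can be checked against each other numerically. A minimal sketch, assuming NumPy is available; the function name and test coefficients are my own:

```python
import numpy as np

def ar2_is_stationary(beta1, beta2):
    """Check AR(2) stationarity two equivalent ways: via the roots of
    1 - beta1*z - beta2*z^2 (which must lie outside the unit circle)
    and via the triangle of inequalities on beta1, beta2."""
    # np.roots takes coefficients ordered from highest degree down
    roots = np.roots([-beta2, -beta1, 1.0])
    by_roots = bool(np.all(np.abs(roots) > 1.0))
    by_ineq = abs(beta2) < 1 and beta2 + beta1 < 1 and beta2 - beta1 < 1
    assert by_roots == by_ineq  # the two criteria must agree
    return by_ineq

print(ar2_is_stationary(0.5, 0.3))   # True
print(ar2_is_stationary(0.5, 0.6))   # False (beta1 + beta2 >= 1)
```

Equivalently, $|z_i| < 1$ means the roots $z_i^{-1}$ of the characteristic polynomial lie outside the unit circle.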

Variance

If the process is stationary, we can write the covariance as a function of the lag alone, so define the autocovariance function $\gamma(k) \doteq E[y_t y_{t+k}]$. We can find the variance $\gamma(0)$ by squaring both sides of equation $(*)$ and taking expectations, with the following result

$$ \gamma(0) = \beta_1^2 \gamma(0) + \beta_2^2 \gamma(0) + 2 \beta_1 \beta_2 \gamma(1) + \sigma_\epsilon^2 \qquad (**) $$

Starting again from equation $(*)$, this time multiply both sides by $y_{t-k}$ for $k \ge 1$ and again take the expectation (the noise term drops out since $\epsilon_t$ is independent of $y_{t-k}$)

$$ \gamma(k) = \beta_1 \gamma(k-1) + \beta_2 \gamma(k-2) $$

Usefully, we now have a recursion for the autocovariances, but here we only need it to compute $\gamma(1)$. Since $\gamma(-1) = \gamma(1)$ (examine the definition, and note that the covariance does not depend on $t$), we can let $k=1$ in the above equation and get

$$ \gamma(1) = \frac{\beta_1 \gamma(0)}{1-\beta_2} $$

Substituting into $(**)$, we get the variance

$$ \text{Var}(y_t) = \gamma(0) = \frac{(1-\beta_2)\sigma_\epsilon^2}{(1+\beta_2)(1 - \beta_1 - \beta_2)(1 + \beta_1 - \beta_2)} $$

As a sanity check on the stationarity conditions on $\beta_1, \beta_2$ described earlier: those conditions are equivalent to the conditions which make this expression for $\text{Var}(y_t)$ positive.
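The closed form can also be checked against the sample variance of a long simulated path. A hedged sketch, assuming NumPy; the coefficients are chosen arbitrarily within the stationarity region:

```python
import numpy as np

def ar2_variance(beta1, beta2, sigma2=1.0):
    """Closed-form Var(y_t) for a stationary AR(2)."""
    return (1 - beta2) * sigma2 / (
        (1 + beta2) * (1 - beta1 - beta2) * (1 + beta1 - beta2))

rng = np.random.default_rng(0)
beta1, beta2 = 0.5, 0.3
n, burn = 200_000, 1_000          # burn-in removes initialization effects
y = np.zeros(n + burn)
eps = rng.standard_normal(n + burn)
for t in range(2, n + burn):
    y[t] = beta1 * y[t - 1] + beta2 * y[t - 2] + eps[t]

print(ar2_variance(beta1, beta2))  # 2.2435...
print(y[burn:].var())              # close to the value above
```

For these coefficients the formula gives $(1-0.3)/\big((1.3)(0.2)(1.2)\big) \approx 2.244$, and the simulated variance should agree up to Monte Carlo error.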

The above is essentially a summary and rearrangement of some parts of these notes.

Strong or Weak Stationarity

I have only seen arguments for AR process stationarity made in terms of fixed mean and variance, so weak stationarity is implied. However, if the stationary distribution can be characterized completely by those first and second moments, then we also have strong stationarity. See A unified view of linear AR(1) models, G.K. Grunwald, 1996.

As an example, if we have an AR(1) process and $\epsilon_t$ is Gaussian white noise, then the process' stationary distribution is also Gaussian. Since the Gaussian is fully specified by its first two moments, we have strong stationarity in that case. I am unsure whether or not this also applies to more general AR(p) Gaussian processes, or to AR(p) processes with other kinds of i.i.d. noise.
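A quick empirical sketch of this AR(1) case (illustrative only; the target variance $\sigma^2/(1-\beta^2)$ follows from taking the variance of both sides of $y_t = \beta y_{t-1} + \epsilon_t$, and NumPy is assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
beta, sigma = 0.8, 1.0                      # illustrative values
target_var = sigma**2 / (1 - beta**2)       # stationary variance of the AR(1)
n, burn = 100_000, 1_000
y = np.zeros(n + burn)
for t in range(1, n + burn):
    y[t] = beta * y[t - 1] + sigma * rng.standard_normal()
sample = y[burn:]

print(target_var)      # 2.777...
print(sample.var())    # close to target_var
```

With Gaussian noise the marginal distribution of the simulated path is itself Gaussian with this variance, consistent with the strong-stationarity remark above.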

Answered by kdbanman on January 26, 2021

Stack it, i.e. write it as a VAR(1) for the vector $x_t = [y_t, y_{t-1}]'$:

$$ x_t = \begin{bmatrix} \beta_1 & \beta_2 \\ 1 & 0 \end{bmatrix} x_{t-1} + \begin{bmatrix} \epsilon_t \\ 0 \end{bmatrix} $$

Vectorize the resulting matrix equation for the stationary covariance and exploit $\mathrm{vec}(ABC) = (C' \otimes A)\,\mathrm{vec}(B)$.
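A sketch of this approach, assuming NumPy: the stationary covariance $\Gamma = E[x_t x_t']$ of the VAR(1) solves $\Gamma = A\Gamma A' + \Sigma$, and vectorizing with the stated identity gives $\mathrm{vec}(\Gamma) = (I - A \otimes A)^{-1}\mathrm{vec}(\Sigma)$ (the coefficient values below are my own, for illustration):

```python
import numpy as np

beta1, beta2, sigma2 = 0.5, 0.3, 1.0
A = np.array([[beta1, beta2],
              [1.0,   0.0 ]])        # companion matrix of the VAR(1)
Sigma = np.array([[sigma2, 0.0],
                  [0.0,    0.0]])    # noise enters the first equation only

# Gamma = A Gamma A' + Sigma  =>  vec(Gamma) = (I - A kron A)^{-1} vec(Sigma).
# order="F" stacks columns, matching the vec convention in the identity.
vec_gamma = np.linalg.solve(np.eye(4) - np.kron(A, A),
                            Sigma.flatten(order="F"))
Gamma = vec_gamma.reshape(2, 2, order="F")

print(Gamma[0, 0])   # Var(y_t) = gamma(0)
print(Gamma[0, 1])   # gamma(1)
```

The top-left entry reproduces the closed-form variance from the first answer, and the off-diagonal entry reproduces $\gamma(1) = \beta_1\gamma(0)/(1-\beta_2)$.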

Answered by Harald Uhlig on January 26, 2021
