MATH38032 Time Series Analysis
Examples sheet 2
1. Consider a stationary time series {xt} with mean 0 and suppose we want to predict xt using the infinite past xt−1, xt−2, . . . .
(a) How do we construct a linear predictor of xt?
(b) In what sense is a linear predictor of xt the best?
(c) What is a necessary and sufficient condition for a linear predictor of xt to be the best?
(d) Do the coefficients of the best linear predictor of xt change with t?
(e) What if only a finite past xt−1, . . . , xt−k is available? Answer (a)-(d) above.
(f) What if the time series has a non-zero mean µ? Answer (a) above.
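As a concrete illustration for part (e): with a finite past xt−1, . . . , xt−k, the coefficients a1, . . . , ak of the best linear predictor solve the system Γk a = γk, where Γk = [γ(i − j)] and γk = (γ(1), . . . , γ(k))'. A minimal numerical sketch (the function name and the choice of an AR(1) example with φ = 0.6 are my own, not part of the sheet):

```python
import numpy as np

def best_linear_predictor_coeffs(gamma):
    """Solve Gamma_k a = gamma_k for the finite-past predictor
    a_1 x_{t-1} + ... + a_k x_{t-k}.
    gamma: array holding gamma(0), ..., gamma(k)."""
    k = len(gamma) - 1
    Gamma = np.array([[gamma[abs(i - j)] for j in range(k)] for i in range(k)])
    return np.linalg.solve(Gamma, gamma[1:])

# AR(1) with phi = 0.6, sigma^2 = 1: gamma(h) = phi^h / (1 - phi^2), h >= 0
phi = 0.6
gamma = np.array([phi**h for h in range(6)]) / (1 - phi**2)
a = best_linear_predictor_coeffs(gamma)
print(np.round(a, 6))  # first coefficient equals phi, the rest are 0
```

For an AR(1) series the predictor uses only the most recent value, so the solution is (φ, 0, . . . , 0) regardless of k, which the code confirms.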
2. (a) What is the partial autocorrelation at lag k ≥ 1?
(b) How is the partial autocorrelation at lag k > 1 different from the autocorrelation r(k)?
(c) How do we write the partial autocorrelation φkk as a correlation? Explain the notation.
(d) Give an example of a stationary time series with partial autocorrelation φkk = 0 for all k ≥ 1.
(e) What are the Yule-Walker equations for φki, i = 1, . . . , k?
(f) How does the Durbin-Levinson algorithm solve the Yule-Walker equations?
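The recursion behind part (f) can be sketched in code. This is a hedged illustration (the function name is my own), using the standard Durbin-Levinson recursion φkk = (ρ(k) − Σj φk−1,j ρ(k−j)) / (1 − Σj φk−1,j ρ(j)) with the update φk,j = φk−1,j − φkk φk−1,k−j:

```python
import numpy as np

def durbin_levinson_pacf(rho, kmax):
    """Compute phi_kk for k = 1..kmax from autocorrelations.
    rho[h] must hold rho(h) for h = 0..kmax, with rho[0] = 1."""
    pacf = []
    phi_prev = np.zeros(0)                        # phi_{k-1,j}, j = 1..k-1
    for k in range(1, kmax + 1):
        if k == 1:
            phikk = rho[1]
        else:
            num = rho[k] - phi_prev @ rho[k-1:0:-1]   # rho(k-j), j = 1..k-1
            den = 1.0 - phi_prev @ rho[1:k]           # rho(j),   j = 1..k-1
            phikk = num / den
        phi_prev = np.append(phi_prev - phikk * phi_prev[::-1], phikk)
        pacf.append(phikk)
    return np.array(pacf)

# AR(1) with phi = 0.6: rho(h) = 0.6**h, so the PACF is (0.6, 0, 0, ...)
rho = 0.6 ** np.arange(6)
print(np.round(durbin_levinson_pacf(rho, 5), 6))
```

Each step reuses the lag-(k−1) coefficients, so no k×k system is ever solved directly; that is the point of the algorithm.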
3. Let
x̂t = ∑_{j=1}^∞ aj xt−j
be the best linear predictor of xt using the infinite past xt−1, xt−2, . . . . Show that
x̃t = ∑_{j=1}^∞ aj xt+j
is the best linear ‘predictor’ of xt using the future values xt+1, xt+2, . . . . [Hint: use the orthogonality condition in Theorem 3.]
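A quick numerical illustration of the symmetry asserted in question 3 (the simulation setup, an AR(1) with φ = 0.6, and the truncation to k = 2 lags are my own choices): in a long simulated stationary series, regressing xt on a few past values and on the same number of future values gives approximately the same coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
n, phi, k = 200_000, 0.6, 2

# simulate an AR(1): x_t = phi * x_{t-1} + eps_t
eps = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

y = x[k : n - k]
past = np.column_stack([x[k - j : n - k - j] for j in range(1, k + 1)])
future = np.column_stack([x[k + j : n - k + j] for j in range(1, k + 1)])

a_fwd = np.linalg.lstsq(past, y, rcond=None)[0]    # regress on the past
a_bwd = np.linalg.lstsq(future, y, rcond=None)[0]  # regress on the future
print(np.round(a_fwd, 3), np.round(a_bwd, 3))      # both approx (0.6, 0)
```

The population-level reason is that the covariances Cov(xt, xt−j) and Cov(xt, xt+j) are both γ(j) for a stationary series, so the forward and backward projection systems share the same Toeplitz matrix and right-hand side.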
4. Find the partial autocorrelation function of the time series given by
(a) xt = φxt−1 + εt, t ∈ Z,
(b) xt = εt + θεt−1, t ∈ Z,
where |φ| < 1 and {εt} ~ WN(0, σ²). [Hint: solve the Yule-Walker equations for k = 1 and k = 2 using the autocovariances from sheet 1. Then for (a), after finding x̂t(k) = x̂t(1) when k = 2, use the orthogonality condition to show that this also holds for k > 2. Part (b) is harder; you can stop at k = 3. Answers: (a) φ11 = φ and φkk = 0 for k > 1. (b) φkk = −(−θ)^k (1 − θ²)/(1 − θ^(2k+2)), k ≥ 1.]
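The stated answer for (b) can be checked numerically by solving the Yule-Walker system directly with the MA(1) autocovariances γ(0) = (1 + θ²)σ², γ(1) = θσ², and γ(h) = 0 for h ≥ 2 (the value θ = 0.5 is an arbitrary choice for the check, not part of the question):

```python
import numpy as np

theta, sigma2 = 0.5, 1.0

def gamma(h):
    """Autocovariance of the MA(1) x_t = eps_t + theta * eps_{t-1}."""
    if h == 0:
        return (1 + theta**2) * sigma2
    return theta * sigma2 if h == 1 else 0.0

for k in range(1, 6):
    Gamma = np.array([[gamma(abs(i - j)) for j in range(k)] for i in range(k)])
    g = np.array([gamma(j) for j in range(1, k + 1)])
    phikk = np.linalg.solve(Gamma, g)[-1]        # last Yule-Walker coefficient
    closed = -(-theta)**k * (1 - theta**2) / (1 - theta**(2*k + 2))
    print(k, round(phikk, 6), round(closed, 6))  # the two columns agree
```

The last component of the solved coefficient vector is φkk by definition, and it matches the closed form −(−θ)^k (1 − θ²)/(1 − θ^(2k+2)) for every k tested.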