

THE UNIVERSITY OF HONG KONG

DEPARTMENT OF STATISTICS AND ACTUARIAL SCIENCE

STAT6011/7611/8305 COMPUTATIONAL STATISTICS

(2019 Fall)

Assignment 3, due on November 28

All numerical computation MUST be conducted in Python; attach the Python code.

1. Consider an integral.

(a) Plot the above integrand function over the range (−2, 5).

(b) Use the Gauss-Legendre, Gauss-Chebyshev (first kind), Gauss-Chebyshev (second kind), and Gauss-Jacobi quadratures, each with 10 nodes and weights, to approximate the integral. Present the nodes, weights, and the approximation results.
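Since the integrand itself is not reproduced above, the sketch below uses a hypothetical stand-in g; replace it with the actual integrand. It shows how the 10 nodes and weights for the four rules can be obtained from scipy.special, maps (−2, 5) onto [−1, 1], and divides out each rule's weight function so the weighted sum approximates the plain integral:

```python
import numpy as np
from scipy.special import roots_legendre, roots_chebyt, roots_chebyu, roots_jacobi

# Hypothetical stand-in for the assignment's integrand.
g = lambda x: np.exp(-0.5 * x ** 2)
a, b = -2.0, 5.0                       # integration range from part (a)

n = 10
rules = {
    "Gauss-Legendre":   roots_legendre(n),         # weight function 1
    "Gauss-Chebyshev1": roots_chebyt(n),           # weight (1-t^2)^(-1/2)
    "Gauss-Chebyshev2": roots_chebyu(n),           # weight (1-t^2)^(1/2)
    "Gauss-Jacobi":     roots_jacobi(n, 1.0, 1.0), # weight (1-t)^1 (1+t)^1
}
wfun = {
    "Gauss-Legendre":   lambda t: np.ones_like(t),
    "Gauss-Chebyshev1": lambda t: 1.0 / np.sqrt(1.0 - t ** 2),
    "Gauss-Chebyshev2": lambda t: np.sqrt(1.0 - t ** 2),
    "Gauss-Jacobi":     lambda t: (1.0 - t) * (1.0 + t),
}

for name, (t, w) in rules.items():
    # Map [a, b] onto [-1, 1]: x = (b-a)/2 * t + (a+b)/2, dx = (b-a)/2 dt.
    x = 0.5 * (b - a) * t + 0.5 * (a + b)
    # Divide out the rule's weight function so sum(w * g/wfun) ~ integral of g.
    approx = 0.5 * (b - a) * np.sum(w * g(x) / wfun[name](t))
    print(f"{name}: nodes={t}\n  weights={w}\n  approx={approx:.6f}")
```

Note that only Gauss-Legendre targets the plain integrand directly; for the other rules, dividing by the weight function is an approximation whose accuracy depends on how well g/wfun behaves.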

2. Use the dataset q2.csv. The observed data y = (y_1, ..., y_n) are from a mixture of normal distributions, i.e., Y_i ~ Σ_{j=1}^k ω_j f_j(y), i = 1, ..., n = 1000, where each f_j is a normal density function N(μ_j, σ_j²), ω_j is the mixing probability, and Σ_{j=1}^k ω_j = 1. Consider the complete data (y_i, u_i), where the missing data u_i indicates which distribution y_i is from.

(a) Write out the complete-data likelihood.

(b) Derive the marginal distribution of yi.
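As a notation check (not a full derivation), with indicators z_ij = 1{u_i = j}, parts (a) and (b) lead to the standard mixture forms:

```latex
L_c(\omega, \mu, \sigma^2 \mid y, u)
  = \prod_{i=1}^{n} \prod_{j=1}^{k} \left[\omega_j f_j(y_i)\right]^{z_{ij}},
\qquad
f(y_i) = \sum_{j=1}^{k} \omega_j f_j(y_i),
```

where the marginal on the right is what summing the complete-data likelihood over the possible values of u_i yields.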

(c) Suppose that we know k = 2, σ_1² = σ_2² = 1 (j = 1, 2), and ω_1 = ω_2 = 0.5, but the μ_j's are unknown. Derive the Q(μ | μ^(0)) function in the E step, and derive the estimators {μ_j^(1)} given the previous-step values {μ_j^(0)} in the M step. Use the (sample-based) EM algorithm to estimate μ_j.
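Since q2.csv is not reproduced here, the following is a minimal sketch of the sample-based E and M steps for part (c), run on data simulated from the same 0.5·N(−1,1) + 0.5·N(1,1) model; for the assignment, replace the simulated y with the column read from q2.csv:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated stand-in for q2.csv: equal-weight two-component normal mixture.
n = 1000
z = rng.random(n) < 0.5
y = np.where(z, rng.normal(-1.0, 1.0, n), rng.normal(1.0, 1.0, n))

def phi(y, mu):
    """N(mu, 1) density."""
    return np.exp(-0.5 * (y - mu) ** 2) / np.sqrt(2.0 * np.pi)

mu = np.array([-0.5, 0.5])                     # initial values mu^(0)
for _ in range(200):
    # E step: responsibilities g_i = P(u_i = 2 | y_i, mu); the equal
    # mixing weights 0.5 cancel in the ratio.
    g = phi(y, mu[1]) / (phi(y, mu[0]) + phi(y, mu[1]))
    # M step: responsibility-weighted means.
    mu_new = np.array([np.sum((1.0 - g) * y) / np.sum(1.0 - g),
                       np.sum(g * y) / np.sum(g)])
    converged = np.max(np.abs(mu_new - mu)) < 1e-8
    mu = mu_new
    if converged:
        break
print("mu estimates:", mu)
```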

(d) Repeat (c) using population EM, i.e., taking the expectation of Y_i based on its true mixture density function f(y) = 0.5N(−1, 1) + 0.5N(1, 1), where the expectation can be computed using Monte Carlo (MCEM). Comment on the results in (c) and (d).

(e) Suppose that we know k = 2, σ_1² = σ_2² = 1 (j = 1, 2), but μ_j and ω_j are unknown. If we treat the u_i's as missing data, derive the Q(ω, μ | ω^(0), μ^(0)) function in the E step, and derive the estimators in closed form, i.e., the iterative equations between {ω_j^(1), μ_j^(1)} and {ω_j^(0), μ_j^(0)} in the M step. Use the (sample-based) EM algorithm to estimate μ_j and ω_j.
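A sketch of the extra M-step updates for part (e), again on simulated stand-in data (here with unequal true weights 0.3 and 0.7 so the weight update is visible; replace y with the q2.csv column). The closed forms are the usual ones: ω_j^(1) is the average responsibility and μ_j^(1) the responsibility-weighted mean:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
z = rng.random(n) < 0.7                 # assumed true weights (0.3, 0.7)
y = np.where(z, rng.normal(1.0, 1.0, n), rng.normal(-1.0, 1.0, n))

def phi(y, mu):
    """N(mu, 1) density."""
    return np.exp(-0.5 * (y - mu) ** 2) / np.sqrt(2.0 * np.pi)

mu = np.array([-0.5, 0.5])
w = np.array([0.5, 0.5])
for _ in range(500):
    # E step: responsibilities r[i, j] = P(u_i = j | y_i, w, mu).
    dens = np.column_stack([w[j] * phi(y, mu[j]) for j in range(2)])
    r = dens / dens.sum(axis=1, keepdims=True)
    # M step, closed form: average responsibility and weighted mean.
    w_new = r.mean(axis=0)
    mu_new = (r * y[:, None]).sum(axis=0) / r.sum(axis=0)
    converged = max(np.max(np.abs(mu_new - mu)),
                    np.max(np.abs(w_new - w))) < 1e-8
    w, mu = w_new, mu_new
    if converged:
        break
print("omega:", w, "mu:", mu)
```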

(f) Repeat (e) using population EM, i.e., taking the expectation of Y_i based on its true mixture density function f(y) = ω_1 N(−1, 1) + ω_2 N(1, 1), where the expectation can be computed using Gauss-Hermite quadrature.
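For the Gauss-Hermite expectation in part (f), a normal expectation E[g(Y)] for Y ~ N(μ, σ²) becomes (1/√π) Σ_i w_i g(μ + √2·σ·t_i) after the change of variable y = μ + √2·σ·t, and a mixture expectation is the weight-combined sum of component expectations. A sketch with 30 nodes and the component means fixed at ±1 as in the problem:

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss

t, w = hermgauss(30)  # nodes/weights for the weight function exp(-t^2)

def e_normal(g, mu, sigma=1.0):
    """E[g(Y)] for Y ~ N(mu, sigma^2) via y = mu + sqrt(2)*sigma*t."""
    return np.sum(w * g(mu + np.sqrt(2.0) * sigma * t)) / np.sqrt(np.pi)

def e_mixture(g, w1=0.5, mus=(-1.0, 1.0)):
    """E[g(Y)] under f(y) = w1*N(mus[0], 1) + (1-w1)*N(mus[1], 1)."""
    return w1 * e_normal(g, mus[0]) + (1.0 - w1) * e_normal(g, mus[1])

print(e_mixture(lambda y: y))       # mixture mean
print(e_mixture(lambda y: y ** 2))  # mixture second moment
```

In the population E step, the same e_mixture call replaces the sample averages over y_i that appear in the sample-based Q function.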


Comment on the results in (c)-(f), i.e., whether knowing the true weights helps convergence (e.g., how many iterations are needed for convergence).

3. Use the EM algorithm to estimate the parameters in the random-effects logistic model: for i = 1, ..., I and j = 1, ..., J,

Y_ij = β_0 + β_1 x_ij + u_i + ε_ij,

where the random effect u_i and the error ε_ij are independent normals (as in part (c) below). The unknown parameter vector is θ = (β_0, β_1, σ_u², σ_ε²)^T.

(a) Write out the complete-data likelihood.
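Assuming, as in part (c), that u_i ~ N(0, σ_u²) and ε_ij ~ N(0, σ_ε²), the complete-data log-likelihood for the additive linear form printed above is, up to an additive constant:

```latex
\ell_c(\theta)
  = -\frac{I}{2}\log\sigma_u^2
    - \frac{1}{2\sigma_u^2}\sum_{i=1}^{I} u_i^2
    - \frac{IJ}{2}\log\sigma_\varepsilon^2
    - \frac{1}{2\sigma_\varepsilon^2}
      \sum_{i=1}^{I}\sum_{j=1}^{J}
      \left(y_{ij} - \beta_0 - \beta_1 x_{ij} - u_i\right)^2 .
```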

(b) Derive the Q-function and the M-step of the EM algorithm.

(c) Conduct simulations as follows. Set the parameters β_0 = 0.5, β_1 = 1, σ_u = 1, σ_ε = 1, I = 100, and J = 2. For each dataset, simulate x_ij from Uniform(0, 1), simulate ε_ij and u_i from the corresponding normal distributions, and then obtain y_ij. Use the EM algorithm to obtain the parameter estimates based on each simulated dataset. Repeat the simulation process 1000 times and present the bias (averaged over the 1000 simulations) and standard deviation for θ. Comment on your findings.
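The simulation setup above can be sketched as follows, for a single replicate of the additive linear form of the model (the E step uses the closed-form posterior of u_i, which is normal with variance v = (1/σ_u² + J/σ_ε²)^(-1)); the full study would wrap em_fit in a 1000-replication loop and collect the bias and SD of the estimates:

```python
import numpy as np

rng = np.random.default_rng(2)
I, J = 100, 2
b0, b1, su, se = 0.5, 1.0, 1.0, 1.0          # true parameter values

def simulate():
    """One dataset: x ~ U(0,1), u_i ~ N(0, su^2), eps_ij ~ N(0, se^2)."""
    x = rng.random((I, J))
    u = rng.normal(0.0, su, I)
    eps = rng.normal(0.0, se, (I, J))
    return x, b0 + b1 * x + u[:, None] + eps

def em_fit(x, y, iters=200):
    """EM-style sketch for the random-intercept linear model."""
    beta = np.zeros(2)
    s2u, s2e = 1.0, 1.0
    for _ in range(iters):
        resid = y - beta[0] - beta[1] * x            # y_ij - b0 - b1*x_ij
        # E step: u_i | y_i ~ N(m_i, v).
        v = 1.0 / (1.0 / s2u + J / s2e)
        m = v * resid.sum(axis=1) / s2e
        # M step: OLS of (y - E[u_i]) on x, then variance updates
        # using the E-step posterior moments.
        X = np.column_stack([np.ones(I * J), x.ravel()])
        beta = np.linalg.lstsq(X, (y - m[:, None]).ravel(), rcond=None)[0]
        s2u = np.mean(m ** 2 + v)
        resid = y - beta[0] - beta[1] * x
        s2e = np.mean((resid - m[:, None]) ** 2 + v)
    return beta, s2u, s2e

beta, s2u, s2e = em_fit(*simulate())
print("beta:", beta, "sigma_u^2:", s2u, "sigma_e^2:", s2e)
```

With I = 100 and J = 2, single-replicate estimates are noisy; the bias and SD requested in (c) only stabilize across the 1000 repetitions.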


