
MAST20004 Probability

Assignment Three [Due 4:00 pm Monday 07/10]

There are 5 problems in total, of which 3 randomly chosen ones will be marked. You are expected to submit answers to all questions; otherwise a mark penalty will apply. Calculations and reasoning must be given in order to obtain full credit.

Problem 1. Choose a number X at random from the set {1, 2, 3, 4, 5}, then choose a number Y at random from the subset {1, · · · , X}.

(i) Find the joint pmf of X and Y .

(ii) Find the conditional pmf of X given Y = 3.

(iii) Are X and Y independent?

(iv) Compute the expected value of XY.
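
As a sanity check on parts (i) and (iv), the two-stage selection can be enumerated exactly (a sketch, using only the setup stated above: P(X = x) = 1/5 and, given X = x, Y uniform on {1, . . . , x}):

```python
from fractions import Fraction

# Joint pmf: P(X = x, Y = y) = (1/5) * (1/x) for 1 <= y <= x <= 5.
pmf = {(x, y): Fraction(1, 5 * x) for x in range(1, 6) for y in range(1, x + 1)}

total = sum(pmf.values())                               # should sum to 1
e_xy = sum(Fraction(x * y) * p for (x, y), p in pmf.items())

print(total, e_xy)  # 1 7
```

Exact rational arithmetic via `fractions` avoids any floating-point doubt in such a small enumeration.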

Problem 2. Let (X, Y ) be a bivariate random variable whose joint pdf is given by fX,Y (x, y) = Cyx^3 for 0 < x < 1 and 0 < y < x^2, and 0 otherwise.

(i) Compute the constant C, and the marginal pdf’s of X and Y respectively.

(ii) Compute fY |X(y|x) and deduce that E[Y |X] = (2/3)X^2.

(iii) Compute fX|Y (x|y) and E[X|Y ].
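
The normalising constant in part (i) can be cross-checked numerically; the sketch below integrates y·x^3 over the region 0 < x < 1, 0 < y < x^2 with a midpoint rule in x (the inner y-integral is done exactly) and inverts the result. This is a verification aid only, not a substitute for the calculation:

```python
# Estimate C so that C * y * x**3 integrates to 1 over the region
# 0 < x < 1, 0 < y < x**2.
n = 4000
integral = 0.0
for k in range(n):
    x = (k + 0.5) / n        # midpoint of the k-th subinterval of (0, 1)
    inner = x**4 / 2         # exact integral of y dy over (0, x**2)
    integral += x**3 * inner / n
C = 1.0 / integral
print(round(C, 3))           # numerically estimated constant
```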

Problem 3. Let (X, Y ) be a general bivariate normal random variable.

(i) If Cov(X, Y ) = 0, show that X, Y are independent.

(ii) If Var[X] = Var[Y ], show that X + Y and X − Y are independent.

(iii) Assume that µX = 0, σX^2 = 1, µY = −1, σY^2 = 4, ρ = 1/2. Compute P(X + Y > 0) and P(X + Y > 0 | 2X − Y = 0).
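
For the first probability in (iii), a Monte Carlo cross-check is possible: X + Y is itself normal, so the simulated frequency can be compared with a closed form via the error function (a sketch; the variance used in the comment follows from the stated parameters):

```python
import math
import random

# Bivariate normal parameters from the problem statement.
random.seed(0)
rho, sx, sy, my = 0.5, 1.0, 2.0, -1.0
trials = 200_000
hits = 0
for _ in range(trials):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x = sx * z1                                          # mu_X = 0
    y = my + sy * (rho * z1 + math.sqrt(1 - rho**2) * z2)
    if x + y > 0:
        hits += 1
mc = hits / trials

# X + Y is normal with mean -1 and variance 1 + 4 + 2*rho*sx*sy = 7,
# so P(X + Y > 0) = 1 - Phi(1/sqrt(7)) with Phi the N(0,1) cdf.
exact = 0.5 * (1 - math.erf((1 / math.sqrt(7)) / math.sqrt(2)))
print(mc, exact)
```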

Problem 4. Suppose that X, Y are independent random variables, both uniformly distributed over [0, 1].

(i) Find the pdf of R = |X − Y | and E[R].

(ii) Find the joint pdf of U = X + Y and V = X/Y .

(iii) Show that the bivariate random variable (U, V ) defined by U = √(−2 log X) cos(2πY ), V = √(−2 log X) sin(2πY ) is a standard bivariate normal random variable with parameter ρ = 0.

[Remark: By taking the U-component, this gives a simple way of generating a standard normal random variable. To some extent, this method is better than the one we discussed in lecture using U = Φ^−1(X), where Φ is the cdf of N(0, 1), since Φ^−1 is very hard to obtain. This is a nice illustration of the philosophy that, when we are working on a one-dimensional problem, it can sometimes be substantially easier to look at the problem from a multi-dimensional perspective. Another example of this kind is the computation of ∫_−∞^∞ e^(−x^2/2) dx we did in the lecture. This idea is further developed and appreciated in the subject of complex analysis.]
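
The transformation in part (iii) is the Box–Muller map, and its claimed properties can be checked empirically (a simulation sketch; the sample mean, variance, and covariance should come out near 0, 1, and 0 respectively):

```python
import math
import random

random.seed(1)

def box_muller():
    # Map two independent uniforms on (0, 1) to a pair (U, V).
    x, y = random.random(), random.random()
    r = math.sqrt(-2 * math.log(x))
    return r * math.cos(2 * math.pi * y), r * math.sin(2 * math.pi * y)

samples = [box_muller() for _ in range(100_000)]
us = [u for u, _ in samples]
mean_u = sum(us) / len(us)
var_u = sum(u * u for u in us) / len(us) - mean_u**2
cov_uv = sum(u * v for u, v in samples) / len(samples)
print(mean_u, var_u, cov_uv)  # near 0, 1, 0
```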

(iv) By using a given pair of independent uniform random variables (X, Y ) over [0, 1], find a way to construct a bivariate random variable (Z, W) satisfying E[Z] = E[W] = 0, Var[Z] = Var[W] = 5, Cov(Z, W) = 4.

[Remark: Essentially, this method allows us to generate a general bivariate normal random variable from a pair of independent uniform random variables over [0, 1].]
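
One standard way to approach part (iv) is to apply a Cholesky-style linear map to two independent standard normals obtained as in part (iii). The sketch below takes that route; the coefficients are derived numerically from the target covariance matrix, and the whole thing is an illustration rather than the required derivation:

```python
import math
import random

random.seed(2)
# Target second moments: Var[Z] = Var[W] = 5, Cov(Z, W) = 4 (means are 0).
s11, s12, s22 = 5.0, 4.0, 5.0
l11 = math.sqrt(s11)               # 2x2 Cholesky factor of the target matrix
l21 = s12 / l11
l22 = math.sqrt(s22 - l21**2)

def zw():
    # Standard normals via the Box-Muller map of part (iii).
    x, y = random.random(), random.random()
    r = math.sqrt(-2 * math.log(x))
    u = r * math.cos(2 * math.pi * y)
    v = r * math.sin(2 * math.pi * y)
    return l11 * u, l21 * u + l22 * v

samples = [zw() for _ in range(200_000)]
m = len(samples)
var_z = sum(z * z for z, _ in samples) / m
var_w = sum(w * w for _, w in samples) / m
cov_zw = sum(z * w for z, w in samples) / m
print(var_z, var_w, cov_zw)  # near 5, 5, 4
```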

Problem 5. [Hard] For this problem, you might need to use the following so-called inclusion-exclusion principle without proof. Let A1, A2, · · · , An be n events. Then

P(A1 ∪ · · · ∪ An) = Σ_i P(Ai) − Σ_{i<j} P(Ai ∩ Aj) + Σ_{i<j<k} P(Ai ∩ Aj ∩ Ak) − · · · + (−1)^(n−1) P(A1 ∩ · · · ∩ An).

A little girl is painting on a blank paper. Suppose that there is a total of N available colors. Each time she selects one color at random and paints with it on the paper. It is possible that she picks a color that she has already used before. Different selections are assumed to be independent.

(1) Suppose that the little girl makes n selections.

(1-i) If red and blue are among the available colors, let R (respectively, B) be the event that her painting contains the color red (respectively, blue). What are P(R) and P(R ∪ B)?

(1-ii) Suppose that she is about to make the (n + 1)-th selection. What is the probability that she will obtain a new color in this selection?

(1-iii) Suppose that n = N. For 1 ≤ i ≤ N, let Ei be the event that her painting does not contain color i. By using the inclusion-exclusion principle, compute P(E1 ∪ · · · ∪ EN).

(1-iv) Let D be the number of different colors she obtains among her n selections. By writing N − D as a sum of Bernoulli random variables, compute E[D] and Var[D].
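
The Bernoulli decomposition in (1-iv) suggests a closed form for E[D]: each color is missing with probability (1 − 1/N)^n, so E[D] = N(1 − (1 − 1/N)^n). A short simulation can cross-check this (a sketch with arbitrary illustrative values N = 10, n = 15):

```python
import random

random.seed(3)
N, n = 10, 15              # illustrative values, not from the problem
trials = 100_000
total = 0
for _ in range(trials):
    # D = number of distinct colors among n independent uniform picks.
    seen = {random.randrange(N) for _ in range(n)}
    total += len(seen)
mc = total / trials
closed = N * (1 - (1 - 1 / N) ** n)
print(mc, closed)
```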

(2) Let S be the number of selections needed until every available color has been selected by the little girl.

(2-i) Find the pmf of S.

(2-ii) For 0 ≤ i ≤ N − 1, let Xi be the number of extra selections needed, after she has obtained i different colors, until she obtains a new color. By understanding the distributions of these Xi's and their relationship with S, show that

E[S] = N(1 + 1/2 + · · · + 1/N).

Since the harmonic series H(N) = 1 + 1/2 + · · · + 1/N has logarithmic growth (i.e. H(N)/log N → 1 as N → ∞), this result shows that when N is large, on average the little girl needs to make about N log N selections before obtaining all different colors.
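
This is the classical coupon-collector mean, and it can be checked by simulation (a sketch with an arbitrary N = 8; H(N) is the harmonic number 1 + 1/2 + · · · + 1/N):

```python
import random

random.seed(4)
N = 8                      # illustrative number of colors
trials = 50_000
total = 0
for _ in range(trials):
    # Count selections until all N colors have appeared.
    seen = set()
    steps = 0
    while len(seen) < N:
        seen.add(random.randrange(N))
        steps += 1
    total += steps
mc = total / trials
harmonic = sum(1 / k for k in range(1, N + 1))
print(mc, N * harmonic)
```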

(3) Let T be the number of selections until the little girl picks a color that she has obtained before.

(3-i) Find the pmf and expected value of T.

(3-ii) Consider E[T] as a function of N. What is the growth rate of E[T] as N → ∞? You don't need to solve this problem mathematically; simply make an educated guess.
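
An educated guess for (3-ii) can be informed by simulation: the sketch below estimates E[T]/√N for growing N, on the (birthday-problem) hunch that the ratio stabilises near √(π/2):

```python
import math
import random

random.seed(5)

def mean_T(N, trials=20_000):
    # Average number of selections until the first repeated color.
    total = 0
    for _ in range(trials):
        seen = set()
        while True:
            c = random.randrange(N)
            total += 1
            if c in seen:
                break
            seen.add(c)
    return total / trials

ratios = []
for N in (100, 400, 1600):
    r = mean_T(N) / math.sqrt(N)
    ratios.append(r)
    print(N, r)
```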
