Homework 12, Assigned 10/19/17, due Monday 10/30, 6pm
=====================================================

14 points

EM Algorithm for Estimating a Mixture Distribution

Data values X_1,...,X_n are independent random variables which have
probability density functions of the form

     alph * dnorm(x, mu, sig1) + (1-alph) * dnorm(x, mu, sig2)

where alph is in (0,1), mu is in (-Inf,Inf), and 0 < sig1 < sig2 are
unknown parameters. This means that each X_i can be viewed as depending
on an unobserved "group-label" variable G_i = 0, 1 with P(G_i=1) = alph,
X_i ~ N(mu, sig1) given G_i=1, and X_i ~ N(mu, sig2) given G_i=0.

Generate a dataset of size n=500 under this model after calling
set.seed(7799), with alph=.3, mu=10, sig1=2, and sig2=3.

Find the maximum likelihood estimates for (alph, mu, sig1, sig2) in two
ways: (a) with a straightforward likelihood maximization, and (b) using
the EM algorithm. In both cases use (.5, 15, 3, 5) as starting values.

In part (b), you can express the EM iterative step (including the
conditional expectations and the maximization) as a function of the
parameters and data, in the following form: first find the new alph
parameter in closed form, and then sig1 and sig2 as closed-form
functions of mu, after which maximization in mu leads to a cubic
polynomial in mu that can be solved uniquely via uniroot.

Compare the number of iterations it takes you to converge in parts (a)
and (b).
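To illustrate the two approaches, here is a hedged sketch in R (not a full solution). It simulates the data, runs a direct likelihood maximization with optim(), and runs an EM loop. Note one simplification: the M-step for mu below uses a precision-weighted-mean update given the freshly updated sig1 and sig2 (an ECM-style conditional-maximization shortcut), rather than the exact cubic-in-mu root-finding via uniroot described in the assignment; the tolerance 1e-8 and iteration cap are arbitrary choices, not part of the assignment.

```r
## Simulate the mixture data as specified in the assignment.
set.seed(7799)
n <- 500; alph <- 0.3; mu <- 10; sig1 <- 2; sig2 <- 3
G <- rbinom(n, 1, alph)                      # latent group labels
X <- rnorm(n, mu, ifelse(G == 1, sig1, sig2))

## (a) Direct maximization: minimize the negative log-likelihood.
negll <- function(th) {
  a <- th[1]; m <- th[2]; s1 <- th[3]; s2 <- th[4]
  -sum(log(a * dnorm(X, m, s1) + (1 - a) * dnorm(X, m, s2)))
}
fit <- optim(c(0.5, 15, 3, 5), negll, method = "L-BFGS-B",
             lower = c(1e-4, -Inf, 1e-4, 1e-4),
             upper = c(1 - 1e-4, Inf, Inf, Inf))

## (b) EM: E-step computes w_i = P(G_i = 1 | X_i); M-step updates alph
## in closed form, then sig1 and sig2 given mu, then mu given the sigmas
## (ECM-style shortcut; the exact M-step would solve a cubic in mu).
th <- c(0.5, 15, 3, 5); iter <- 0
repeat {
  a <- th[1]; m <- th[2]; s1 <- th[3]; s2 <- th[4]
  num <- a * dnorm(X, m, s1)
  w   <- num / (num + (1 - a) * dnorm(X, m, s2))          # E-step
  a.new  <- mean(w)                                       # closed-form alph
  s1.new <- sqrt(sum(w * (X - m)^2) / sum(w))             # sigmas given mu
  s2.new <- sqrt(sum((1 - w) * (X - m)^2) / sum(1 - w))
  prec   <- w / s1.new^2 + (1 - w) / s2.new^2
  m.new  <- sum(X * prec) / sum(prec)                     # weighted-mean mu
  th.new <- c(a.new, m.new, s1.new, s2.new); iter <- iter + 1
  if (max(abs(th.new - th)) < 1e-8 || iter > 5000) { th <- th.new; break }
  th <- th.new
}

fit$par; fit$counts   # part (a): estimates and optim evaluation counts
th; iter              # part (b): estimates and number of EM iterations
```

Typically the EM loop needs many more (cheap) iterations than the quasi-Newton optimizer needs function evaluations, which is the comparison the last part of the assignment asks for.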