Homework 15, Due Friday March 14, 2008.
=======================================

In this exercise, you are to code a gradient search with the added wrinkle, discussed in class, that the distance you go in the gradient direction -- as an iterative step in maximizing a function -- is itself found by a univariate maximization.

Your task is to write an R function with the inputs:

  ffcn   : a scalar function of a vector parameter
  fprim  : the analytical gradient of ffcn with respect to its vector
           argument (i.e., a p-dimensional vector value when the
           argument of ffcn is p-dimensional)
  xzer   : a starting value for x
  step   : the maximum step length
  grdtol : the (small) real number to be used as a stopping criterion
           in your iteration
  iter   : the maximum number of allowed iterations

Your R function should iterate for up to iter steps, stopping earlier if fprim(x_k) ever has norm as small as grdtol. The outputs should be: your optimized (or final) x_k value, its fprim value, and (optionally) a numerical approximation to the Hessian matrix of ffcn at x_k; also include in the output list your starting value and the number of iteration steps actually used.

Show that your code works to maximize a nontrivial function with a 2-dimensional argument, and another with a 5-dimensional argument. In both cases the function could (but need not) be defined as a log-likelihood from a statistical regression problem; if you take that route, use an error density such as the logistic, t, or log of a Weibull r.v., not the normal.
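One way the required function might be sketched is below. The assignment prescribes only the argument names and the stopping rule; the use of R's built-in optimize() for the univariate line search, the central-difference Hessian, and the function name gradsearch are my own illustrative choices, not part of the assignment.

```r
# Sketch of a gradient ascent whose step length along the gradient
# direction is itself chosen by a univariate maximization (optimize()).
gradsearch <- function(ffcn, fprim, xzer, step, grdtol, iter) {
  x <- xzer
  k <- 0
  while (k < iter) {
    g <- fprim(x)
    if (sqrt(sum(g^2)) <= grdtol) break   # gradient norm small: stop early
    # univariate maximization of ffcn along the gradient direction,
    # over step lengths t in [0, step]
    t.opt <- optimize(function(t) ffcn(x + t * g),
                      interval = c(0, step), maximum = TRUE)$maximum
    x <- x + t.opt * g
    k <- k + 1
  }
  # crude central-difference approximation to the Hessian of ffcn at x,
  # obtained by differencing the analytical gradient fprim
  p <- length(x)
  h <- 1e-4
  H <- matrix(0, p, p)
  for (j in 1:p) {
    e <- rep(0, p); e[j] <- h
    H[, j] <- (fprim(x + e) - fprim(x - e)) / (2 * h)
  }
  list(x = x, gradient = fprim(x), Hessian = H,
       start = xzer, steps.used = k)
}
```

As a quick sanity check (a trivial concave quadratic, not the nontrivial examples the assignment asks for), maximizing f(x) = -sum((x - 1)^2) from xzer = c(0, 0) should drive x to (1, 1) in a handful of line-search steps:

```r
out <- gradsearch(function(x) -sum((x - 1)^2),
                  function(x) -2 * (x - 1),
                  xzer = c(0, 0), step = 1, grdtol = 1e-8, iter = 100)
```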