
Math 234 Section 1
Homework 6 – Spring 2020
Asymptotics
Due: Mar 26, 2020 11:59 pm.
Instructions: Please write neat solutions for the problems below. Show all your work. If necessary,
explain your solution in words. If you only write the answer with no work, you may not be given
any credit.
Please submit your entire homework as a single PDF file; use PDF merging tools as necessary. For Problems 1-3, you can scan your responses using a scanner or a phone scanning app. You are not allowed to simply take a photo of your homework, since photos typically suffer from poor lighting. The grader reserves the right not to grade your submission if it is unclear.
Problems
1. Suppose that X1, . . . , Xn are iid with common density function
f(x | θ) = (1/2)(1 + θx), −1 ≤ x ≤ 1,
for −1 ≤ θ ≤ 1. Find a consistent estimator of θ. Justify that the estimator is consistent. Hint: Find the
MOM estimator.
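A minimal sketch of the first-moment computation behind the hint (not a full solution; all symbols are as defined in the problem):
E_θ[X1] = ∫_{−1}^{1} x · (1/2)(1 + θx) dx = θ/3.
Equating this moment to the sample mean suggests a candidate estimator, and its consistency can be argued with the Law of Large Numbers together with the continuous mapping theorem.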
2. Let X1, . . . , Xn be iid with mean μ and variance σ². Let g be a function such that g′(μ) = 0 and
whose second derivative g′′ is continuous with g′′(μ) ≠ 0. Recall that X̄_n = (1/n) Σ_{i=1}^n Xi.
(a) Show that √n |g(X̄_n) − g(μ)| → 0 in distribution as n → ∞. Note that this is different from √n (g(X̄_n) − g(μ)).
(b) Show that √n |g(X̄_n) − g(μ)| → 0 in probability as n → ∞.
(c) Show that n[g(X̄_n) − g(μ)] converges in distribution to (1/2) g′′(μ) σ² χ²₁ as n → ∞, where χ²₁ denotes a chi-squared random variable with 1 degree of freedom.
(d) Use the previous result to show that when μ = 0.5, n[X̄_n(1 − X̄_n) − μ(1 − μ)] → −σ² χ²₁ in distribution as n → ∞.
Hint: For part (a), mimic the proof of the Delta method shown in class, i.e., start by using the Mean
Value Theorem. The Central Limit Theorem and Slutsky's Theorem (Thm 5.5, p. 75 of Wasserman) may
be useful. For part (b), look at Thm 5.4(c) of Wasserman (this was also discussed in lecture). For part (c),
construct a 2nd-order Taylor series expansion of g(X̄_n) about μ; you can ignore the higher-order terms
in the Taylor series expansion for this problem. The CLT and Slutsky's theorem may be handy in completing
your proof.
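As a rough sketch of the expansion the hint refers to (written here with a Lagrange-remainder point ξ_n between X̄_n and μ, which is one common way to set it up):
g(X̄_n) = g(μ) + g′(μ)(X̄_n − μ) + (1/2) g′′(ξ_n)(X̄_n − μ)²,
and since g′(μ) = 0,
n [g(X̄_n) − g(μ)] = (1/2) g′′(ξ_n) [√n (X̄_n − μ)]².
The CLT, the continuity of g′′, and Slutsky's theorem can then be combined to identify the limit.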
3. Continuation of Problem 3 of Homework 5. Suppose that X1, . . . , Xn are iid with distribution U(0, θ),
where θ > 0 is an unknown parameter. In class, we showed that θ̂_n = max(X1, . . . , Xn) is the MLE. Perform the
following.
(a) Show that θ̂_n is consistent. You may freely quote the results derived in the previous homework.
(b) Show that the limiting distribution of n(θ − θ̂_n) as n → ∞ is exponential. Hint: you may need to
refresh on L'Hôpital's rule for evaluating limits.
(c) Give an approximate value for Var(θ̂_n) for large n.
(d) For this example, the MLE turns out to be asymptotically exponential instead of asymptotically
normal, the behavior we demonstrated in lecture. Briefly explain why this does not contradict what we established
in class. Hint: Look at the sketch of the proof of asymptotic normality of the MLE. There, we took the derivative
of the log-likelihood function with respect to θ. Can you do that here?
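A possible starting point for part (b), using the distribution of the maximum (this CDF is the standard one for U(0, θ) and should match what was derived in the previous homework):
P(θ̂_n ≤ x) = (x/θ)ⁿ for 0 ≤ x ≤ θ,
so that, for t ≥ 0 and n large enough that θ − t/n > 0,
P(n(θ − θ̂_n) > t) = P(θ̂_n < θ − t/n) = (1 − t/(nθ))ⁿ.
Evaluating the limit of this expression as n → ∞ (e.g., via logarithms and L'Hôpital's rule) identifies the limiting tail probability.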
4. R exercise. The objective of this exercise is to learn how to use R to perform the bootstrap.
(a) Read pp. 187-190 of the Intro to Statistical Learning book. This is identical to the lecture on bootstrapping
but with more details.
(b) Read pp. 194-195 of the book. Only read the section "Estimating the Accuracy of a Statistic of
Interest"; skip the part on the linear regression model.
(c) Work on parts (a), (b), and (c) of Problem 9, p. 201. For part (b), an estimate of the standard error is given
by s/√n, where s is the (observed) sample standard deviation. For part (c), generate at least 1000
bootstrap samples. In addition, plot a histogram of the bootstrap samples you generated; this link
on plotting histograms may be helpful. Include your responses in the single PDF file that you must submit to
NYU Classes.
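As an illustration of the bootstrap mechanics described above, and not a solution to Problem 9, here is a minimal R sketch. The data vector x, the seed, and the exact number of replicates are placeholders chosen for illustration; the actual exercise uses the data set specified in Problem 9.

# Minimal bootstrap sketch in R (placeholder data, for illustration only).
library(boot)

set.seed(1)
x <- rnorm(100, mean = 5, sd = 2)        # placeholder data vector

# Statistic to bootstrap: the mean of the resampled observations.
mean.fn <- function(data, index) {
  mean(data[index])
}

# Plug-in standard error s / sqrt(n), as described in part (b).
se.plugin <- sd(x) / sqrt(length(x))

# Bootstrap with at least 1000 resamples, as required in part (c).
boot.out <- boot(x, mean.fn, R = 1000)
boot.out                                  # prints the bootstrap standard error

# Histogram of the bootstrap replicates.
hist(boot.out$t, breaks = 30,
     main = "Bootstrap replicates of the mean",
     xlab = "Bootstrapped mean")

The standard error reported by boot() can then be compared with the plug-in estimate s/√n.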







