# CS 536: Estimation Problems - Uniform Estimators
Let $X_1, X_2, \ldots, X_n$ be i.i.d. random variables, uniformly distributed on $[0, L]$ (i.e., with density $1/L$ on this interval).
In the posted notes on estimation, it is shown that the method of moments and maximum likelihood estimators for $L$ are given by
$$\hat{L}_{\mathrm{MOM}} = 2\bar{X}_n, \qquad \hat{L}_{\mathrm{MLE}} = \max_{i=1,\ldots,n} X_i. \tag{1}$$
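As a concrete illustration, both estimators can be computed from a simulated sample. This is a minimal stdlib-only Python sketch; the function names and the values of `n` and `L` are my own choices for illustration:

```python
import random

def mom_estimate(xs):
    # Method of moments: E[X] = L/2 for Uniform[0, L], so L_hat = 2 * sample mean.
    return 2 * sum(xs) / len(xs)

def mle_estimate(xs):
    # Maximum likelihood: the sample maximum.
    return max(xs)

L, n = 10.0, 100  # illustration values only
xs = [random.uniform(0, L) for _ in range(n)]
print(mom_estimate(xs), mle_estimate(xs))
```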
We want to consider the question of which estimator is better. Recall the definition of the mean squared error of an estimator:
$$\mathrm{MSE}(\hat{L}) = E\left[\left(\hat{L} - L\right)^2\right]. \tag{2}$$
Note: the answers to homework zero may also be useful here.
1) Show that in general, $\mathrm{MSE}(\hat{\theta}) = \mathrm{bias}(\hat{\theta})^2 + \mathrm{var}(\hat{\theta})$, where $\mathrm{var}$ is the variance and the bias is given by
$$\mathrm{bias}(\hat{\theta}) = \theta - E[\hat{\theta}]. \tag{3}$$
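One standard route to this decomposition, sketched here: write $\mu = E[\hat{\theta}]$ and add and subtract $\mu$ inside the square,

```latex
\begin{aligned}
\mathrm{MSE}(\hat{\theta})
  &= E\!\left[(\hat{\theta} - \theta)^2\right]
   = E\!\left[\bigl((\hat{\theta} - \mu) + (\mu - \theta)\bigr)^2\right] \\
  &= E\!\left[(\hat{\theta} - \mu)^2\right]
   + 2(\mu - \theta)\,E\!\left[\hat{\theta} - \mu\right]
   + (\mu - \theta)^2 \\
  &= \mathrm{var}(\hat{\theta}) + \mathrm{bias}(\hat{\theta})^2,
\end{aligned}
```

where the cross term vanishes because $E[\hat{\theta} - \mu] = 0$, and $(\mu - \theta)^2 = (\theta - \mu)^2 = \mathrm{bias}(\hat{\theta})^2$.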
2) Show that $\hat{L}_{\mathrm{MOM}}$ is unbiased, but that $\hat{L}_{\mathrm{MLE}}$ is biased. In general, $\hat{L}_{\mathrm{MLE}}$ consistently underestimates $L$; why?
3) Compute the variance of $\hat{L}_{\mathrm{MOM}}$ and of $\hat{L}_{\mathrm{MLE}}$.
4) Which one is the better estimator, i.e., which one has the smaller mean squared error?
5) Experimentally verify your computations in the following way, taking $n = 100$ and $L = 10$:
   - For $j = 1, \ldots, 1000$: simulate $X_1^j, \ldots, X_n^j$ and compute values for $\hat{L}_{\mathrm{MOM}}^j$ and $\hat{L}_{\mathrm{MLE}}^j$.
   - Estimate the mean squared error for each population of estimator values.
   - How do these estimated MSEs compare to your theoretical MSEs?
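The experiment in part 5 can be sketched in stdlib-only Python. This is a minimal sketch; the function name and the fixed seed are my own, and the $j$-loop is folded into one function:

```python
import random

def simulate_mses(n=100, L=10.0, trials=1000, seed=0):
    # Monte Carlo estimates of the MSE of both estimators.
    rng = random.Random(seed)
    se_mom = se_mle = 0.0
    for _ in range(trials):
        xs = [rng.uniform(0, L) for _ in range(n)]
        se_mom += (2 * sum(xs) / n - L) ** 2  # method-of-moments squared error
        se_mle += (max(xs) - L) ** 2          # maximum-likelihood squared error
    return se_mom / trials, se_mle / trials

mse_mom, mse_mle = simulate_mses()
print(f"estimated MSE: MOM {mse_mom:.4f}, MLE {mse_mle:.4f}")
```

With these parameters the MLE's estimated MSE should come out noticeably smaller, which is the observation part 6 asks you to explain.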
6) You should have shown that $\hat{L}_{\mathrm{MLE}}$, while biased, has a smaller error overall. Why? The mathematical justification for it is above, but is there an explanation for this?
7) Find $P\left(\hat{L}_{\mathrm{MLE}} < L - \epsilon\right)$ as a function of $L$, $\epsilon$, $n$. Estimate how many samples I would need to be sure that my estimate was within $\epsilon$ with probability at least $\delta$.
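Whatever closed form you derive in part 7 can be checked against simulation. The sketch below assumes only the standard fact that, for $t \in [0, L]$, $P(\max_i X_i < t) = (t/L)^n$ by independence; the function names and parameter values are my own:

```python
import random

def tail_prob_exact(L, eps, n):
    # P(max_i X_i < L - eps) = ((L - eps) / L)^n, from the CDF of the maximum.
    return ((L - eps) / L) ** n

def tail_prob_mc(L, eps, n, trials=20000, seed=1):
    # Empirical frequency of the same event over many simulated samples.
    rng = random.Random(seed)
    hits = sum(max(rng.uniform(0, L) for _ in range(n)) < L - eps
               for _ in range(trials))
    return hits / trials

print(tail_prob_exact(10.0, 0.5, 20), tail_prob_mc(10.0, 0.5, 20))
```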
8) Show that
$$\hat{L} = \left(\frac{n+1}{n}\right) \max_{i=1,\ldots,n} X_i \tag{4}$$
is an unbiased estimator, and that it has a smaller MSE still.
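A quick numerical comparison of the raw maximum against its rescaling, as a sketch with my own function name and seed. The factor $\frac{n+1}{n}$ is the one that makes the rescaled maximum unbiased, since $E\left[\max_i X_i\right] = \frac{n}{n+1}L$:

```python
import random

def compare_max_estimators(n=100, L=10.0, trials=2000, seed=2):
    # Monte Carlo MSE of the raw maximum vs. its (n+1)/n rescaling.
    rng = random.Random(seed)
    se_mle = se_unbiased = 0.0
    for _ in range(trials):
        m = max(rng.uniform(0, L) for _ in range(n))
        se_mle += (m - L) ** 2
        se_unbiased += ((n + 1) / n * m - L) ** 2
    return se_mle / trials, se_unbiased / trials

print(compare_max_estimators())
```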