EECS 126: Probability and Random Processes
Problem Set 14 (Optional)

1. Balls in Bins Estimation
We throw n ≥ 1 balls into m ≥ 2 bins. Let X and Y denote the numbers of balls that land
in bins 1 and 2, respectively.
(a) Calculate E[Y | X].
(b) What are L[Y | X] and Q[Y | X] (where Q[Y | X] is the best quadratic estimator of Y
given X)?
Hint: Your justification should be no more than two or three sentences, no calculations
necessary! Think carefully about the meaning of the MMSE.
(c) Unfortunately, your friend is not convinced by your answer to the previous part. Compute
E[X] and E[Y ].
(d) Compute var(X).
(e) Compute cov(X, Y ).
(f) Compute L[Y | X] using the formula. Ensure that your answer is the same as your
answer to part (b).
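Not part of the assignment, but a quick Monte Carlo sketch can sanity-check your closed-form answers for parts (c)-(e) and the conditional expectation from parts (a) and (f). It assumes each ball lands in a uniformly random bin independently; the values n = 10 and m = 4 are arbitrary choices, since the problem keeps n and m symbolic.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 10, 4            # arbitrary example values; the problem keeps n, m symbolic
trials = 200_000

# Throw n balls into m bins uniformly at random, independently in each trial.
# Bins are labeled 0, ..., m-1, so "bin 1" is label 0 and "bin 2" is label 1.
throws = rng.integers(0, m, size=(trials, n))
X = (throws == 0).sum(axis=1)    # balls landing in bin 1
Y = (throws == 1).sum(axis=1)    # balls landing in bin 2

# Empirical values to compare against your answers for parts (c)-(e).
print("E[X]      ≈", X.mean())
print("var(X)    ≈", X.var())
print("cov(X, Y) ≈", np.cov(X, Y)[0, 1])

# Empirical E[Y | X = x] for a few values of x (parts (a) and (f)).
for x in range(4):
    print(f"E[Y | X = {x}] ≈", Y[X == x].mean())
```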
2. MMSE and Conditional Expectation
Let X, Y1, . . . , Yn be square-integrable random variables. The MMSE of X given (Y1, . . . , Yn)
is defined as the function φ(Y1, . . . , Yn) that minimizes the mean square error
E[(X − φ(Y1, . . . , Yn))²].
(a) For this part, assume n = 1. Show that the MMSE is precisely the conditional expectation
E[X|Y ]. Hint: expand the difference as (X − E[X|Y ] + E[X|Y ] − φ(Y )).
(b) Argue that

    E[(X − E[X | Y1, . . . , Yn])²] ≤ E[(X − (1/n) Σ_{i=1}^n E[X | Yi])²].

That is, the MMSE does better than the average of the individual estimates given each Yi.
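To build intuition for part (b), here is a toy numerical example of my own construction (not from the problem set): take X = Y1 + Y2 with Y1, Y2 i.i.d. standard normal. The MMSE given both observations recovers X exactly, while the average of the individual estimates does not.

```python
import numpy as np

rng = np.random.default_rng(4)
trials = 1_000_000

# Toy model: X = Y1 + Y2 with Y1, Y2 i.i.d. N(0, 1).
Y1 = rng.standard_normal(trials)
Y2 = rng.standard_normal(trials)
X = Y1 + Y2

# MMSE given both observations: E[X | Y1, Y2] = Y1 + Y2, so its error is zero.
mmse_err = np.mean((X - (Y1 + Y2)) ** 2)
# Individual estimates: E[X | Yi] = Yi, so their average is (Y1 + Y2) / 2.
avg_err = np.mean((X - (Y1 + Y2) / 2) ** 2)
print("MMSE error:", mmse_err, " average-of-estimates error:", avg_err)
```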
3. Geometric MMSE
Let N be a geometric random variable with parameter 1 − p, and let (Xi)i≥1 be i.i.d. exponential
random variables with parameter λ. Let T = X1 + · · · + XN . Compute the LLSE and MMSE
of N given T.
Hint: Compute the MMSE first.
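A Monte Carlo sketch (not part of the problem) for eyeballing the shape of E[N | T = t] before computing it. The values p = 0.5 and λ = 1 are arbitrary, since the problem keeps both symbolic, and the geometric convention assumed here is P(N = k) = p^(k−1)(1 − p) for k ≥ 1.

```python
import numpy as np

rng = np.random.default_rng(1)
p, lam = 0.5, 1.0       # arbitrary example values; the problem keeps p, λ symbolic
trials = 500_000

# N ~ Geometric(1 - p) on {1, 2, ...}: P(N = k) = p**(k - 1) * (1 - p).
N = rng.geometric(1 - p, size=trials)
# Given N = k, T = X1 + ... + Xk is a Gamma(k, λ) random variable.
T = rng.gamma(shape=N, scale=1 / lam)

# Crude estimate of the MMSE E[N | T = t]: average N over samples with T near t.
for t in (0.5, 1.0, 2.0, 4.0):
    window = np.abs(T - t) < 0.05
    print(f"E[N | T ≈ {t}] ≈", N[window].mean())
```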
4. Gaussian Random Vector MMSE
Let (X, Y) be a Gaussian random vector with mean vector (1, 0) and covariance matrix

    [ 2  1 ]
    [ 1  2 ].

Let

    W =  1, if Y > 0,
         0, if Y = 0,
        −1, if Y < 0

be the sign of Y. Find E[W X | Y]. Is the LLSE the same as the MMSE?
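A Monte Carlo sketch (not part of the problem) to probe whether E[W X | Y = y] looks linear in y, which bears on the LLSE-vs-MMSE question. The window width and sample count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)
mean = np.array([1.0, 0.0])
cov = np.array([[2.0, 1.0], [1.0, 2.0]])
X, Y = rng.multivariate_normal(mean, cov, size=500_000).T
W = np.sign(Y)                     # the sign of Y, as defined in the problem

# Empirical E[W X | Y ≈ y] via a narrow window around each y.
Z = W * X
for y in (-2.0, -1.0, 0.5, 1.0, 2.0):
    window = np.abs(Y - y) < 0.05
    print(f"E[WX | Y ≈ {y}] ≈", Z[window].mean())

# Empirical LLSE coefficients: L[WX | Y] = a Y + b.
a = np.cov(Z, Y)[0, 1] / Y.var()
b = Z.mean() - a * Y.mean()
print("LLSE slope ≈", a, ", intercept ≈", b)
```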
5. Gaussian Sine
Let X, Y, Z be jointly Gaussian random variables with mean vector (0, 2, 0) and covariance matrix

    [ 4  1  0 ]
    [ 1  4  1 ]
    [ 0  1  4 ].

Compute E[(sin X) Y (sin Z)]. Hint: Condition on (X, Z).
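A direct Monte Carlo estimate (not part of the problem) to check your value of E[(sin X) Y (sin Z)] against; the sample size is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(3)
mean = np.array([0.0, 2.0, 0.0])
cov = np.array([[4.0, 1.0, 0.0],
                [1.0, 4.0, 1.0],
                [0.0, 1.0, 4.0]])
X, Y, Z = rng.multivariate_normal(mean, cov, size=2_000_000).T
est = np.mean(np.sin(X) * Y * np.sin(Z))
print("E[(sin X) Y (sin Z)] ≈", est)
```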
6. Error of the Kalman Filter for a Linear Stochastic System
The linear stochastic system

    [ X1,k+1 ]   [ 2  1 ] [ X1,k ]   [  1 ]
    [ X2,k+1 ] = [ 1  2 ] [ X2,k ] + [ −1 ] wk,   k ≥ 0,

starts from an arbitrary (known) initial condition (x1,0, x2,0), and the system noise variables
(wk, k ≥ 0) are i.i.d. normal with mean 0 and variance 1.
The state variables are not directly observable. However, we can observe

    Yk = X1,k + X2,k,   k ≥ 0.
Let X̂k|k denote the minimum mean square error estimator of Xk = (X1,k, X2,k)ᵀ given
(Y0, . . . , Yk). Determine the asymptotic behavior of the covariance matrix of the estimation
error.
Note: This problem needs thought. Note that there is no observation noise, so the assumption
used in the derivation of the Kalman filter equations, that the covariance matrix of the
observation noise is positive definite, is no longer valid.
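One way (my own sketch, not part of the problem) to probe the answer numerically: iterate the Kalman filter covariance recursion, substituting a pseudo-inverse for the inverse of the possibly singular innovation variance. Whether this recursion still yields the true MMSE error covariance here is exactly the kind of point the note above asks you to think about.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # state transition matrix
B = np.array([[1.0], [-1.0]])            # noise input vector
C = np.array([[1.0, 1.0]])               # observation: Y_k = X_{1,k} + X_{2,k}

# The initial condition is known exactly, so the error covariance starts at 0.
Sigma = np.zeros((2, 2))
for k in range(1, 11):
    # Predict: Sigma_{k|k-1} = A Sigma_{k-1|k-1} A^T + B B^T (unit-variance w).
    P = A @ Sigma @ A.T + B @ B.T
    # Update with the noiseless observation. The innovation variance C P C^T
    # can be singular here, so use a pseudo-inverse in place of an inverse.
    S = C @ P @ C.T
    K = P @ C.T @ np.linalg.pinv(S)
    Sigma = (np.eye(2) - K @ C) @ P
    print(k, Sigma.ravel())
```

Watching how Sigma evolves with k should suggest what the asymptotic behavior must be, and why.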