EECS 126: Probability and Random Processes
Problem Set 12

1. Flipping Coins and Hypothesizing
You flip a coin until you see heads. Let

X = 1 if the bias of the coin is q (with q > p),
X = 0 if the bias of the coin is p.

Find a decision rule X̂(Y) that maximizes P[X̂ = 1 | X = 1] subject to P[X̂ = 1 | X = 0] ≤ β
for β ∈ [0, 1]. Remember to calculate the randomization constant γ.
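
Below is a minimal numerical sketch (not a solution write-up) of how the resulting Neyman-Pearson test could be computed, assuming Y is the number of flips until the first heads, so that Y is Geometric(p) under X = 0 and Geometric(q) under X = 1. The helper name and the example numbers are illustrative only.

def np_test_geometric(p, q, beta):
    # Assumption: Y ~ Geometric(p) under X = 0 and Y ~ Geometric(q) under X = 1, with q > p.
    # The likelihood ratio is decreasing in Y, so the test declares X-hat = 1 for small Y.
    assert 0 < p < q < 1 and 0 <= beta < 1   # beta = 1 would trivially always declare 1
    # Find the largest k with P[Y <= k | X = 0] = 1 - (1 - p)^k <= beta.
    k = 0
    while 1 - (1 - p) ** (k + 1) <= beta:
        k += 1
    cdf_k = 1 - (1 - p) ** k            # false-alarm probability spent on {Y <= k}
    pmf_next = (1 - p) ** k * p         # P[Y = k + 1 | X = 0]
    gamma = (beta - cdf_k) / pmf_next   # randomization constant on the boundary point Y = k + 1
    return k, gamma

# Example: p = 0.3, q = 0.6, beta = 0.2 gives k = 0 and gamma = 2/3, i.e. on seeing
# Y = 1 declare X-hat = 1 with probability 2/3, otherwise declare X-hat = 0.
print(np_test_geometric(0.3, 0.6, 0.2))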
2. Gaussian Hypothesis Testing
Consider a hypothesis testing problem in which, if X = 0, you observe a sample of N(µ0, σ²),
and if X = 1, you observe a sample of N(µ1, σ²), where µ0, µ1 ∈ R and σ² > 0. Find the
Neyman-Pearson test with false alarm probability α ∈ (0, 1), that is, P(X̂ = 1 | X = 0) ≤ α.
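
As a quick sketch of where the test ends up (assuming µ1 > µ0; if µ1 < µ0 the inequality flips), the threshold can be computed with the Gaussian quantile from Python's standard library; the function name np_test_gaussian is illustrative only.

from statistics import NormalDist

def np_test_gaussian(mu0, mu1, sigma, alpha):
    # Assumption: mu1 > mu0, so the test declares X-hat = 1 when the observed
    # sample y exceeds tau, with tau chosen so that P[Y > tau | X = 0] = alpha.
    # No randomization is needed because the observation is continuous.
    assert mu1 > mu0 and sigma > 0 and 0 < alpha < 1
    tau = mu0 + sigma * NormalDist().inv_cdf(1 - alpha)   # mu0 + sigma * Q^{-1}(alpha)
    return tau

# Example: mu0 = 0, mu1 = 1, sigma = 1, alpha = 0.05 gives tau ≈ 1.645.
print(np_test_gaussian(0.0, 1.0, 1.0, 0.05))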
3. BSC Hypothesis Testing
Consider a BSC with some error probability ε ∈ [0.1, 0.5). Given n inputs and outputs (xi, yi)
of the BSC, solve a hypothesis testing problem to detect that ε > 0.1 with a probability of false
alarm at most equal to 0.05. Assume that n is very large and use the CLT.
Hint: The null hypothesis is ε = 0.1. The alternate hypothesis is ε > 0.1, which is a composite
hypothesis (this means that under the alternate hypothesis, the probability distribution of
the observation is not completely determined; compare this to a simple hypothesis such as
ε = 0.3, which does completely determine the probability distribution of the observation). The
Neyman-Pearson Lemma we learned in class applies for the case of a simple null hypothesis
and a simple alternate hypothesis, so it does not directly apply here.
To fix this, fix some specific ε0 > 0.1 and use the Neyman-Pearson Lemma to find the optimal
hypothesis test for the hypotheses ε = 0.1 vs. ε = ε0. Then, argue that the optimal decision
rule does not depend on the specific choice of ε0; thus, the decision rule you derive will be
simultaneously optimal for testing ε = 0.1 vs. ε = ε0 for all ε0 > 0.1.
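
A small numerical sketch of the CLT-based threshold described in the hint, where the test statistic is the error count E = #{i : xi ≠ yi}; the helper name bsc_test is illustrative only.

from statistics import NormalDist

def bsc_test(n, alpha=0.05, eps0=0.1):
    # Under the null hypothesis eps = eps0, E ~ Binomial(n, eps0), which the CLT
    # approximates by N(n * eps0, n * eps0 * (1 - eps0)).  Declare eps > eps0 when
    # E exceeds the threshold tau returned here.
    z = NormalDist().inv_cdf(1 - alpha)                    # ≈ 1.645 for alpha = 0.05
    tau = n * eps0 + z * (n * eps0 * (1 - eps0)) ** 0.5
    return tau

# Example: for n = 10000 input/output pairs, declare eps > 0.1 if E exceeds roughly 1049.
print(bsc_test(10_000))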
4. Basic Properties of Jointly Gaussian Random Variables
Let (X1, . . . , Xn) be a collection of jointly Gaussian random variables. Their joint density is
given by (for x ∈ Rⁿ)

f(x) = (1 / √((2π)ⁿ det(C))) exp(−(1/2)(x − µ)ᵀ C⁻¹ (x − µ)),
where µ is the mean vector and C is the covariance matrix.
(a) Show that X1, . . . , Xn are independent if and only if they are pairwise uncorrelated.
(b) Show that any linear combination of these random variables will also be a Gaussian
random variable.
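
As a sanity check on the density formula (not part of the proofs), the snippet below evaluates f(x) numerically and verifies that with a diagonal covariance it factors into a product of one-dimensional Gaussian marginals, which is the "uncorrelated implies independent" direction of part (a); the helper name and the example numbers are illustrative only.

import numpy as np

def jointly_gaussian_density(x, mu, C):
    # Evaluate f(x) = exp(-(1/2)(x - mu)^T C^{-1} (x - mu)) / sqrt((2*pi)^n det(C)).
    # Assumption: C is a symmetric positive definite covariance matrix.
    x, mu = np.asarray(x, float), np.asarray(mu, float)
    n = len(mu)
    d = x - mu
    norm_const = np.sqrt((2 * np.pi) ** n * np.linalg.det(C))
    return np.exp(-0.5 * d @ np.linalg.solve(C, d)) / norm_const

mu, C = [0.0, 0.0], np.diag([2.0, 3.0])   # diagonal C: the coordinates are uncorrelated
x = [1.0, -1.0]
f_joint = jointly_gaussian_density(x, mu, C)
f_product = np.prod([np.exp(-(xi - mi) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)
                     for xi, mi, v in zip(x, mu, np.diag(C))])
print(f_joint, f_product)   # the two values agree, so the joint density factorizes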
5. Independent Gaussians
Let X = (X, Y) be a jointly Gaussian random vector with mean vector [0, 0] and covariance
matrix

    [ 2  1 ]
    [ 1  2 ].

Find a 2 × 2 matrix U such that UX = (X′, Y′), where X′ and Y′ are independent.
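
One way such a U can be found numerically is by rotating into the eigenbasis of the covariance matrix, as in the sketch below (other valid choices of U exist, e.g. with rescaled rows); combined with part (a) of the previous problem, a diagonal covariance for UX gives independence.

import numpy as np

C = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, V = np.linalg.eigh(C)   # columns of V are orthonormal eigenvectors of C
U = V.T                          # one valid choice of U

print(U)
print(U @ C @ U.T)               # covariance of UX: diagonal (here diag(1, 3)),
                                 # so the components of UX are uncorrelated,
                                 # hence independent for a Gaussian vector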