Math 281A Homework 6: Maximum Likelihood Estimation and Kullback-Leibler Divergence, Assignments of Mathematics

Subject: probability and statistical inference
Typology: Assignments
Academic year: 2021/2022
Uploaded on 01/21/2023 by Shashi123

Math 281A Homework 6
Due: Nov 21, in class
1. Let {(X_i, Y_i)}_{i=1}^n be i.i.d. random vectors with Y_i ∈ {0, 1}, and

       P_{α,β}(Y_i = 1 | X_i = x) = 1 / (1 + e^{−α−βx}).

   The distribution of X_i is non-degenerate but unknown. Is there a closed form for the MLE (α̂, β̂)? Derive the asymptotic distribution of (α̂, β̂).
2. Let {X_i}_{i=1}^n be i.i.d. from Poisson(1/θ).

   (a) Calculate the Fisher information I_θ in one observation;

   (b) Derive the MLE θ̂ and show its asymptotic distribution.
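A simulation can sanity-check parts (a)–(b): by invariance of the MLE, θ̂ = 1/X̄ (assuming X̄ maximizes the Poisson likelihood for the mean), and √n(θ̂ − θ) should have variance near 1/I_θ. A hedged sketch, with θ and sample sizes chosen only for illustration:

```python
import numpy as np

# Check: theta_hat = 1/mean(X) for X ~ Poisson(1/theta) concentrates at theta,
# and sqrt(n)*(theta_hat - theta) has variance close to theta**3
# (the delta-method value; verify it matches your derived 1/I_theta).
rng = np.random.default_rng(1)
theta = 2.0
n, reps = 5000, 2000
samples = rng.poisson(1.0 / theta, size=(reps, n))
theta_hat = 1.0 / samples.mean(axis=1)
scaled = np.sqrt(n) * (theta_hat - theta)
print(theta_hat.mean(), scaled.var())   # mean near 2.0, variance near 8.0
```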
3. Let {X_i}_{i=1}^n be i.i.d. from N(θ, θ).

   (a) Calculate the Fisher information I_θ in one observation;

   (b) Derive the MLE θ̂ and show its asymptotic distribution.
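For N(θ, θ), setting the score to zero reduces (after algebra one should verify) to θ² + θ = n⁻¹ΣX_i², whose positive root gives a closed-form MLE. A simulation sketch under that assumption, with θ chosen for illustration:

```python
import numpy as np

# Sketch: candidate MLE from the stationarity equation theta^2 + theta = mean(X^2),
# i.e. theta_hat = (-1 + sqrt(1 + 4*mean(X^2))) / 2 -- verify this as part of (b).
rng = np.random.default_rng(2)
theta = 2.0
n, reps = 5000, 2000
x = rng.normal(theta, np.sqrt(theta), size=(reps, n))
m2 = (x**2).mean(axis=1)
theta_hat = (-1.0 + np.sqrt(1.0 + 4.0 * m2)) / 2.0
scaled = np.sqrt(n) * (theta_hat - theta)
print(theta_hat.mean(), scaled.var())   # variance near 2*theta**2/(2*theta+1)
```

The empirical variance of √n(θ̂ − θ) should match 1/I_θ from part (a).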
4. (a) Calculate the Kullback-Leibler divergence between two exponential distributions with different scale parameters. When is it maximal?

   (b) Calculate the Kullback-Leibler divergence between two normal distributions with different location and scale parameters. When is it maximal?
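The textbook closed forms for these divergences (stated here as standard results, not as the requested derivation) can be checked against direct numerical integration of ∫ p log(p/q):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import expon, norm

def kl_expon(s1, s2):
    # Exponential with scale s (mean s): KL = log(s2/s1) + s1/s2 - 1
    return np.log(s2 / s1) + s1 / s2 - 1.0

def kl_norm(m1, s1, m2, s2):
    # N(m1, s1^2) vs N(m2, s2^2)
    return np.log(s2 / s1) + (s1**2 + (m1 - m2) ** 2) / (2 * s2**2) - 0.5

def kl_numeric(p, q, lo, hi):
    # Direct quadrature of the KL integrand over an effectively full support
    return quad(lambda t: p.pdf(t) * np.log(p.pdf(t) / q.pdf(t)), lo, hi)[0]

print(kl_expon(1.0, 3.0), kl_numeric(expon(scale=1.0), expon(scale=3.0), 0, 50))
print(kl_norm(0, 1, 1, 2), kl_numeric(norm(0, 1), norm(1, 2), -20, 20))
```

Each closed-form value should agree with the quadrature to high precision, which is a useful check on the algebra before answering the "when is it maximal?" part.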
