= Expectations =

== Expected value of the sum ==

If $X_1, \ldots, X_n$ are $n$ independent random variables such that the expectation $E[X_i]$ exists for all $i = 1, \ldots, n$, then prove:

$E[X_1 + X_2 + \cdots + X_n] = E[X_1] + E[X_2] + \cdots + E[X_n]$

Now, if $X_1, \ldots, X_n$ are $n$ (not necessarily independent) random variables such that the expectation $E[X_i]$ exists for all $i = 1, \ldots, n$, then prove:

$E[X_1 + X_2 + \cdots + X_n] = E[X_1] + E[X_2] + \cdots + E[X_n]$

Hmm, the expectation of the sum is the sum of the expectations, even if they are NOT independent!
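
You can see this numerically with a Monte Carlo sanity check (an illustration, not a proof). This is a minimal sketch assuming numpy, with two deliberately dependent variables:

<code python>
import numpy as np

rng = np.random.default_rng(0)

# Two dependent random variables: X2 is a deterministic function of X1.
x1 = rng.normal(loc=1.0, scale=2.0, size=100_000)
x2 = x1 ** 2

# The two estimates agree even though X1 and X2 are highly dependent.
print(np.mean(x1 + x2))           # E[X1 + X2]
print(np.mean(x1) + np.mean(x2))  # E[X1] + E[X2]
</code>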

== Expectations with independence ==

If $X$ and $Y$ are independent random variables, show that:

$E_{p(X,Y)}[f(X)] = E_{p(X)}[f(X)]$

In this case I am using the subscript to indicate the distribution with respect to which the expectation is taken.
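
Again as a numeric illustration rather than a proof, here is a sketch assuming numpy; the particular distributions and the choice $f = \cos$ are arbitrary:

<code python>
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Independent X and Y drawn jointly.
x = rng.exponential(scale=2.0, size=n)
y = rng.normal(size=n)  # independent of x, so averages of f(X) ignore it

f = np.cos

# E_{p(X,Y)}[f(X)]: average f(X) over the joint draws (x_i, y_i).
print(np.mean(f(x)))
# E_{p(X)}[f(X)]: average f(X) over fresh draws of X alone.
print(np.mean(f(rng.exponential(scale=2.0, size=n))))
</code>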

== Messy expectations ==

Prove that

$\exp \lbrace E_{p(x)}[\ln(g(y)^{f(x)})] \rbrace = g(y)^{E_{p(x)}[f(x)]}$

This proof is only about four or five lines, so if you are going on much longer, you have made it harder than it was intended to be.
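
A quick numeric check of the identity (a sketch assuming numpy; the discrete distribution $p(x)$, the function $f$, and the value of $g(y)$ are all arbitrary choices):

<code python>
import numpy as np

xs = np.array([0.0, 1.0, 2.0])   # support of a small discrete p(x)
px = np.array([0.2, 0.5, 0.3])   # probabilities, summing to 1
f = lambda x: x ** 2 + 1.0
g_of_y = 1.7                     # g(y) for some fixed y; any positive value

# Left side: exp(E_{p(x)}[ln(g(y)^{f(x)})]).
lhs = np.exp(np.sum(px * np.log(g_of_y ** f(xs))))
# Right side: g(y)^{E_{p(x)}[f(x)]}.
rhs = g_of_y ** np.sum(px * f(xs))

print(lhs, rhs)  # equal up to floating-point error
</code>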

== Markov's Inequality ==

Prove that

$\Pr(X \ge t) \le E_{p(x)}[X]/t$

assuming that $\Pr(X \ge 0) = 1$ and that $t > 0$.
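
To see the bound in action (illustration only, not the requested proof), here is a sketch assuming numpy, using a nonnegative exponential variable with $E[X] = 1$:

<code python>
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=1.0, size=500_000)  # nonnegative, E[X] = 1

for t in [0.5, 1.0, 2.0, 4.0]:
    empirical = np.mean(x >= t)  # estimate of Pr(X >= t)
    bound = np.mean(x) / t       # Markov bound E[X]/t
    print(t, empirical, bound)   # empirical <= bound in every row
</code>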

== Variance ==

If $X_1, \ldots, X_n$ are $n$ independent random variables such that the variance $Var[X_i]$ exists for all $i = 1, \ldots, n$, then prove:

$Var[X_1 + X_2 + \cdots + X_n] = Var[X_1] + Var[X_2] + \cdots + Var[X_n]$
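
A Monte Carlo sanity check (a sketch assuming numpy); unlike the expectation result above, this one really does need independence:

<code python>
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Three independent variables with known variances.
x1 = rng.normal(0.0, 2.0, size=n)    # Var = 4
x2 = rng.uniform(0.0, 6.0, size=n)   # Var = 6^2 / 12 = 3
x3 = rng.exponential(5.0, size=n)    # Var = 5^2 = 25

print(np.var(x1 + x2 + x3))                  # ~ 32
print(np.var(x1) + np.var(x2) + np.var(x3))  # ~ 32
</code>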

== Calculating Moments ==

$f(x; \alpha, \beta, \gamma) = \begin{cases} c & \mathrm{if}\, \alpha < x \le \beta, \\ 2c & \mathrm{if}\, \beta < x \le \gamma, \\ 0 & \mathrm{otherwise}. \end{cases}$

Calculate $c$ (the normalizing constant), and then calculate the first, second, and third moments about zero, in terms of $\alpha$, $\beta$, and $\gamma$.
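
One way to check your closed-form answers for particular values (a sketch assuming scipy; the choices $\alpha = 1$, $\beta = 2$, $\gamma = 4$ are arbitrary):

<code python>
import numpy as np
from scipy.integrate import quad

alpha, beta, gamma = 1.0, 2.0, 4.0

def shape(x):
    """Unnormalized density: 1 on (alpha, beta], 2 on (beta, gamma]."""
    if alpha < x <= beta:
        return 1.0
    if beta < x <= gamma:
        return 2.0
    return 0.0

# Normalize numerically so the density integrates to 1; this recovers c.
total, _ = quad(shape, alpha, gamma, points=[beta])
c = 1.0 / total
print("c =", c)

# k-th moment about zero: integral of x^k * c * shape(x).
for k in (1, 2, 3):
    m, _ = quad(lambda x: x**k * c * shape(x), alpha, gamma, points=[beta])
    print(f"E[X^{k}] =", m)
</code>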

= Method of Moments =

== Negative Binomial ==

=== Part 1 ===

The Negative Binomial distribution can be parameterized as follows:

$f(x; \alpha, \beta) = {x + \alpha - 1 \choose \alpha - 1} \left(\frac{\beta}{\beta+1}\right)^\alpha \left(\frac{1}{\beta+1}\right)^x,\, x = 0, 1, 2, \ldots$

For common distributions, the moments are well known and appear in tables of distributions.  In this case, the mean is:

$\mu = E[X] = \frac{\alpha}{\beta}$

and the variance is:

$\sigma^2 = E[(X-E[X])^2] = \frac{\alpha}{\beta^2}(\beta + 1)$

Find formulas for $\alpha$ and $\beta$ using the Method of Moments.
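
If you want to double-check the tabled mean and variance against this parameterization, here is a sketch assuming scipy. Note that scipy's nbinom is parameterized by $(n, p)$, which corresponds here to $n = \alpha$ and $p = \beta/(\beta+1)$:

<code python>
from scipy.stats import nbinom

alpha, beta = 5.0, 2.0             # arbitrary test values
n, p = alpha, beta / (beta + 1.0)  # map to scipy's (n, p)

mean, var = nbinom.stats(n, p, moments="mv")
print(mean, alpha / beta)                   # both 2.5
print(var, alpha * (beta + 1.0) / beta**2)  # both 3.75
</code>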

=== Part 2 ===

Suppose that you are given the first moment and the second central moment (the moment about the mean): $\mu = 3$ and $\sigma^2 = 4$.  Use your formulas from Part 1 to find $\alpha$ and $\beta$.

=== Part 3 ===

Now suppose that rather than the first moment and the second central moment, you are instead given the first and second moments about zero: $\mu = 6$ and $E[X^2] = 45$.  Find the variance, and then use the mean and variance to find $\alpha$ and $\beta$.

= Maximum Likelihood =

The book talks about MLEs on page 719.

== Bernoulli ==

Suppose that $x_1, x_2, \ldots, x_n$ are independently and identically distributed as $\textrm{Bernoulli}(\theta)$, where $0 \le \theta \le 1$.  Thus:

$f(x_i | \theta) = \theta ^ {x_i} (1 - \theta) ^ {1 - x_i},\, x_i \in \{0, 1\}.$

Find the Maximum Likelihood Estimator for $\theta$ by defining $L(\theta; x_1, x_2, \ldots, x_n)$ and taking the derivative of $\log L(\theta)$.
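
Once you have a candidate estimator, you can sanity-check it numerically. Here is a sketch assuming numpy, with a made-up data set, that evaluates $\log L(\theta)$ on a grid and reports the numerical maximizer:

<code python>
import numpy as np

x = np.array([1, 0, 1, 1, 0, 1, 1, 0])  # made-up Bernoulli data

thetas = np.linspace(0.001, 0.999, 999)
# log L(theta) = sum_i [x_i log(theta) + (1 - x_i) log(1 - theta)]
loglik = x.sum() * np.log(thetas) + (len(x) - x.sum()) * np.log(1 - thetas)

# Compare the grid maximizer against your closed-form MLE.
print(thetas[np.argmax(loglik)])
</code>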

== Uniform ==

Suppose that $x_1, x_2, \ldots, x_n$ are independently and identically distributed as $\textrm{Uniform}(\theta)$ where $0 < \theta$.

$f(x_i | \theta) = \begin{cases} \frac{1}{\theta} & \mathrm{if}\, 0 \le x_i \le \theta \\ 0 & \mathrm{otherwise}. \end{cases}$

=== Part 1 ===

Find a formula for $L(\theta; x_1, \ldots, x_n)$.

=== Part 2 ===

Find $L(4;\, x_1=3,\, x_2=7,\, x_3=5,\, x_4=6)$ and then sketch the function $L(\theta;\, x_1=3,\, x_2=7,\, x_3=5,\, x_4=6)$. Identify where the maximum likelihood occurs.
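
If you want to check your sketch, here is a minimal evaluation sketch assuming numpy (the grid of $\theta$ values is an arbitrary choice):

<code python>
import numpy as np

x = np.array([3.0, 7.0, 5.0, 6.0])

def likelihood(theta):
    # L(theta) = prod_i f(x_i | theta): zero unless every x_i is in [0, theta].
    if np.all((0 <= x) & (x <= theta)):
        return theta ** (-len(x))
    return 0.0

for theta in [4.0, 6.0, 7.0, 8.0, 10.0]:
    print(theta, likelihood(theta))
</code>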

=== Part 3 ===

Find the MLE of $\theta$ in general (for any data $x_1, x_2, \ldots, x_n$).

What is the derivative at this point? Is it 0? Look at your graph from Part 2...