cs-677:sampling-estimators-and-gm-s (2015/01/06 21:12, created by ryancha)
=== 3.8 ===

Problem 11

=== Method of Moments ===

==== Part 1 ====

The Negative Binomial distribution can be parameterized as follows:

$f(x; \alpha, \beta) = {x + \alpha - 1 \choose \alpha - 1} \left(\frac{\beta}{\beta+1}\right)^\alpha \left(\frac{1}{\beta+1}\right)^x,\, x=0, 1, 2, \ldots$

For common distributions, the moments are well known and appear in tables of distributions (in the worst case you can derive the formula yourself, as you did in the last homework).  In this case, the mean is:

$\mu = E[X] = \frac{\alpha}{\beta}$

and the variance is:

$\sigma^2 = E[(X-E[X])^2] = \frac{\alpha}{\beta^2}(\beta + 1)$

Find a formula for $\alpha$ and $\beta$ using the Method of Moments.
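As a sanity check on the two moment formulas above, you can simulate from the distribution. A minimal sketch (the parameter values are arbitrary, chosen only for the check; NumPy's `negative_binomial(n, p)` counts failures before `n` successes, which matches this pmf with $n = \alpha$ and success probability $\beta/(\beta+1)$):

```python
import numpy as np

# Arbitrary test parameters (any alpha > 0, beta > 0 would do).
alpha, beta = 4.0, 2.0

rng = np.random.default_rng(0)
# negative_binomial(n, p) counts failures before n successes;
# taking n = alpha and p = beta / (beta + 1) matches the pmf above.
samples = rng.negative_binomial(alpha, beta / (beta + 1), size=1_000_000)

print(samples.mean())  # expect roughly alpha / beta = 2.0
print(samples.var())   # expect roughly alpha * (beta + 1) / beta**2 = 3.0
```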

==== Part 2 ====

Suppose that you are given the first moment and the second central moment (the moment about the mean): $\mu = 3$ and $\sigma^2 = 4$.  Use your formula from Part 1 to find $\alpha$ and $\beta$.
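If you want to double-check your hand-derived formula, a computer algebra system can solve the same two moment equations; a sketch with SymPy (this verifies, rather than replaces, the derivation asked for in Part 1):

```python
import sympy as sp

mu_val, sigma2_val = 3, 4
alpha, beta = sp.symbols('alpha beta', positive=True)

# The two moment equations from Part 1, with the given values plugged in.
sol = sp.solve([sp.Eq(alpha / beta, mu_val),
                sp.Eq(alpha * (beta + 1) / beta**2, sigma2_val)],
               [alpha, beta], dict=True)
print(sol)
```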

==== Part 3 ====

Now suppose that rather than the first moment and the second central moment, you are instead given both the first and second moments (about zero): $\mu = 6$ and $E[X^2] = 45$.  Find the variance, and then use the mean and variance to find $\alpha$ and $\beta$.
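The step from raw moments to the variance uses the standard identity $\sigma^2 = E[X^2] - \mu^2$; as a one-line check:

```python
mu = 6.0
ex2 = 45.0  # the given second moment about zero, E[X^2]

# Standard identity: variance = second raw moment minus squared mean.
sigma2 = ex2 - mu**2
print(sigma2)  # 9.0
```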

=== Maximum Likelihood ===

==== Bernoulli ====

Suppose that $x_1, x_2, \ldots, x_n$ are independently and identically distributed as $\textrm{Bernoulli}(\theta)$ where $0 \le \theta \le 1$.  Thus:

$f(x_i | \theta) = \theta ^ {x_i} (1 - \theta) ^ {1 - x_i},\, x_i \in \{0, 1\}.$

Find the Maximum Likelihood Estimator for $\theta$ by defining $L(\theta; x_1, x_2, \ldots, x_n)$ and taking the derivative of $\log L(\theta)$.
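Once you have derived the estimator, you can check it numerically by scanning the log-likelihood over a grid of $\theta$ values; a small sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=200)  # a made-up Bernoulli sample for the check

thetas = np.linspace(0.001, 0.999, 999)
# log L(theta) = (sum x_i) log(theta) + (n - sum x_i) log(1 - theta)
loglik = x.sum() * np.log(thetas) + (len(x) - x.sum()) * np.log(1 - thetas)

theta_hat = thetas[np.argmax(loglik)]
print(theta_hat)  # the grid maximizer should agree with your closed-form answer
```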

=== From Section 7.5 ===

Problem 5

=== Exact Inference in the Discrete Case ===

I '''almost''' apologize for asking you to do this. It is a little mind-numbing. Unfortunately, I really want to make sure you have the basics of a graphical model. We will do even more later in the term.

You can complete this task by building a simple script or even a spreadsheet that infers the distributions given below. Keep it very simple and specific to this case. You do '''not''' need to read in the probabilities and solve problems in general. You can even do it by hand if you really want to.

Do not over-engineer your solution. Just build a table for the joint distribution of A, B, C, and D and sum to obtain the needed probabilities. Please do NOT just download a package to do it for you; the point right now is to understand where all the numbers go and what they mean. You are encouraged (but not required) to download a package to '''test''' your answer.

==== Topology: ====

           A        D
           |        |
           v        v
           B------->C

==== Variables: ====
# A, C, and D are Boolean
# B can take three values: 1, 2, or 3

==== Probabilities: ====
# P(A=t) = 0.1
# P(B=1|A=t) = 0.2
# P(B=2|A=t) = 0.3
# P(B=1|A=f) = 0.4
# P(B=2|A=f) = 0.5
# P(C=t|B=1,D=t) = 0.6
# P(C=t|B=1,D=f) = 0.7
# P(C=t|B=2,D=t) = 0.8
# P(C=t|B=2,D=f) = 0.9
# P(C=t|B=3,D=t) = 0.11
# P(C=t|B=3,D=f) = 0.21
# P(D=t) = 0.31

==== Find ====

# P(B=1 | C=t, D=f)
# P(B=1 | C=t, D=t)
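The joint-table enumeration described above can be sketched in a few lines of plain Python, in the spirit of "build a table and sum" (the CPTs are transcribed from the list above; the B=3 rows are the complements of the given B=1 and B=2 entries):

```python
from itertools import product

# Conditional probability tables transcribed from the problem statement.
# P(B=3|A) entries are complements: 1 - 0.2 - 0.3 and 1 - 0.4 - 0.5.
P_A = {True: 0.1, False: 0.9}
P_B_given_A = {True: {1: 0.2, 2: 0.3, 3: 0.5},
               False: {1: 0.4, 2: 0.5, 3: 0.1}}
P_D = {True: 0.31, False: 0.69}
P_C_given_BD = {(1, True): 0.6, (1, False): 0.7,
                (2, True): 0.8, (2, False): 0.9,
                (3, True): 0.11, (3, False): 0.21}

# Joint table: P(A,B,C,D) = P(A) P(B|A) P(D) P(C|B,D), per the topology.
joint = {}
for a, b, c, d in product([True, False], [1, 2, 3],
                          [True, False], [True, False]):
    p_c = P_C_given_BD[(b, d)] if c else 1 - P_C_given_BD[(b, d)]
    joint[(a, b, c, d)] = P_A[a] * P_B_given_A[a][b] * P_D[d] * p_c

def query(b_val, c_val, d_val):
    """P(B=b_val | C=c_val, D=d_val) by summing entries of the joint table."""
    num = sum(p for (a, b, c, d), p in joint.items()
              if b == b_val and c == c_val and d == d_val)
    den = sum(p for (a, b, c, d), p in joint.items()
              if c == c_val and d == d_val)
    return num / den

print(query(1, True, False))  # P(B=1 | C=t, D=f)
print(query(1, True, True))   # P(B=1 | C=t, D=t)
```

A quick sanity check is that the joint table sums to 1 before conditioning.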