cs-677sp2010:gibbs-homework [2014/12/12 20:35] ryancha

The point of this homework is to make sure that you understand how to do Gibbs sampling. If you already get it, this will be easy. If it is still foggy, you will have to fight your way through the fog. Be prepared to spend some time with your notes from class, the book, and the sample code I handed out in class.

== Model ==

Consider the following model:

$\theta \sim Gamma(\alpha_1, \beta_1)\,$

$\lambda|\theta \sim Gamma(\alpha_2, \theta)\,$

$Y|\lambda \sim Poisson(\lambda)\,$

where $\theta$, $\lambda$, and $Y$ are random variables, and $\alpha_1$, $\beta_1$, and $\alpha_2$ are constants.

Use the "inverse scale" parameterization of the gamma distribution, that is:

$f(x | \alpha, \beta) = \frac{\beta^\alpha}{\Gamma(\alpha)} x^{\alpha - 1} e^{-\beta x}, \, x > 0$
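As a quick sanity check on the parameterization, the model can be forward-simulated in a few lines. Note that numpy's gamma sampler takes a ''scale'' argument, so a rate (inverse-scale) $\beta$ must be passed as ''1/beta''. The constant values below are illustrative only; the assignment does not specify them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative constants (not specified in the assignment).
alpha1, beta1, alpha2 = 2.0, 1.0, 3.0

# numpy parameterizes the gamma by shape and *scale*, so the
# inverse-scale (rate) beta enters as scale = 1/beta.
theta = rng.gamma(alpha1, 1.0 / beta1)   # theta ~ Gamma(alpha1, beta1)
lam   = rng.gamma(alpha2, 1.0 / theta)   # lambda | theta ~ Gamma(alpha2, theta)
y     = rng.poisson(lam)                 # Y | lambda ~ Poisson(lambda)

print(theta, lam, y)
```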

+ | |||

== Assignment ==

Draw the above model as a Bayesian network when $Y$ is observed.

Recall that we can make inference about $\theta$ and $\lambda$ if we have a set of random samples from the joint distribution $f(\theta, \lambda | Y)\,$. Briefly explain how you would simulate random draws from this joint distribution using Gibbs sampling ("pure" Gibbs sampling, without Metropolis, which we will cover separately). Make sure you thoroughly derive and specify all distributions that you will need to sample from. Be detailed with your math.
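For concreteness, here is a minimal sketch of what such a sampler can look like once the full conditionals are in hand. The conjugate-gamma conditionals used below are what the requested derivation should produce; you are still expected to show that derivation yourself, and the constants and observed value are made up for illustration.

```python
import numpy as np

def gibbs(y, alpha1, beta1, alpha2, n_iter=5000, seed=0):
    """Gibbs sampler for f(theta, lambda | Y = y) in the model above.

    Uses the conjugate full conditionals (verify these as part of
    the assignment):
        theta  | lambda, y ~ Gamma(alpha1 + alpha2, beta1 + lambda)
        lambda | theta,  y ~ Gamma(alpha2 + y,      theta + 1)
    Rates are converted to numpy's scale parameter via 1/rate.
    """
    rng = np.random.default_rng(seed)
    theta, lam = 1.0, 1.0            # arbitrary starting values
    draws = np.empty((n_iter, 2))
    for i in range(n_iter):
        theta = rng.gamma(alpha1 + alpha2, 1.0 / (beta1 + lam))
        lam = rng.gamma(alpha2 + y, 1.0 / (theta + 1.0))
        draws[i] = theta, lam
    return draws

# Illustrative run: made-up constants, observed Y = 4.
samples = gibbs(y=4, alpha1=2.0, beta1=1.0, alpha2=3.0)
print(samples[2500:].mean(axis=0))   # crude posterior means after burn-in
```

In a real analysis you would also discard a burn-in period, assess convergence, and possibly thin the chain; none of that is shown here.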