- Lecture 11, slides:
- slide 32: why is the approximation $E_{p^*}[\log p(X)] \approx \frac{1}{n}\sum_i \log p(x_i)$ correct?
- slide 38: what is the meaning of the constraint $\sum_j n_j / \lambda = 1$, and why wasn't it used inside the Lagrangian itself?
- slides 44-46: the proof of the ML estimate isn't clear at all. I'd appreciate it if someone could explain the proof, its purpose, etc.
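On the slide-38 question, here is my attempt at reconstructing the derivation (assuming the standard multinomial ML setup with counts $n_j$ and $n = \sum_j n_j$), which I think shows where $\sum_j n_j/\lambda = 1$ comes from:

```latex
\begin{align*}
&\max_{p} \sum_j n_j \log p_j
  \quad \text{s.t.} \quad \sum_j p_j = 1,
  \qquad
  \mathcal{L}(p,\lambda)
  = \sum_j n_j \log p_j - \lambda\Big(\sum_j p_j - 1\Big),\\
&\frac{\partial \mathcal{L}}{\partial p_j}
  = \frac{n_j}{p_j} - \lambda = 0
  \;\Rightarrow\; p_j = \frac{n_j}{\lambda}
  \;\Rightarrow\; \sum_j \frac{n_j}{\lambda} = 1
  \;\Rightarrow\; \lambda = n,\quad p_j = \frac{n_j}{n}.
\end{align*}
```

So, if I understand correctly, the constraint $\sum_j p_j = 1$ *is* inside the Lagrangian (the $\lambda$ term); $\sum_j n_j/\lambda = 1$ is the same constraint after substituting the stationarity condition $p_j = n_j/\lambda$, and it's what pins down $\lambda$.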
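On the slide-32 question, I tried a numerical sanity check of my own (the Bernoulli setup and all names below are mine, not from the slides): since the $x_i$ are IID samples from $p^*$, the law of large numbers says the empirical mean of $\log p(x_i)$ converges to $E_{p^*}[\log p(X)]$.

```python
import math
import random

random.seed(0)

# Toy setup (my own, not from the slides): p* is Bernoulli(0.3),
# and we evaluate the expected log-likelihood of a model p = Bernoulli(0.4).
p_star = 0.3   # true parameter of p*
p_model = 0.4  # model parameter

def log_p(x):
    # log p(x) under the model
    return math.log(p_model if x == 1 else 1.0 - p_model)

# Exact expectation E_{p*}[log p(X)]
exact = p_star * math.log(p_model) + (1 - p_star) * math.log(1 - p_model)

# Empirical average (1/n) * sum_i log p(x_i), with x_i ~ p*
n = 100_000
samples = [1 if random.random() < p_star else 0 for _ in range(n)]
empirical = sum(log_p(x) for x in samples) / n

print(f"exact = {exact:.4f}, empirical = {empirical:.4f}")
```

The two numbers agree to a few decimal places, which is the sense in which the approximation on the slide holds for large $n$.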

- Recitation 12:
- section 2.1.2, two bits: the derivative of the likelihood w.r.t. $p_{0|0}$ isn't clear. Shouldn't it be $\sum_i \left( \delta_{x_1^i = 0}/p_0 + \delta_{x_1^i = 0,\, x_2^i = 0}/p_{0|0} \right)$?
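To convince myself, I checked the derivative numerically with a finite difference (the parameterization below is my own reconstruction of a two-bit model and may not match the recitation exactly). In this version the $\delta_{x_1^i=0}/p_0$ term does not appear, since the $\log p_0$ terms don't depend on $p_{0|0}$:

```python
import math

# My own reconstruction of a two-bit model (may not match the recitation):
# p0  = P(x1 = 0),  p00 = P(x2 = 0 | x1 = 0),  p01 = P(x2 = 0 | x1 = 1).
def log_likelihood(data, p0, p00, p01):
    ll = 0.0
    for x1, x2 in data:
        ll += math.log(p0 if x1 == 0 else 1 - p0)
        if x1 == 0:
            ll += math.log(p00 if x2 == 0 else 1 - p00)
        else:
            ll += math.log(p01 if x2 == 0 else 1 - p01)
    return ll

# Analytic derivative w.r.t. p00: only the (x1 = 0) terms involve p00, so
# d/dp00 = sum_i [ delta(x1=0, x2=0)/p00 - delta(x1=0, x2=1)/(1 - p00) ].
def dll_dp00(data, p00):
    n00 = sum(1 for x1, x2 in data if x1 == 0 and x2 == 0)
    n01 = sum(1 for x1, x2 in data if x1 == 0 and x2 == 1)
    return n00 / p00 - n01 / (1 - p00)

# Finite-difference check on a tiny made-up dataset.
data = [(0, 0), (0, 1), (0, 0), (1, 0), (1, 1)]
p0, p00, p01 = 0.6, 0.7, 0.4
eps = 1e-6
numeric = (log_likelihood(data, p0, p00 + eps, p01)
           - log_likelihood(data, p0, p00 - eps, p01)) / (2 * eps)
print(numeric, dll_dp00(data, p00))
```

The numeric and analytic values agree, so at least under this parameterization the $p_0$ term shouldn't be there.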

- Lecture 12, slides:
- page 2, top: what is the meaning of "Thus IID samples from a mixture distribution correspond to marginalization over n latent variables"? The term "marginalization" is confusing, as it wasn't defined.
- page 3: the part rewriting $Q(\eta \mid \eta_t)$ - the equalities aren't clear to me, and I'd appreciate additional explanation.
- page 4: equations 12.8 and 12.9 are, again, not clear to me.
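For what it's worth, here is a toy EM run I wrote to make sense of these pages (a 1-D two-component Gaussian mixture with unit variances; the example and names are mine, not the lecture's notation). "Marginalization" here just means the latent $z_i$ is summed out of the likelihood; the E-step computes posteriors over $z_i$, the M-step maximizes $Q(\eta \mid \eta_t)$ in closed form, and the bound from equations 12.8-12.9 is what guarantees the monotone improvement asserted in the loop:

```python
import math
import random

random.seed(1)

# Toy 1-D Gaussian mixture with unit variances (my own example):
# z_i ~ Bernoulli(pi), x_i ~ N(mu[z_i], 1).
true_pi, true_mu = 0.4, (-2.0, 3.0)
data = []
for _ in range(1000):
    z = 1 if random.random() < true_pi else 0
    data.append(random.gauss(true_mu[1] if z else true_mu[0], 1.0))

def phi(x, mu):
    # N(mu, 1) density
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def log_lik(pi, mu):
    # Marginal log-likelihood: the latent z is summed out ("marginalized").
    return sum(math.log((1 - pi) * phi(x, mu[0]) + pi * phi(x, mu[1]))
               for x in data)

pi, mu = 0.5, [-1.0, 1.0]
prev = log_lik(pi, mu)
for _ in range(30):
    # E-step: responsibilities r_i = p(z_i = 1 | x_i; eta_t)
    r = [pi * phi(x, mu[1]) / ((1 - pi) * phi(x, mu[0]) + pi * phi(x, mu[1]))
         for x in data]
    # M-step: closed-form maximizer of Q(eta | eta_t)
    pi = sum(r) / len(data)
    mu[1] = sum(ri * x for ri, x in zip(r, data)) / sum(r)
    mu[0] = sum((1 - ri) * x for ri, x in zip(r, data)) / (len(data) - sum(r))
    cur = log_lik(pi, mu)
    assert cur >= prev - 1e-9  # the Q-bound guarantees monotone improvement
    prev = cur
print(round(pi, 2), [round(m, 2) for m in mu])
```

Running it recovers parameters close to the true $(\pi, \mu_0, \mu_1)$, and the in-loop assertion never fires, which is exactly the point of 12.8-12.9 as I understand them.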

Thanks!

*Sorry if I went on too long - there are just a lot of things that aren't clear.*

Question regarding scribes and presentation (lecture 11,12) by studenta (guest), 21 Jan 2017 14:13