courses:rg:2011:deciphering_foreign_language [2012/01/08 22:27] (current) tran
If we use traditional EM, every time we update <
__**Practical questions:**__ How do we initialize EM? How do we start the first iteration?
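As a concrete (and entirely generic) answer sketch, here is a minimal EM run on a toy two-coin mixture -- **not** the paper's decipherment model, just an illustration of the initialization question: EM needs some starting parameters before the first E-step, and a perfectly symmetric start can leave the components tied forever.

```python
# Toy EM for a mixture of two biased coins -- NOT the paper's decipherment
# model, just a minimal sketch of how EM's first iteration gets started.
# Each observation is the number of heads seen in 10 flips of one of two
# coins with unknown biases.
data = [5, 9, 8, 4, 7]
N = 10  # flips per observation

def likelihood(heads, p):
    # Binomial likelihood up to a constant (the constant cancels in the E-step).
    return (p ** heads) * ((1 - p) ** (N - heads))

# Initialization: EM needs *some* starting point. A symmetric start like
# (0.5, 0.5) would keep both coins identical forever, so we break the tie.
theta = [0.3, 0.7]

for _ in range(50):
    # E-step: posterior responsibility of each coin for each observation.
    exp_heads = [0.0, 0.0]
    exp_flips = [0.0, 0.0]
    for h in data:
        w = [likelihood(h, p) for p in theta]
        z = sum(w)
        for k in range(2):
            r = w[k] / z
            exp_heads[k] += r * h
            exp_flips[k] += r * N
    # M-step: re-estimate each coin's bias from the expected counts.
    theta = [exp_heads[k] / exp_flips[k] for k in range(2)]

print(sorted(round(p, 2) for p in theta))
```

Different starting points can converge to different local optima, which is exactly why initialization is a practical question for the paper's (much larger) model.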
| + | |||
| + | **Some other notes related to this paper:** | ||
| + | - Generative story: The generative process that generates data given some hidden variables. | ||
| + | - [[http:// | ||
| + | - Gibbs sampling: | ||
| + | |||
Why did they experiment with the temporal expression corpus? This corpus has relatively few word types, which makes it easier to compare iterative EM with full EM.

==== Section 3 ====
Not many details of this section were presented; however, there were a few points of discussion.
| + | |||
| + | How to choose the best translation? | ||
| + | |||
| + | Given another text (which is not in training data), how to translate it? Use MLE to find the best translation from the model. | ||
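A sketch of that decoding step on toy data: score each candidate translation by its likelihood under the model and keep the argmax. The word lists, probability tables, and the unigram-LM-times-channel decomposition below are illustrative assumptions, not the paper's learned parameters.

```python
import math

# Hypothetical toy model: a unigram language model P(e) and a channel
# model P(f | e); the "best translation" of a foreign word f is the
# candidate e maximizing P(e) * P(f | e).
lm = {'the': 0.5, 'a': 0.3, 'house': 0.2}           # P(e), made-up numbers
channel = {                                          # P(f | e), made-up
    'la':   {'the': 0.7, 'a': 0.2},
    'casa': {'house': 0.9, 'a': 0.05},
}

def decode_word(f):
    candidates = channel[f]
    # argmax of log P(e) + log P(f | e); logs avoid underflow on long texts.
    return max(candidates,
               key=lambda e: math.log(lm[e]) + math.log(candidates[e]))

print([decode_word(f) for f in ['la', 'casa']])  # -> ['the', 'house']
```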
| + | |||
| + | ==== Conclusion ==== | ||
| + | This is an interesting paper, however, there is a lot of maths behind. | ||
