Once you have these probabilities you can use the Viterbi algorithm to find the most likely sequence of tags given an unlabeled sequence of text. The Viterbi algorithm is described in section 15.2.3 (for some reason, they do not tell you it is called Viterbi until the end of the section!).
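The first-order algorithm can be sketched roughly as follows. This is an illustrative implementation, not the assignment's required code; the table names (`log_trans`, `log_emit`, `log_init`) and the dictionary-based model format are assumptions made for the example.

```python
def viterbi(words, tags, log_trans, log_emit, log_init):
    """Most likely tag sequence for `words` under a first-order HMM.

    log_trans[(prev_tag, tag)], log_emit[(tag, word)], and
    log_init[tag] are log-probabilities; anything missing is
    treated as log(0) = -inf. (Illustrative names, not from the
    assignment spec.)
    """
    NEG_INF = float("-inf")
    # V[i][t] = best log-prob of any tag sequence that ends in tag t
    # at position i; back[i][t] remembers the tag it came from.
    V = [{t: log_init.get(t, NEG_INF) +
             log_emit.get((t, words[0]), NEG_INF)
          for t in tags}]
    back = [{}]
    for i in range(1, len(words)):
        V.append({})
        back.append({})
        for t in tags:
            best_prev, best = None, NEG_INF
            for p in tags:
                score = V[i - 1][p] + log_trans.get((p, t), NEG_INF)
                if score > best:
                    best_prev, best = p, score
            V[i][t] = best + log_emit.get((t, words[i]), NEG_INF)
            back[i][t] = best_prev
    # Trace back from the best final tag.
    last = max(tags, key=lambda t: V[-1][t])
    seq = [last]
    for i in range(len(words) - 1, 1 - 1, -1):
        if i == 0:
            break
        seq.append(back[i][seq[-1]])
    return list(reversed(seq))
```

Working in log space avoids the numerical underflow you would otherwise get from multiplying many small probabilities.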
Repeat this task, but use a second-order Markov process for the transitions, where the transition probabilities depend on two POS tags of context instead of just one. Lecture #23 includes some helpful explanation of how to extend the method to Markov order two.
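One common way to handle the second-order case (the lecture may present a different formulation) is to treat each *pair* of adjacent tags as a single state, so an unmodified first-order Viterbi routine can be reused. The function name and trigram-table format below are assumptions made for this sketch:

```python
from itertools import product

def pair_state_model(tags, log_trans2):
    """Convert a second-order transition table
    log_trans2[(t1, t2, t3)] = log P(t3 | t1, t2)
    into first-order transitions over pair states (t1, t2),
    so first-order Viterbi code can be reused unchanged.
    (Illustrative helper, not from the assignment spec.)
    """
    states = list(product(tags, tags))
    log_trans = {}
    # A transition (a, b) -> (b', c) is legal only when b == b',
    # i.e. the pair states overlap by one tag.
    for (a, b), (b2, c) in product(states, states):
        if b == b2 and (a, b, c) in log_trans2:
            log_trans[((a, b), (b2, c))] = log_trans2[(a, b, c)]
    return states, log_trans
```

Emission probabilities for a pair state `(t1, t2)` would then come from the second tag, `t2`, since that is the tag aligned with the current word.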
== Data ==
  
cs-401r/assignment-b.txt · Last modified: 2014/12/23 15:17 by ringger