Suppose you want to use an HMM tagger to tag the phrase “beautiful flowers
blossom”, where we have the following probabilities:
P(beautiful|Det) = 0.001, P(beautiful|Noun) = 0.1, P(beautiful|Adj) = 0.3,
P(flowers|Noun) = 0.1, P(flowers|Verb) = 0.0001, P(blossom|Noun) = 0.08,
P(blossom|Verb) = 0.9
P(Verb|Det) = 0.00001, P(Noun|Det) = 0.5, P(Adj|Det) = 0.01,
P(Noun|Noun) = 0.2, P(Adj|Noun) = 0.002, P(Noun|Adj) = 0.5,
P(Noun|Verb) = 0.3, P(Verb|Noun) = 0.07, P(Verb|Adj) = 0.001,
P(Verb|Verb) = 0.1
Work out in detail the steps of the Viterbi algorithm to assign the most
probable tag sequence to the given phrase. Assume that all conditional
probabilities not mentioned above are zero, and that all four tags are equally
likely to appear at the beginning of a sentence. Draw the tag transition
matrix, the word probability matrix, and the transition diagram for this scenario.
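The following is a minimal Python sketch of the Viterbi computation under the stated assumptions: the tag set {Det, Noun, Adj, Verb}, equal start probabilities of 0.25 for each tag, and zero for every probability not listed above. The dictionaries `trans` and `emit` simply encode the tables from the question, and all variable names are illustrative.

```python
tags = ["Det", "Noun", "Adj", "Verb"]
words = ["beautiful", "flowers", "blossom"]

start = {t: 0.25 for t in tags}  # equal initial tag probabilities (assumption from the question)

# Tag transition probabilities P(tag_t | tag_{t-1}); unlisted entries are treated as 0.
trans = {
    "Det":  {"Noun": 0.5, "Adj": 0.01, "Verb": 0.00001},
    "Noun": {"Noun": 0.2, "Adj": 0.002, "Verb": 0.07},
    "Adj":  {"Noun": 0.5, "Verb": 0.001},
    "Verb": {"Noun": 0.3, "Verb": 0.1},
}

# Word likelihoods P(word | tag); unlisted entries are treated as 0.
emit = {
    "Det":  {"beautiful": 0.001},
    "Noun": {"beautiful": 0.1, "flowers": 0.1, "blossom": 0.08},
    "Adj":  {"beautiful": 0.3},
    "Verb": {"flowers": 0.0001, "blossom": 0.9},
}

# Trellis: viterbi[i][tag] = best path probability ending in `tag` at position i.
viterbi = [{t: start[t] * emit[t].get(words[0], 0.0) for t in tags}]
backptr = [{}]

for i, word in enumerate(words[1:], start=1):
    col, bp = {}, {}
    for cur in tags:
        # Pick the best previous tag for reaching `cur` at this position.
        best_prev, best_score = None, 0.0
        for prev in tags:
            score = viterbi[i - 1][prev] * trans[prev].get(cur, 0.0)
            if score > best_score:
                best_prev, best_score = prev, score
        col[cur] = best_score * emit[cur].get(word, 0.0)
        bp[cur] = best_prev
    viterbi.append(col)
    backptr.append(bp)

# Backtrace from the best final tag.
last = max(tags, key=lambda t: viterbi[-1][t])
path = [last]
for i in range(len(words) - 1, 0, -1):
    path.append(backptr[i][path[-1]])
path.reverse()

for i, word in enumerate(words):
    print(word, viterbi[i])
print("Most probable tag sequence:", path)
```

Under these assumptions the trellis ends in Verb with probability 0.25 × 0.3 × 0.5 × 0.1 × 0.07 × 0.9 ≈ 2.36 × 10⁻⁴, and backtracking gives the tag sequence Adj Noun Verb for “beautiful flowers blossom”.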