Do You Want to Exercise Your Brain Cells in the Field of NLP?
Do you want to exercise your brain cells in the field of NLP – natural language processing? I have an advanced task for you that I just received and solved at the 'Russian School'.

Let’s practise with a Hidden Markov Model that has three hidden states: N (noun), V (verb), and O (other). Let all transitions between the individual states be equally likely (each transition has the same probability).
Assume the following possible outputs:
N: mimsy | borogoves
V: were | borogoves
O: All | mimsy | the
Now, let’s test this sentence: “All mimsy were the borogoves”.
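Before tackling the questions, you can sketch the model in a few lines of Python and brute-force all tag sequences for the sentence. Note that the uniform initial distribution and the uniform emission probabilities within each state are my assumptions; the post only states that the transitions are equal.

```python
from itertools import product

# Sketch of the model described above. Uniform transitions (1/3) are
# given in the post; the uniform initial distribution and uniform
# emissions within each state are my assumptions.
states = ["N", "V", "O"]
emissions = {
    "N": {"mimsy": 0.5, "borogoves": 0.5},
    "V": {"were": 0.5, "borogoves": 0.5},
    "O": {"All": 1/3, "mimsy": 1/3, "the": 1/3},
}
TRANS = 1 / 3   # every transition has the same probability
INIT = 1 / 3    # assumed uniform start distribution

sentence = "All mimsy were the borogoves".split()

# Enumerate every tag sequence and keep those with non-zero probability.
results = {}
for tags in product(states, repeat=len(sentence)):
    p = INIT
    for i, (tag, word) in enumerate(zip(tags, sentence)):
        p *= emissions[tag].get(word, 0.0)
        if i > 0:
            p *= TRANS
    if p > 0:
        results["".join(tags)] = p

for tags, p in sorted(results.items(), key=lambda kv: -kv[1]):
    print(tags, p)
```

Run it and compare the printed sequences and probabilities with your hand-computed answers to the first two questions.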
Control questions for you:
- Which tag sequences can this Hidden Markov Model produce for the sentence (e.g., ONVON…)?
- Which tag sequence has the highest probability (note, there’s a bit of a trick here)?
- The transition probability p(V | O), that is from O to V, is 1/3 in this model. Calculate one iteration of the Baum–Welch algorithm and explain how this probability changes (hint: estimate the expected counts for (O -> V) and for (O -> ?), i.e. all transitions out of O, and take their ratio).
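If you want to verify your hand calculation for the last question, here is one way to sketch a single Baum–Welch iteration: run the forward–backward recursions on the one training sentence and re-estimate the transition matrix from expected transition counts. As above, the uniform initial distribution and uniform within-state emissions are my assumptions, not something the post states explicitly.

```python
import numpy as np

states = ["N", "V", "O"]
obs = "All mimsy were the borogoves".split()

# Emission probabilities (assumed uniform within each state).
B = {
    "N": {"mimsy": 0.5, "borogoves": 0.5},
    "V": {"were": 0.5, "borogoves": 0.5},
    "O": {"All": 1/3, "mimsy": 1/3, "the": 1/3},
}
A = np.full((3, 3), 1/3)   # uniform transitions, as given in the post
pi = np.full(3, 1/3)       # assumed uniform initial distribution

def b(s, w):
    """Emission probability of word w in state index s."""
    return B[states[s]].get(w, 0.0)

T, S = len(obs), len(states)

# Forward pass: alpha[t, s] = p(obs[0..t], state_t = s)
alpha = np.zeros((T, S))
for s in range(S):
    alpha[0, s] = pi[s] * b(s, obs[0])
for t in range(1, T):
    for s in range(S):
        alpha[t, s] = b(s, obs[t]) * sum(alpha[t-1, r] * A[r, s] for r in range(S))

# Backward pass: beta[t, s] = p(obs[t+1..] | state_t = s)
beta = np.ones((T, S))
for t in range(T - 2, -1, -1):
    for s in range(S):
        beta[t, s] = sum(A[s, r] * b(r, obs[t+1]) * beta[t+1, r] for r in range(S))

evidence = alpha[-1].sum()

# Expected transition counts, summed over all time steps.
xi = np.zeros((S, S))
for t in range(T - 1):
    for i in range(S):
        for j in range(S):
            xi[i, j] += alpha[t, i] * A[i, j] * b(j, obs[t+1]) * beta[t+1, j] / evidence

# M-step: expected count of (i -> j) divided by expected count of (i -> anything).
A_new = xi / xi.sum(axis=1, keepdims=True)
print("updated p(V|O) =", A_new[states.index("O"), states.index("V")])
```

This is exactly the ratio the hint asks for: the expected number of O -> V transitions divided by the expected number of all transitions out of O.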
Challenge: I’ll buy a beer for the first person who writes all three correct answers here!
Example inspired by: https://www.coursera.org/learn/language-processing/home/week/2


Original source: WordPress