## Hidden Markov Model (HMM)

The Hidden Markov Model (HMM) is a powerful statistical tool for modeling generative sequences that can be characterized by an underlying process generating an observable sequence. A hidden Markov model is a doubly stochastic process: an underlying stochastic process that is not observable (hence the word hidden), but can be observed through another stochastic process that produces the sequence of observations. The hidden process consists of a set of states connected to each other by transitions with probabilities, while the observed process consists of a set of outputs or observations, each of which may be emitted by each state according to some probability density function (pdf). Depending on the nature of this pdf, several HMM classes can be distinguished: if the observations are naturally discrete, or are quantized using vector quantization, the model is a discrete HMM; if the observations are continuous, the model is a continuous HMM.

The Hidden Markov Model is a finite set of states, each of which is associated with a (generally multidimensional) probability distribution. Transitions among the states are governed by a set of probabilities called transition probabilities. In a particular state, an outcome or observation can be generated according to the associated probability distribution. Only the outcome, not the state, is visible to an external observer; the states are therefore “hidden” to the outside, hence the name Hidden Markov Model.

A hidden Markov model can be considered a generalization of a mixture model where the hidden variables (or latent variables), which control the mixture component to be selected for each observation, are related through a Markov process rather than independent of each other. In a hidden Markov model, the state is not directly visible, but output, dependent on the state, is visible. Each state has a probability distribution over the possible output tokens. Therefore the sequence of tokens generated by an HMM gives some information about the sequence of states.
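This generative view can be made concrete with a short sketch: the hidden states evolve as a Markov chain, and each observation is drawn from the distribution attached to the current state. The specific matrices below (`A`, `B`, `pi`) are hypothetical numbers chosen only for illustration, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-state HMM with 3 possible output tokens.
A = np.array([[0.9, 0.1],          # state-transition probabilities
              [0.2, 0.8]])
B = np.array([[0.7, 0.2, 0.1],     # emission probabilities for each state
              [0.1, 0.3, 0.6]])
pi = np.array([1.0, 0.0])          # initial state distribution

def sample(T):
    """Generate T observations; the states are hidden, the emissions are visible."""
    states, obs = [], []
    s = rng.choice(2, p=pi)
    for _ in range(T):
        states.append(s)
        obs.append(rng.choice(3, p=B[s]))  # emission depends only on the current state
        s = rng.choice(2, p=A[s])          # next state depends only on the current state
    return states, obs

states, obs = sample(10)
```

Note how this generalizes a mixture model: if `A`'s rows were identical, each state (mixture component) would be chosen independently; here the choice at each step depends on the previous state.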

An introduction to hidden Markov models follows. The diagram of a typical Markov model with 5 states is shown in Figure 1.

*Figure 1: Markov Model with 5 states*

Such a model is called a left-right model, with a start state \( S_1 \) and a legal end state \( S_5 \). The likelihood of transition between states is governed by a state transition matrix, given as follows:

\[
A = \begin{pmatrix}
a_{1,1} & a_{1,2} & a_{1,3} & 0 & 0 \\
0 & a_{2,2} & a_{2,3} & a_{2,4} & 0 \\
0 & 0 & a_{3,3} & a_{3,4} & a_{3,5} \\
0 & 0 & 0 & a_{4,4} & a_{4,5} \\
0 & 0 & 0 & 0 & a_{5,5}
\end{pmatrix}
\]

The variable \( a_{i,j} \) represents the likelihood of making a transition from state \( S_i \) to state \( S_j \). Under the particular structure chosen, some transitions are not permitted and therefore have a likelihood of 0.
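A quick sketch of such a left-right transition matrix, with hypothetical values for the nonzero \( a_{i,j} \) (any values work as long as each row sums to 1), makes the structural constraints checkable:

```python
import numpy as np

# Hypothetical probabilities for the 5-state left-right model;
# zeros mark forbidden transitions and each row must sum to 1.
A = np.array([
    [0.5, 0.3, 0.2, 0.0, 0.0],
    [0.0, 0.6, 0.2, 0.2, 0.0],
    [0.0, 0.0, 0.5, 0.3, 0.2],
    [0.0, 0.0, 0.0, 0.7, 0.3],
    [0.0, 0.0, 0.0, 0.0, 1.0],   # S5 is the end state, so it is absorbing
])

# Every row is a probability distribution over next states.
assert np.allclose(A.sum(axis=1), 1.0)
# A left-right model never moves backward: the matrix is upper triangular.
assert np.allclose(A, np.triu(A))
```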

The word ‘hidden’ in “Hidden Markov Model” has much the same meaning as the word ‘hidden’ when referring to the “hidden layers” of an artificial neural network. Just as a single-layer neural network needs a second, ‘hidden’ layer of weights in order to model complex functions, the Markov chain needs a second, ‘hidden’ random process in order to model complex observations.

A *hidden Markov model* (HMM) is one in which you observe a sequence of emissions, but do not know the sequence of states the model went through to generate the emissions. Analyses of hidden Markov models seek to recover the sequence of states from the observed data.

As an example, consider a Markov model with two states and six possible emissions. The model uses:

- A red die, having six sides, labeled 1 through 6.
- A green die, having twelve sides, five of which are labeled 2 through 6, while the remaining seven sides are labeled 1.
- A weighted red coin, for which the probability of heads is 0.9 and the probability of tails is 0.1.
- A weighted green coin, for which the probability of heads is 0.95 and the probability of tails is 0.05.

The model creates a sequence of numbers from the set {1, 2, 3, 4, 5, 6} with the following rules:

- Begin by rolling the red die and writing down the number that comes up, which is the emission.
- Toss the red coin and do one of the following:
  - If the result is heads, roll the red die and write down the result.
  - If the result is tails, roll the green die and write down the result.
- At each subsequent step, flip the coin that has the same color as the die you rolled in the previous step. If the coin comes up heads, roll the same die as in the previous step; if it comes up tails, switch to the other die.
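The rules above can be sketched directly as a generative simulation. Everything here (die faces, coin biases, starting with the red die) is taken from the description; only the function name and step count are illustrative.

```python
import random

random.seed(1)

# Die faces and coin biases from the rules above.
DIE = {"red": list(range(1, 7)),            # fair six-sided die
       "green": [1] * 7 + [2, 3, 4, 5, 6]}  # twelve sides, seven of them labeled 1
P_HEADS = {"red": 0.9, "green": 0.95}       # heads means: keep the same die

def generate(steps):
    state = "red"                            # the process always starts with the red die
    emissions = []
    for _ in range(steps):
        emissions.append(random.choice(DIE[state]))
        if random.random() >= P_HEADS[state]:    # tails: switch to the other die
            state = "green" if state == "red" else "red"
    return emissions

seq = generate(20)
```

An observer who sees only `seq` is in exactly the "hidden" situation the text describes: runs of 1s hint at the green die, but the state itself is never emitted.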

The state diagram for this model has two states, red and green, as shown in the following figure:

*Figure 2: State Diagram*

We determine the emission from a state by rolling the die with the same color as the state. We determine the transition to the next state by flipping the coin with the same color as the state.

The transition matrix follows directly from the coin biases (heads keeps the current die, tails switches), with the first row and column corresponding to the red state and the second to the green state:

\[
T = \begin{pmatrix} 0.9 & 0.1 \\ 0.05 & 0.95 \end{pmatrix}
\]

The model is not hidden, because we know the sequence of states from the colors of the coins and dice. Suppose, however, that someone else is generating the emissions without showing us the dice or the coins. All we see is the sequence of emissions. If we start seeing more 1s than other numbers, we might suspect that the model is in the green state, but we cannot be sure, because we cannot see the color of the die being rolled.
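Recovering the hidden state sequence from the emissions alone is exactly what HMM analysis does; one standard method (not described above) is the Viterbi algorithm. The sketch below uses the transition probabilities and die faces given in this example; the implementation itself is a minimal log-space version for this two-state model.

```python
import numpy as np

A = np.array([[0.9, 0.1],                 # red coin: stay red / switch to green
              [0.05, 0.95]])              # green coin: switch to red / stay green
B = np.array([[1/6] * 6,                  # red die: uniform over 1..6
              [7/12] + [1/12] * 5])       # green die: seven of twelve faces show 1
pi = np.array([1.0, 0.0])                 # the process starts with the red die

def viterbi(obs):
    """Most likely hidden state sequence (0 = red, 1 = green) for faces in 1..6."""
    logA, logB = np.log(A), np.log(B)
    with np.errstate(divide="ignore"):    # log(0) = -inf for the impossible start state
        logpi = np.log(pi)
    T = len(obs)
    delta = np.zeros((T, 2))
    psi = np.zeros((T, 2), dtype=int)
    delta[0] = logpi + logB[:, obs[0] - 1]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + logA      # scores[i, j]: best path ending in i, then i -> j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + logB[:, obs[t] - 1]
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):          # backtrack through the stored argmaxes
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

# A long run of 1s points to the green die.
print(viterbi([3, 1, 1, 1, 1, 1, 1, 2]))
```

On this input the decoder attributes the run of 1s to the green state, formalizing the intuition in the paragraph above.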

