Introduction to Markov Chains#

Here, we will review the foundations that underlie HMMs. Specifically, we will review…

  • Markov chains

  • Transition probabilities

  • Real-world examples of Markov chains


Introduction#

Before we begin looking at HMMs, we must first understand the Markov chain, the foundation on which HMMs are built.


What is a Markov chain?#

A Markov chain is a stochastic model. That is, it tells us the probability that a state will occur given the previous state. This can be represented mathematically, where \(X_n\) is the state at position \(n\) in a sequence.

\[ P(X_{n+1} = x | X_n = x_n) \]

However, there are a few important properties that one must keep in mind. First, a Markov chain is memoryless (sometimes described as forgetful): the future state \(X_{n+1}\) depends only on the current state \(X_n\). It is not affected by any earlier states \(X_0, X_1, \dots, X_{n-1}\). That is, no state other than the immediately preceding one matters. Second, the probabilities of all transitions out of any state must sum to \(1\), as represented by the equation below.

\[ \sum_{j} P(X_{n+1} = j | X_n = i) = 1 \]
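This normalization property is easy to check in code. Below is a minimal sketch using a hypothetical two-state chain (the states `A` and `B` and their probabilities are made up for illustration): each state maps to a dictionary of its outgoing transition probabilities, and each of those dictionaries must sum to 1.

```python
# A hypothetical two-state Markov chain: each inner dictionary holds
# the outgoing transition probabilities of one state.
chain = {'A': {'A': 0.7, 'B': 0.3},
         'B': {'A': 0.4, 'B': 0.6}}

# Check that the outgoing probabilities of every state sum to 1.
for state, transitions in chain.items():
    total = sum(transitions.values())
    assert abs(total - 1.0) < 1e-9, f"State {state} sums to {total}, not 1"

print("Every state's outgoing probabilities sum to 1")
```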

More down to earth#

Let’s take a look at this in a more intuitive way. The most used (and most easily explained) example has to do with the weather.

[Figure: state diagram of the Sunny, Rainy, and Cloudy states, with transition probabilities on the arrows]

Here we are looking at a visual representation of the probabilities of weather patterns.

Let’s say that today is Rainy. There is a 0.25 probability that it will be Sunny tomorrow, a 0.25 probability that it will be Cloudy tomorrow, and a 0.50 probability that it will be Rainy again tomorrow. The weights along the arrows are called transition probabilities (TPs). Notice that the TPs leaving each state add up to 1.

A TP can be written as a function of the current state \(i\), the next state \(j\), and the position \(n\):

\[ p_{ij} (n) = P(X_{n+1} = j | X_n = i) \]

But how can we implement the equation above into this example? Let’s find out.


Using the visual#

Let’s say that Monday was Rainy, Tuesday was also Rainy, Wednesday was Cloudy, and today is Rainy. What is the probability that tomorrow will be Rainy?

Well, we know that tomorrow’s weather is only dependent on today’s weather. So all we need to look at is the TP from Rainy to Rainy, which is 0.50!
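Memorylessness also gives a simple way to score a whole sequence of states: given the first state, the probability of the entire path is just the product of the TPs along it. Below is a minimal sketch of this idea, using the TPs from the weather diagram (the `path_probability` helper is our own, introduced here for illustration):

```python
# Transition probabilities from the weather diagram.
weather = {'Sunny':  {'Sunny': 0.30, 'Rainy': 0.10, 'Cloudy': 0.60},
           'Rainy':  {'Rainy': 0.50, 'Sunny': 0.25, 'Cloudy': 0.25},
           'Cloudy': {'Cloudy': 0.40, 'Sunny': 0.20, 'Rainy': 0.40}}

def path_probability(chain, states):
    """Probability of the whole path, given that the first state occurred."""
    prob = 1.0
    for i, j in zip(states, states[1:]):  # each consecutive pair of states
        prob *= chain[i][j]
    return prob

# Monday through tomorrow: Rainy, Rainy, Cloudy, Rainy, then Rainy again.
print(path_probability(weather, ['Rainy', 'Rainy', 'Cloudy', 'Rainy', 'Rainy']))
```

Here the product works out to 0.50 × 0.25 × 0.40 × 0.50 = 0.025.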


Implementing in Python#

Below is an example of this exact weather problem implemented in Python.

# create a dictionary of transition probabilities; each state's outgoing TPs sum to 1,
# and self-transitions are keyed under the state's own name so every lookup is uniform
weather = {'Sunny':  {'Sunny': 0.30, 'Rainy': 0.10, 'Cloudy': 0.60},
           'Rainy':  {'Rainy': 0.50, 'Sunny': 0.25, 'Cloudy': 0.25},
           'Cloudy': {'Cloudy': 0.40, 'Sunny': 0.20, 'Rainy': 0.40}}

# the weather observed so far
week = ['Sunny', 'Rainy', 'Rainy', 'Sunny']

# If the current state is 'Sunny', what is the probability that tomorrow will be 'Rainy'?
current_state = week[-1] # today's weather is the last entry, i.e. the current state
next_state = 'Rainy' # the state we are asking about
probability = weather[current_state][next_state] # look up the TP in the dictionary
print(f"The probability that it will be {next_state} tomorrow if today is {current_state} is {probability}")
The probability that it will be Rainy tomorrow if today is Sunny is 0.1
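The chain can also be run forward to simulate future weather: starting from today's state, repeatedly sample tomorrow's state according to the current state's TPs. Below is a minimal sketch using Python's standard `random.choices`; the transition dictionary is restated so the block is self-contained, and the seed and six-day horizon are arbitrary choices.

```python
import random

# Transition probabilities from the weather diagram.
weather = {'Sunny':  {'Sunny': 0.30, 'Rainy': 0.10, 'Cloudy': 0.60},
           'Rainy':  {'Rainy': 0.50, 'Sunny': 0.25, 'Cloudy': 0.25},
           'Cloudy': {'Cloudy': 0.40, 'Sunny': 0.20, 'Rainy': 0.40}}

random.seed(0)  # fix the seed so the run is reproducible

state = 'Sunny'     # today's weather
forecast = [state]
for _ in range(6):  # simulate the next six days
    options = weather[state]
    # Sample the next state with probability proportional to its TP.
    state = random.choices(list(options), weights=list(options.values()))[0]
    forecast.append(state)

print(forecast)
```

Because each step looks only at `state`, the simulation itself respects the memoryless property: nothing earlier in `forecast` influences the next draw.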