
Markov Chain in Python Tutorial

Last updated:
26th Mar, 2020
Read Time
8 Mins


Have you ever wondered how expert meteorologists make precise predictions of the weather, how Google ranks different web pages, or how some of the most fascinating real-world Python applications work? These calculations are complex, involve several dynamic variables, and can be approached using probability estimates.

Here lies the idea of Markov Chains: there are individual states (say, the weather conditions), each state can randomly change into other states (a rainy day can change into a sunny day), and these changes or transitions are probability-based. This article gives a brief introduction to the concept of Markov Chains and shows how Markov Chain models can be coded in Python to solve real-world problems. If you are a beginner and would like to gain expertise in data science, check out our data science courses.

Content Overview

  • A brief introduction to the concepts of Markov Chain and Markov Property
  • Mathematical and graphical expression of Markov Chain
  • Python Markov Chain – coding Markov Chain examples in Python

Introduction to Markov Chain

To use Python Markov Chain for solving practical problems, it is essential to grasp the concept of Markov Chains. In 1906, Russian mathematician Andrei Markov gave the definition of a Markov Chain – a stochastic process consisting of random variables that transition from one particular state to the next, and these transitions are based on specific assumptions and probabilistic rules.

A fundamental mathematical property called the Markov Property is the basis of the transitions of the random variables. In other words, a Markov Chain is a series of variables X1, X2, X3, … that fulfill the Markov Property.

Principle of Markov Chain – Markov Property

A Markov Chain is based on the Markov Property. The discrete-time Markov Property states that the probability of a random system changing from its present state to the next state depends only on the present state, and is independent of all preceding states.

The fact that the probable future state of a random process is independent of the sequence of states that existed before it makes the Markov Chain a memory-less process that depends only on the current state of the variable.


The mathematical expression of the Markov Chain

In terms of a probability distribution, assume a system at time instance ‘n.’ Applying the Markov property, the conditional distribution of the state at the following time instance, n+1, given the present state, is independent of the states of the system at time instances 1, 2, …, n−1.
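Written out in standard notation (the original article does not render the formula, so this is a sketch of the property the text describes), the Markov property for a chain X1, X2, X3, … is:

```latex
P(X_{n+1} = x \mid X_1 = x_1,\, X_2 = x_2,\, \dots,\, X_n = x_n)
  = P(X_{n+1} = x \mid X_n = x_n)
```

That is, once the state at time n is known, the earlier history adds no further information about the state at time n+1.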

Graphical representation of Markov Chain

Directed graphs are often used to represent a Markov Chain. In the directed graphs, the nodes indicate different likely states of the random variables while the edges denote the probability of the system moving from one state to another in the next time instance. To understand the representation, let us take the example of predicting the weather. Assume that the random variable is ‘weather,’ and it has three possible states viz. Weather = {sunny, rainy, snowy}. The Markov Chain for this scenario can be represented as:

[Figure: state-transition diagram for Weather = {sunny, rainy, snowy}, with edges labelled by transition probabilities]


In the graphical representation shown above, say the current observed state of the random variable is sunny. The probability of the random variable taking the value sunny at the next time instance is 0.8. It can also take the value snowy with a probability of 0.01, or rainy with a probability of 0.19. An important thing to note here is that the outgoing transition probabilities from any state always sum to 1.


Coding a Markov Chain in Python

To better understand Python Markov Chains, let us walk through an instance where an example Markov Chain is coded in Python. When solving problems in the real world, it is common practice to use a library that encodes Markov Chains efficiently; however, coding a Markov Chain yourself is an excellent way to get started with Markov Chain analysis and simulation. Let us see how the weather prediction example from the previous section can be coded in Python. Begin by defining a simple class:
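The original code snippet is not reproduced here, so the following is a minimal sketch of such a class, parameterised by a nested dictionary of transition probabilities and using only the standard-library random module (method names like next_state and generate_states are illustrative choices):

```python
import random

class MarkovChain:
    """A simple Markov Chain parameterised by a dict of transition probabilities."""

    def __init__(self, transition_prob):
        # transition_prob maps each state to a dict of {next_state: probability}
        # pairs, e.g. {'sunny': {'sunny': 0.8, 'rainy': 0.19, 'snowy': 0.01}, ...}
        self.transition_prob = transition_prob
        self.states = list(transition_prob.keys())

    def next_state(self, current_state):
        # Sample the next state using the current state's probability row.
        weights = [self.transition_prob[current_state].get(s, 0) for s in self.states]
        return random.choices(self.states, weights=weights)[0]

    def generate_states(self, current_state, no=10):
        # Simulate a sequence of `no` future states.
        future_states = []
        for _ in range(no):
            current_state = self.next_state(current_state)
            future_states.append(current_state)
        return future_states
```

Because the transitions are sampled randomly, repeated calls to next_state from the same state will generally return different values, distributed according to that state's probabilities.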


Having defined the MarkovChain class, let us try coding the weather prediction example as a representation of how Python Markov Chain works.
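A self-contained sketch of that example follows; the class is repeated so the snippet runs on its own, and the probability values are illustrative (only the sunny row comes from the diagram earlier in the article):

```python
import random

class MarkovChain:
    """Markov Chain parameterised by a dict of transition probabilities."""

    def __init__(self, transition_prob):
        self.transition_prob = transition_prob
        self.states = list(transition_prob.keys())

    def next_state(self, current_state):
        # Sample the next state using the current state's probability row.
        weights = [self.transition_prob[current_state].get(s, 0) for s in self.states]
        return random.choices(self.states, weights=weights)[0]

    def generate_states(self, current_state, no=10):
        # Simulate a sequence of `no` future states.
        future_states = []
        for _ in range(no):
            current_state = self.next_state(current_state)
            future_states.append(current_state)
        return future_states

# Only the 'sunny' row matches the article's diagram; the others are assumed.
transition_prob = {
    'sunny': {'sunny': 0.8, 'rainy': 0.19, 'snowy': 0.01},
    'rainy': {'sunny': 0.2, 'rainy': 0.7,  'snowy': 0.1},
    'snowy': {'sunny': 0.1, 'rainy': 0.2,  'snowy': 0.7},
}
weather_chain = MarkovChain(transition_prob=transition_prob)

print(weather_chain.next_state(current_state='sunny'))
print(weather_chain.generate_states(current_state='snowy', no=10))
```

Each run prints a randomly sampled next state and a ten-step simulated weather sequence.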



Parameterising Markov Chains using Transition Matrix

In the previous section, the Python code parameterised the Markov Chain using a dictionary that contained the probability values of all the likely state transitions. An alternative way of representing the transition probabilities is using a transition matrix, which is a standard, compact, and tabular representation of a Markov Chain.
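For the weather example, the transition matrix can be laid out with one row per current state and one column per next state, ordering the states as sunny, rainy, snowy (only the sunny row comes from the diagram earlier; the other rows are illustrative):

```latex
P =
\begin{pmatrix}
0.80 & 0.19 & 0.01 \\
0.20 & 0.70 & 0.10 \\
0.10 & 0.20 & 0.70
\end{pmatrix}
```

Note that every row sums to 1, mirroring the rule that the outgoing probabilities from each state must sum to 1.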



In situations where there are hundreds of states, the use of the Transition Matrix is more efficient than a dictionary implementation. The Markov Chain class is modified as follows for it to accept a transition matrix:


The dictionary implementation looped over the state names. With a Transition Matrix, however, the probability values in the next_state method can be obtained using NumPy indexing:
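A sketch of the matrix-based class follows, assuming NumPy is available; the index_dict helper, the state names, and the probability values in the rainy and snowy rows are illustrative assumptions rather than the article's original code:

```python
import numpy as np

class MarkovChain:
    """Markov Chain parameterised by a transition matrix."""

    def __init__(self, transition_matrix, states):
        # transition_matrix: 2-D array where entry [i, j] is the probability
        # of moving from states[i] to states[j]; each row must sum to 1.
        self.transition_matrix = np.atleast_2d(transition_matrix)
        self.states = states
        # Map each state name to its row/column index in the matrix.
        self.index_dict = {state: i for i, state in enumerate(states)}

    def next_state(self, current_state):
        # NumPy indexing selects the probability row of the current state;
        # np.random.choice then samples the next state from that row.
        row = self.transition_matrix[self.index_dict[current_state], :]
        return np.random.choice(self.states, p=row)

    def generate_states(self, current_state, no=10):
        future_states = []
        for _ in range(no):
            current_state = self.next_state(current_state)
            future_states.append(current_state)
        return future_states

transition_matrix = [[0.8, 0.19, 0.01],
                     [0.2, 0.7,  0.1],
                     [0.1, 0.2,  0.7]]
weather_chain = MarkovChain(transition_matrix=transition_matrix,
                            states=['sunny', 'rainy', 'snowy'])
print(weather_chain.next_state(current_state='sunny'))
```

Row-based NumPy indexing avoids looping over state names entirely, which is why the matrix representation scales better to chains with hundreds of states.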




Markov Chains are an essential mathematical tool that simplifies the prediction of the future state of complex stochastic processes; the future state depends solely on the current state of the process and is viewed as independent of the past. By exploiting the Markov Property, coding Markov Chains in Python is an efficient way to solve practical problems that involve complex systems and dynamic variables.

Be it weather forecasting, credit rating, or typing word prediction on your mobile phone, Markov Chains have far-reaching applications in a wide variety of disciplines. Depending on the nature of the parameters and the application, there are different variants of Markov Chains. Coding Markov Chains in Python is a logical and efficient way to implement them.

If you are curious to learn about Python and data science, check out IIIT-B & upGrad’s Executive PG Programme in Data Science, which is created for working professionals and offers 10+ case studies & projects, practical hands-on workshops, mentorship with industry experts, 1-on-1 sessions with industry mentors, 400+ hours of learning, and job assistance with top firms.



Rohit Sharma

Blog Author
Rohit Sharma is the Program Director for the UpGrad-IIIT Bangalore, PG Diploma Data Analytics Program.

Frequently Asked Questions (FAQs)

1. What is a Markov transition matrix?

A Markov transition matrix is a square matrix that describes the probabilities of moving from one state to another in a dynamic system. Each row lists the probabilities of migrating from the state represented by that row to the other states; as a result, each row of a Markov transition matrix sums to one. A transition matrix can be used for matrix multiplication, for identifying patterns, and for making predictions, and it is useful whenever events become more or less likely as a result of past events.

2. What is the absorbing state of a Markov chain?

According to probability theory, an absorbing Markov chain is one in which every state can reach an absorbing state. An absorbing state is one that cannot be left once it has been entered. A Markov chain is said to be absorbing if it contains at least one absorbing state and it is possible to get from any state to at least one absorbing state in a finite number of steps. A state of an absorbing Markov chain that is not absorbing is called a transient state.

3. What are Hidden Markov Models (HMM)?

The HMM is a mathematical model in which the system being examined is a Markov process with hidden or unobserved states. The Hidden Markov Model is used in machine learning and pattern recognition applications such as gesture recognition and speech recognition. As a probabilistic model, the Hidden Markov Model allows us to reason about both observed events and hidden events. It also helps solve real-world problems such as Natural Language Processing (NLP) tasks, time series analysis, and many more. In HMM, two key assumptions are made: the future state depends only on the current state, and the current observation depends only on the current state.
