
Decision Tree in Machine Learning Explained [With Examples]

Last updated:
21st Dec, 2020
Read Time
9 Mins


Decision tree learning is a mainstream data mining technique and a form of supervised machine learning. A decision tree is a diagram used to represent statistical probabilities or to map out a course of action and its possible results. A decision tree example makes the concept much easier to understand.


Each branch in the diagram of a decision tree shows a likely outcome, a possible decision, or a reaction. The branch at the end of the tree displays the prediction or result. Decision trees are commonly used to solve problems that are too complicated to work through manually. Let us understand this in detail with the help of a few decision tree examples.

A decision tree is one of the most popular and powerful tools for prediction and classification of data or events. It is like a flowchart, but with the structure of a tree. The internal nodes of the tree represent a test or question on an attribute; each branch is a possible outcome of that question; and each terminal node, also called a leaf node, denotes a class label.


In a decision tree, we have several predictor variables. Based on these predictor variables, we try to predict the so-called response variable.



Related Read: Decision Tree Classification: Everything You Need to Know

Decision Tree in ML

By representing a sequence of steps, a decision tree becomes an easy and effective way to visualize the possible decision options and their potential outcomes. Decision trees are also helpful for identifying the available options and weighing the rewards and risks of each course of action.

Decision trees are deployed in many small-scale as well as large-scale organizations as a kind of decision-support system. Since a decision tree is a structured model, readers can follow the chart and analyse how and why a particular option leads to a corresponding decision. A decision tree also lets the reader consider multiple possible solutions to a single problem and understand the relation between different events, the data, and the final decision.

Each result in the tree has a reward and risk number or weight assigned, so every final outcome comes with its possible drawbacks and benefits. You can grow the tree as short or as long as needed, depending on the event and the amount of data. Let us take a simple decision tree example to understand this better.

Consider the given data, which records details of several people: whether they are a drinker, whether they are a smoker, their weight, and the age at which they died.

| Name | Drinker | Smoker | Weight | Age (Died) |
|------|---------|--------|--------|------------|

Let us try to predict whether a person will die at a younger or an older age. The characteristics drinker, smoker, and weight act as predictor variables, and age is the response variable.

Let us label people who died before the age of 70 as “young” and people who died after the age of 70 as “old”. We now predict the response variable from the predictor variables. Given below is a decision tree built after learning from the data.
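The labelling step can be sketched in Python. The age-70 cutoff comes from the text, while the sample ages below are invented purely for illustration (the article's full table is not reproduced here):

```python
# Label each person "young" or "old" using the age-70 cutoff from the example.
def label_age(age_at_death):
    return "young" if age_at_death < 70 else "old"

# Hypothetical ages for illustration only.
ages = [45, 88, 62, 79]
labels = [label_age(a) for a in ages]
print(labels)  # ['young', 'old', 'young', 'old']
```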

The decision tree above says that if a person is a smoker, they die young. If a person is not a smoker, the next factor considered is whether the person is a drinker. If a person is neither a smoker nor a drinker, the person dies old.

If a person is not a smoker and is a drinker, then the weight of the person is considered. If a person is not a smoker, is a drinker, and weighs below 90 kg, then the person dies old. And lastly, if a person is not a smoker, is a drinker, and weighs above 90 kg, then they die young. 

From the given data, let us take Jonas’ example to check whether the decision tree classifies and predicts the response variable correctly. Jonas is not a smoker, is a drinker, and weighs under 90 kg. According to the decision tree, he will die old (at an age greater than 70). According to the data, he died at 88, so the decision tree has classified this example correctly.
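The decision rules described above can be written directly as nested conditionals. This is only a sketch of the example tree, not a general tree-learning implementation:

```python
def predict(smoker, drinker, weight_kg):
    """Predict 'young' (died before 70) or 'old' using the example tree."""
    if smoker:
        return "young"           # smokers die young
    if not drinker:
        return "old"             # non-smoker, non-drinker -> old
    # Non-smoker and drinker: decide on weight.
    return "old" if weight_kg < 90 else "young"

# Jonas: not a smoker, a drinker, under 90 kg -> predicted "old".
print(predict(smoker=False, drinker=True, weight_kg=85))  # old
```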

But have you ever wondered about the basic idea behind how a decision tree works? In a decision tree, the set of instances is split into subsets in such a way that the variation within each subset shrinks. That is, we want to reduce the entropy: as the variation is reduced, each subset becomes purer.

Let us consider a similar decision tree example. Firstly, we consider if the person is a smoker or not. 

Here, we are still uncertain about the non-smokers, so we split them into drinkers and non-drinkers.

In this way, we go from high entropy, with large variation, down to smaller classes that we are more certain about. This is how you incrementally build a decision tree.

Let us construct a decision tree using the ID3 algorithm. The most important concept here is entropy, which is nothing but the degree of uncertainty in a set. For a set whose instances fall into classes with proportions p₁, p₂, …, it is given by:

H = −Σᵢ pᵢ log₂(pᵢ)

(At times, it is also denoted by “E”.)
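The entropy formula can be sketched as a small Python helper (the function and variable names are my own):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

# A perfectly mixed 3:3 set has the maximum entropy of 1 bit.
print(entropy(["young"] * 3 + ["old"] * 3))  # 1.0
```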

If we apply this to the above example, it goes as follows:

Consider the case when we have not yet split the people into any category. This is the worst-case scenario (highest entropy), since both types of people are present in equal amounts: the ratio here is 3:3, giving

E = −(3/6) log₂(3/6) − (3/6) log₂(3/6) = 1

After splitting on smoking, the smokers have a 2:0 ratio, so their entropy is 0, while the non-smokers have a 3:1 ratio and an entropy of about 0.811. Among the non-smokers, the drinkers have a 1:1 ratio and an entropy of 1, so they need a further split due to uncertainty, while the non-drinkers have a 2:0 ratio and hence an entropy of 0.

Now, we have computed the entropy for the different cases and hence, we can calculate the weighted average for the same. 

For the first branch (no split), E = (6/6) × 1 = 1.

For the smoker split, E = (2/6) × 0 + (4/6) × 0.811 = 0.54.

For the smoker-and-drinker split, E = (2/6) × 0 + (2/6) × 1 + (2/6) × 0 = 0.33.
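The weighted averages above can be reproduced with a short helper. The subset counts (2 smokers, 4 non-smokers split 3:1, and so on) follow the six-person example:

```python
import math

def entropy2(p, q):
    """Entropy of a two-class subset containing p and q members."""
    n = p + q
    total = 0.0
    for c in (p, q):
        if c:
            total -= (c / n) * math.log2(c / n)
    return total

def weighted(subsets, n=6):
    """Weighted-average entropy over subsets given as (p, q) counts."""
    return sum(((p + q) / n) * entropy2(p, q) for p, q in subsets)

print(round(weighted([(3, 3)]), 2))                  # 1.0  (no split)
print(round(weighted([(2, 0), (3, 1)]), 2))          # 0.54 (split on smoker)
print(round(weighted([(2, 0), (1, 1), (2, 0)]), 2))  # 0.33 (smoker, then drinker)
```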

Each weighted average simply multiplies a subset’s entropy by the fraction of the instances that fall into it.

Finally, the information gain:

| Class | Entropy | Information gain (E2 − E1) |
|-------|---------|----------------------------|
| No split | 1 | — |
| Smoker | 0.54 | 0.46 |
| Smoker and drinker | 0.33 | 0.21 |
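A minimal ID3-style attribute selection can be sketched in Python. The six-person dataset below is hypothetical, invented only to match the example’s ratios (the article’s actual table is not reproduced here):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attribute):
    """Entropy reduction from splitting on one attribute (ID3's criterion)."""
    gain = entropy(labels)
    n = len(rows)
    for value in set(row[attribute] for row in rows):
        subset = [lab for row, lab in zip(rows, labels) if row[attribute] == value]
        gain -= (len(subset) / n) * entropy(subset)
    return gain

# Hypothetical data matching the example's 3:3 overall and 2:0 / 3:1 smoker split.
people = [
    {"smoker": True,  "drinker": False}, {"smoker": True,  "drinker": True},
    {"smoker": False, "drinker": True},  {"smoker": False, "drinker": True},
    {"smoker": False, "drinker": False}, {"smoker": False, "drinker": False},
]
labels = ["young", "young", "young", "old", "old", "old"]

# ID3 picks the attribute with the highest information gain as the root.
best = max(["smoker", "drinker"], key=lambda a: information_gain(people, labels, a))
print(best)  # smoker
```

Splitting on smoking gives the larger entropy reduction, which is why the example tree tests the smoker attribute first.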

Also Read: Decision Tree Interview Questions & Answers



We have studied decision trees in depth, from the theory to a practical decision tree example, and constructed a decision tree using the ID3 algorithm. If you found this interesting, you might love to explore data science in more detail.

If you’re interested in learning more about decision trees and machine learning, check out IIIT-B & upGrad’s PG Diploma in Machine Learning & AI, which is designed for working professionals and offers 450+ hours of rigorous training, 30+ case studies & assignments, IIIT-B alumni status, 5+ practical hands-on capstone projects, and job assistance with top firms.



Blog Author
Meet Sriram, an SEO executive and blog content marketing whiz. He has a knack for crafting compelling content that not only engages readers but also boosts website traffic and conversions. When he's not busy optimizing websites or brainstorming blog ideas, you can find him lost in fictional books that transport him to magical worlds full of dragons, wizards, and aliens.


Frequently Asked Questions (FAQs)

1. What are decision trees?

Decision trees are used to visually organize decision-making information. The trees are drawn with the root at the top and the leaves at the bottom, and they are read from the top down. Each level of the tree is a basis for further testing, and the decisions at each level narrow the scope until the question is answered. A decision tree breaks a problem or decision into multiple sub-decisions and follows a logical path from the root, which represents the primary goal. Decision trees are used to analyze the business environment, to prioritize, and to provide insight, in order to decide what direction to take.

2. What are the issues in decision tree learning in machine learning?

Decision trees can be used as a basis for testing new strategies or for explaining strategies to others: a decision tree spells out what will happen under a given set of assumptions. They can also be used to evaluate the performance of a strategy used in the past. However, decision trees are susceptible to error because of their many branches, and they are not always accurate: they may not take all relevant variables into account, and the person analysing the tree might not be experienced in every aspect of the particular situation.

3. What kind of data is best for decision trees?

Decision trees help you find patterns in data using a flowchart-like structure. They can handle qualitative, categorical, and numerical data, though they work best with numerical values, or with values that can be translated into numbers. Decision trees depend heavily on both the type and the quantity of data; with at least around 100 data points, a decision tree can be a good choice of model.
