**Introduction**

Probability is a cornerstone of data science. It plays a pivotal role in the daily work of data analysts and data scientists, and its core concepts are essential knowledge for anyone in the field. The statistical methods used to make predictions are built on probability theory and statistics, which makes probability a crucial part of the data science domain.

Probability quantifies how likely an event is to occur under a given set of assumptions. To represent the different possible values that a random variable can take, along with how likely each one is, we use a probability distribution.

A random variable assigns a numerical value to each possible outcome of an experiment. To illustrate, if a die is rolled, the possible outcomes are the values 1 through 6, and these become the values of the random variable.

Probability distributions can be of two types: discrete and continuous. Discrete distributions describe variables that take only a limited (countable) number of values within a range. Continuous distributions describe variables that can take infinitely many values within a range. In this article, we will explore discrete distributions and then the Probability Mass Function.

**Discrete Distribution**

A discrete distribution gives the probabilities of the different outcomes of a discrete random variable. In simple terms, it lets us see the pattern across the possible outcomes: it is simply all the probabilities of the random variable put together.

To create a probability distribution for a random variable, we need the outcomes of the random variable along with their associated probabilities; from these we can compute its probability distribution function.

Some of the common types of discrete distributions are listed as follows:

- Binomial Distribution: each trial has only two possible outcomes (yes or no, success or failure, etc.). Example: tossing a coin.
- Bernoulli Distribution: a special case of the binomial distribution where the number of trials in the experiment is always equal to 1.
- Poisson Distribution: gives the probability of an event occurring a certain number of times in a fixed period of time. Example: the number of times a movie is streamed on a Saturday night.
- Uniform Distribution: assumes that all outcomes of the random variable are equally likely. Example: rolling a fair die (each side has an equal probability of showing up).
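The four distributions above can be sketched directly from their textbook formulas. The parameter values below (10 coin tosses, a Poisson rate of 3, a six-sided die) are illustrative choices, not anything prescribed by the article:

```python
import math

def binomial_pmf(k, n, p):
    """P(X = k) for a Binomial(n, p) variable: k successes in n trials."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def bernoulli_pmf(k, p):
    """Bernoulli is simply a binomial with a single trial (n = 1)."""
    return binomial_pmf(k, 1, p)

def poisson_pmf(k, mu):
    """P(X = k) for a Poisson variable with average rate mu."""
    return math.exp(-mu) * mu**k / math.factorial(k)

def uniform_pmf(k, n_outcomes):
    """Each of n_outcomes values (e.g. the 6 faces of a die) is equally likely."""
    return 1 / n_outcomes

print(binomial_pmf(4, 10, 0.5))  # chance of exactly 4 heads in 10 fair tosses
print(bernoulli_pmf(1, 0.5))     # a single fair coin toss landing heads
print(poisson_pmf(2, 3))         # 2 events when the average rate is 3
print(uniform_pmf(3, 6))         # rolling a 3 on a fair die
```

Each function returns the probability of one specific value of the random variable, which is exactly what the probability mass function introduced below formalizes.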

To calculate the probability that a discrete random variable takes a particular value within its range, the Probability Mass Function (PMF) is used. The formula of the probability mass function varies from distribution to distribution.

For better clarity on the probability mass function, let us walk through an example. Suppose we want to figure out which batting position in cricket has the highest probability of scoring a century within a team, given some related data. Since there are only 11 batting positions in a team, the random variable takes values from 1 to 11.

The probability mass function, also called the discrete density function, lets us find the probability of a century being scored from each position, i.e. P(X=1), P(X=2), …, P(X=11). Once all these probabilities are computed, we have the probability distribution of that random variable.
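As a small sketch of this idea, we can build the PMF from hypothetical century counts per batting position (the numbers below are made up purely for demonstration):

```python
# Hypothetical counts of centuries scored from each of the 11 batting positions.
century_counts = {1: 12, 2: 15, 3: 22, 4: 18, 5: 10, 6: 6,
                  7: 4, 8: 2, 9: 1, 10: 0, 11: 0}

total = sum(century_counts.values())

# PMF: P(X = k) = centuries from position k / total centuries
pmf = {k: count / total for k, count in century_counts.items()}

print(pmf[3])             # position 3 has the highest probability in this data
print(sum(pmf.values()))  # the probabilities sum to 1
```

Dividing each count by the total turns raw frequencies into probabilities, which is the simplest way to estimate a PMF from observed data.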

The general formula for the probability mass function is:

**P_X(x_k) = P(X = x_k) for k = 1, 2, …, n**

where,

X = the discrete random variable.

x_k = a possible value of the random variable.

P_X(x_k) = the probability that the random variable equals x_k.
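In practice, P_X(x_k) can be estimated empirically by counting how often each value appears in observed data. A minimal sketch using simulated die rolls (the sample size of 10,000 is an arbitrary choice):

```python
from collections import Counter
import random

random.seed(0)  # fixed seed so the simulation is reproducible

# Simulate 10,000 rolls of a fair six-sided die
rolls = [random.randint(1, 6) for _ in range(10_000)]

# Estimated PMF: relative frequency of each face
counts = Counter(rolls)
est_pmf = {x: counts[x] / len(rolls) for x in range(1, 7)}

print(est_pmf)  # each estimate should be close to the true value 1/6
```

With enough samples, each relative frequency converges to the true probability P(X = x_k), here 1/6 for every face.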

Many people confuse the Probability Mass Function (PMF) with the Probability Density Function (PDF). To clear this up: the probability mass function applies to discrete random variables, i.e. variables that take a limited number of values within a range.

The probability density function applies to continuous random variables, i.e. variables that can take infinitely many values within a range. The probability mass function also helps in calculating summary statistics such as the mean and variance of a discrete distribution.
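As a quick illustration of that last point, the mean and variance of a discrete distribution can be computed directly from its PMF, here using a fair six-sided die:

```python
# PMF of a fair six-sided die: each face has probability 1/6
pmf = {x: 1 / 6 for x in range(1, 7)}

# Mean: E[X] = sum of x * P(X = x)
mean = sum(x * p for x, p in pmf.items())

# Variance: E[(X - E[X])^2] = sum of (x - mean)^2 * P(X = x)
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())

print(mean)      # ≈ 3.5
print(variance)  # ≈ 35/12, about 2.92
```

The same two weighted sums work for any discrete distribution once its PMF is known.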

**Properties of Probability Mass Function**

- The probabilities of all possible values of the random variable must sum to 1. [∑ P_X(x_k) = 1]
- Every probability must be either 0 or greater than 0. [P_X(x_k) ≥ 0]
- The probability of each event therefore lies between 0 and 1. [0 ≤ P_X(x_k) ≤ 1]
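These properties translate directly into a small validity check for any candidate PMF. The helper below is a sketch (the function name and tolerance are our own choices, not from the article):

```python
def is_valid_pmf(pmf, tol=1e-9):
    """Check the two defining PMF properties: non-negativity and summing to 1."""
    values = pmf.values()
    return all(p >= 0 for p in values) and abs(sum(values) - 1) <= tol

print(is_valid_pmf({1: 0.2, 2: 0.3, 3: 0.5}))  # True
print(is_valid_pmf({1: 0.7, 2: 0.7}))          # False: sums to 1.4
print(is_valid_pmf({1: -0.5, 2: 1.5}))         # False: negative probability
```

Note that the 0-to-1 bound in the third property follows automatically from the other two, so only two checks are needed.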

**Conclusion**

Probability concepts such as the Probability Mass Function are very useful in the data science domain. They may not appear in every part of a data science project, or even in every project, but that does not diminish the importance of probability theory in the field.

The applications of probability theory have produced great results not only in data science but in other domains of industry as well, since it helps uncover interesting insights and supports decision-making, which always makes it worth learning.

This article provided an overview of the importance of probability in the field of data science and introduced basic concepts such as the probability distribution and the probability mass function. It focused mainly on discrete variables, since the probability mass function applies to them. The terminology used for continuous variables is different, but the overall ideas remain similar to those explained in this article.

If you are curious to learn about data science, check out IIIT-B & upGrad's PG Diploma in Data Science which is created for working professionals and offers 10+ case studies & projects, practical hands-on workshops, mentorship with industry experts, 1-on-1 with industry mentors, 400+ hours of learning and job assistance with top firms.