Data Science has grown into a truly interdisciplinary field that borrows from computer science, mathematics, statistics, and data analysis. Its advancements have helped businesses around the globe make far more informed, data-backed decisions. As a result, companies today realise the importance of the data they have acquired over the years.
Data scientists use advanced tools to assess current business scenarios using existing data, derive relationships and find insightful patterns. This method is known as Descriptive Analytics. Further, data scientists also study causes and their effects, keeping various dependent and independent variables in mind – an approach known as Predictive Analytics.
Since Predictive Analytics works by identifying cause and effect relationships, it is beneficial for making insightful decisions for the future. However, this is not as straightforward as it might seem. Any business has a lot of variables to deal with – including current insights, constraints, and more.
To predict accurately, you must consider these variables and arrive at the optimum solution. This is where Linear Programming comes into the picture. Linear Programming is an important algorithmic technique that helps data scientists find the optimal solution for various problems. Linear Programming considers all the essential variables, equalities, and inequalities when arriving at the final solution, helping ensure that the resulting decision is well-grounded.
In this article, let’s look at what Linear Programming is, the different methods of Linear Programming and a sample Linear Programming problem!
Linear Programming in Predictive Analytics
Before beginning with the technicalities, it is crucial to note that programming in the context of Linear Programming does not refer to computer or software programming. Rather, Linear Programming is essentially an optimisation technique (Linear Optimisation) for finding the best outcome from a mathematical model. To formulate a linear program, it is important to understand the basic elements of Linear Programming, which include:
- Decision Variables: This refers to the variables that we would like to determine, the unknowns.
- Objective Function: This refers to the linear function representing the quantities that need to be minimised or maximised.
- Constraints: This is a set of inequalities or equalities representing all the restrictions on our decision variable.
- Non-negative restrictions: A standard requirement that the values of the decision variables must be greater than or equal to zero.
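Put together, these elements can be written compactly in the standard form of a linear program (a sketch, writing the decision variables as a vector x):

```latex
\max_{x}\; c^{\top} x
\quad \text{subject to} \quad
A x \le b, \qquad x \ge 0
```

Here c holds the objective-function coefficients, and each row of A together with the corresponding entry of b encodes one constraint.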
With the basic terms settled, let’s now look at what approaches one can take while solving a Linear Programming problem.
Solving Linear Programming
We can follow these four steps to solve a Linear Programming problem successfully:
- Identifying decision variables
- Developing the objective function
- Specifying the constraints
- Stating the non-negativity restrictions
We will dive deeper into these steps later when we look at a solved example of Linear Programming. But before that, let’s look at the various ways you can approach a Linear Programming problem. There are broadly four approaches to choose from:
- Graphical Method: The graphical method is the most basic method used to solve a Linear Programming problem, and it is used when there are only two decision variables to consider. It involves forming a set of linear inequalities subject to the relevant constraints, then plotting them on the X-Y plane. The region of intersection that satisfies all the constraints is the feasible region, and the optimal solution lies at a corner point (vertex) of this region.
- Simplex Method: This is a powerful method for solving Linear Programming problems, and it follows an iterative procedure to arrive at the optimal solution. In this approach, the essential variables are modified until the max or min value (as required) is achieved for the initial objective function.
- Northwest Corner and Least Cost Method: These are specific types of methods essentially used for transportation problems to determine the best way to transport products or goods. As a result, this is a handy optimisation method for supply-demand issues. The assumption for this method is that there is only one product. However, the demand for this product comes from various sources, which all cumulatively make up the total supply. Therefore, this method aims to minimise the cost of transportation.
- Solving using R: R is one of the most widely used tools for data science and data analysis. R makes it very easy to perform optimisation in just a few lines of code using the lpSolve package.
- Solving using solver tools: The last method uses one of the many solver tools available for optimisation problems. One open-source example is OpenSolver, a linear optimiser for Excel that works seamlessly for up to 100 variables. Apart from that, CPLEX, Gurobi, and MATLAB’s optimisation toolbox are other widely used solvers, though these are commercial rather than open-source.
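As a minimal sketch of the tool-based approach, the same kind of optimisation can be done in Python with SciPy’s `linprog` solver (a hypothetical toy problem for illustration; assumes SciPy is installed — note that `linprog` minimises, so we negate the objective coefficients to maximise):

```python
from scipy.optimize import linprog

# Toy problem: maximise 3x + 4y
# subject to x + y <= 10, x <= 6, and x, y >= 0.
res = linprog(
    c=[-3, -4],                      # negated objective: minimise -(3x + 4y)
    A_ub=[[1, 1], [1, 0]],           # left-hand sides of the <= constraints
    b_ub=[10, 6],                    # right-hand sides of the <= constraints
    bounds=[(0, None), (0, None)],   # non-negativity restrictions
    method="highs",
)
print(res.x, -res.fun)  # optimal point and maximised objective value
```

For this toy problem the solver puts all the weight on y (the higher-valued variable), giving the point (0, 10) and an objective value of 40.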
Sample Graphical Solving of Linear Programming
During yearly festive seasons, a company takes two factors into account – X and Y – to create a gift pack. The total weight of the pack must be 5kg, there must not be more than 4kg of Y, and there must be at least 2kg of X. X and Y contribute to the profit as follows: Rs. 5/kg for X and Rs. 6/kg for Y.
Let’s try to solve this Linear Programming problem to arrive at the best mix which results in the highest profits for the company.
1. Working with our primary function
Our problem’s optimisation goal is profit maximisation. The profit contribution of X and Y is given to us in the problem statement. Now,
- Let a = kg of X in the pack
- Let b = kg of Y in the pack
- Our objective function then becomes c = 5a + 6b, and we need to maximise c.
Here, a and b are our decision variables, while c is the objective function.
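As a quick sketch, this objective is just a linear function of the two decision variables:

```python
# Objective function: profit (in Rs.) from a kg of X and b kg of Y
def profit(a, b):
    return 5 * a + 6 * b

print(profit(1, 1))  # each kg of Y contributes one rupee more than a kg of X
```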
2. Developing the constraints from the problem
We are given the following constraints in the problem:
- Weight of the gift pack must be 5kg => a + b = 5
- Not more than 4kg of Y and at least 2kg of X => a >= 2; b <= 4
3. Non-negative constraints
The quantities for X and Y cannot be negative => a, b >= 0
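Taken together, the constraints from steps 2 and 3 can be sketched as a single feasibility check (an illustrative helper, not part of the original problem statement):

```python
# Returns True when a kg of X and b kg of Y satisfy every constraint
def feasible(a, b):
    return (
        a + b == 5             # total pack weight must be exactly 5kg
        and a >= 2             # at least 2kg of X
        and b <= 4             # not more than 4kg of Y
        and a >= 0 and b >= 0  # non-negativity restrictions
    )

print(feasible(2, 3), feasible(1, 4))  # the second mix has too little X
```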
Now, let us quickly summarise the whole problem as we have laid it out so far:
We need to maximise c = 5a + 6b under the following conditions:
- a + b = 5
- a >= 2; b <= 4
- a, b >= 0
We’re using the graphical method to solve this problem, so let us consider a 2-dimensional graph with the X-Y axes and try to plot the equations and inequalities. We will have the following:
- a + b = 5 is a straight line that cuts the x-axis at (5,0) and the y-axis at (0,5). Since we have an equality sign in our expression, every feasible point must lie on this line itself, not merely in the area it bounds.
- a >= 2 is a vertical line cutting the x-axis at (2,0). Since our expression has a greater-than constraint, the feasible region lies to the right of this line.
- b <= 4 is a horizontal line cutting the y-axis at (0,4). Since we have a less-than constraint, the feasible region is the area below this line.
- Finally, since a and b are both positive values, our area of concern is the first quadrant.
If you plot these lines and constraints on a graph sheet, you will see that the region satisfying all the conditions is the segment of the line a + b = 5 between the points (2,3) and (5,0). These two endpoints are the candidates for profit maximisation. To find which of the two gives the better profit, we can simply substitute each point into our objective function and see which yields the higher value:
- At (2,3): c = 5*2 + 6*3 = 28
- At (5,0): c = 5*5 + 6*0 = 25
As you can see, we get a higher profit value for the first point, (2,3). So the mix that gives the best profit is 2kg of factor X and 3kg of factor Y, for a profit of Rs. 28!
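The graphical reasoning above can be double-checked in a few lines of Python by evaluating the objective at the two candidate corner points:

```python
# Candidate corner points of the feasible segment a + b = 5, a >= 2, b <= 4
corners = [(2, 3), (5, 0)]

def profit(a, b):
    return 5 * a + 6 * b  # objective function c = 5a + 6b

best = max(corners, key=lambda point: profit(*point))
print(best, profit(*best))  # the optimal mix and its profit
```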
There is no end to optimisation problems – especially when we talk in a business context. Businesses face optimisation challenges more frequently than they would like to. As a result, just the graphical method is not enough to solve more technical optimisation problems.
You need to understand the relevant tools or programming languages to perform linear optimisation on multivariable problems successfully. The good news is that they are not difficult to get the hang of. The field of data science is highly welcoming, making it easier for people from any background with an interest to build a data science career.
At upGrad, we have guided students from around the globe with diverse backgrounds and helped them develop the relevant theoretical knowledge and practical expertise required to succeed in data science. Check out our Executive Post Graduate Programme in Data Science for more information on our course structure, syllabus, and the upGrad advantage!
1. Is Linear Programming related to computer programming?
No, linear programming refers to performing optimisations under constraints. It is not related to computer programming in the traditional sense of the word.
2. When is Linear Programming needed the most?
You need linear programming when optimising some decision variables under various constraints. It comes in handy during predictive analytics and helps businesses make better predictions.
3. Does Linear Programming need to be done manually?
No, there are various tools - both open-source and otherwise - that can help you perform linear programming.