
What is Ridge Regression in Machine Learning?

Last updated: 16th Aug, 2023

Introduction

The goal of machine learning algorithms is to build models that correctly predict outcomes on new, unseen input. When dealing with complex datasets, models may suffer from overfitting: they perform remarkably well on training data but fail to generalize to fresh data. Ridge Regression, also known as L2 regularization, is a powerful method for preventing overfitting and improving the performance of linear regression models. This in-depth guide covers the details of Ridge Regression, including its formula, its implementation in Python using sklearn, and a comparison of Lasso and Ridge Regression.

Understanding Ridge Regression

Ridge Regression is a linear regression method that improves on the traditional least squares approach by introducing a penalty term in the loss function. This penalty term prevents the model from relying too heavily on any single feature, reducing the impact of multicollinearity. The strength of the penalty is controlled by a regularization parameter, often called lambda (or alpha).

What is Multicollinearity?

Multicollinearity is a phenomenon in which one predictor variable in a multiple regression model can be linearly predicted from the others with a substantial degree of accuracy.
In other words, multicollinearity occurs when two or more predictor variables are highly correlated with one another.

In modeled data, multicollinearity can be defined as the presence of correlation between independent variables. As a result, estimates of the regression coefficients may become unreliable.
It can also inflate the standard errors of the regression coefficients and reduce the power of any t-tests.


In addition to increasing model redundancy and reducing the effectiveness and dependability of predictions, multicollinearity can produce misleading results and p-values.
Multicollinearity can be introduced by combining multiple data sources. It can also arise from constraints on linear or demographic models, an overly complex model, outliers, or choices made in model design or during data collection.

Multicollinearity may also be introduced during the data collection process if an inappropriate sampling method is used. It can occur even when the sample size is smaller than expected.

If the model is overspecified (that is, there are more variables than observations), multicollinearity will be visible; you can avoid this while the model is being designed.
Removing outliers (extreme variable values that might produce multicollinearity) can help reverse multicollinearity. A quick way to check a dataset for multicollinearity is sketched below.
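
One common diagnostic is the variance inflation factor (VIF). Below is a minimal sketch, assuming synthetic data with made-up column names, that computes VIFs with statsmodels; a VIF above roughly 5-10 is usually taken as a sign of problematic multicollinearity.

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
size = 200
x1 = rng.normal(size=size)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=size)  # nearly a copy of x1
x3 = rng.normal(size=size)                   # independent of the others
X = pd.DataFrame({"x1": x1, "x2": x2, "x3": x3})

# VIFs are conventionally computed with an intercept column included.
X_const = sm.add_constant(X)
for i, col in enumerate(X_const.columns):
    if col == "const":
        continue
    print(col, variance_inflation_factor(X_const.values, i))
# x1 and x2 get very large VIFs; x3 stays close to 1.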

Check out upGrad’s free courses on AI.

How does Ridge Regression function?

Let’s look at the mathematical formula for Ridge Regression to get a better understanding of how it functions. Ridge Regression chooses the coefficients that minimize:

β̂ = argmin_β ( ||y − Xβ||² + λ||β||² )

In the formula:

– y = target variable.

– X = matrix of independent variables.

– β = coefficients of independent variables.

– λ = (lambda) regularization parameter.

Ridge regression performs L2 regularization: the penalty added to the loss function is proportional to the square of the magnitude of the coefficients. Given a response vector y ∈ R^n and a prediction matrix X ∈ R^(n×p), the minimization objective can be written as:

minimize ||y − Xβ||² + λ Σⱼ βⱼ²

  • λ is the deciding factor that determines the severity of the penalty term.
  • When λ = 0, the objective reduces to basic linear regression, and you obtain the same coefficients as with simple linear regression.
  • When λ = ∞, the coefficients obtained are zero: with infinite weight on the square of the coefficients, any nonzero coefficient makes the objective infinite.
  • When 0 < λ < ∞, the magnitude of λ determines the weightage assigned to the different parts of the objective.
  • The minimization objective is LS Obj + λ × (the sum of the squares of the coefficients).

In this case, LS Obj stands for Least Square Objective. This is the linear regression objective without regularization.

Ridge regression tends to introduce some bias, as the coefficients are shrunk towards zero. However, it can also significantly reduce variance, giving you a superior mean squared error. λ controls the ridge penalty and therefore the amount of shrinkage: a large λ indicates a higher degree of shrinkage, and different values of λ produce different coefficient estimations.
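
To make this concrete, here is a minimal sketch, using scikit-learn’s Ridge on synthetic data of our own invention, that prints the fitted coefficients for increasing values of alpha (scikit-learn’s name for λ):

import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
y = X @ np.array([3.0, 1.5, -2.0]) + rng.normal(size=100)

# As alpha (lambda) grows, every coefficient is pulled towards zero,
# but none is set exactly to zero.
for alpha in [0.01, 1.0, 10.0, 100.0, 1000.0]:
    model = Ridge(alpha=alpha).fit(X, y)
    print(f"alpha={alpha:>7}: coefficients={np.round(model.coef_, 3)}")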

By the end of this article, you will have a thorough understanding of the Ridge Regression formula and its importance in the field of machine learning. You can build on that understanding via the Executive PG Program in Machine Learning & AI from IIITB.

Where is Ridge Regression used?

Ridge Regression is used when the number of predictor variables in a given set exceeds the number of observations, or when the dataset exhibits multicollinearity. It is mostly used to obtain stable coefficient estimates for multiple-regression data affected by multicollinearity; unlike Lasso, it does not produce sparse models.


Ridge Regression vs Lasso Regression

Ridge Regression and Lasso Regression (L1 regularization) are two popular regularization methods for avoiding overfitting. Although both add a penalty term to the loss function, they differ in the kind of penalty imposed.

Comparison of Penalty Terms:

In contrast to Lasso Regression, which adds the sum of the absolute values of the coefficients to the loss function, Ridge Regression adds the sum of the squares of the coefficients. Because the penalties differ, the two models behave differently.

Use Cases:

Ridge Regression frequently keeps all pertinent features in the model, with smaller yet non-zero coefficients. It works well when the majority of features affect the target variable and feature selection is not the main priority.

Lasso Regression, by contrast, tends to shrink some of the coefficients exactly to zero, effectively performing feature selection. It is helpful when a dataset contains a large number of redundant or irrelevant attributes.
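
The difference is easy to see side by side. In this minimal sketch (synthetic data and arbitrary alpha values of our own choosing), only two of five features actually matter; Ridge keeps all coefficients small but non-zero, while Lasso drives the irrelevant ones to (or very near) zero:

import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
# Only the first two features influence y; the rest are pure noise.
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(size=200)

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

print("Ridge coefficients:", np.round(ridge.coef_, 3))  # all non-zero
print("Lasso coefficients:", np.round(lasso.coef_, 3))  # noise features shrink to ~0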

How does Ridge Regression deal with Multicollinearity?

Although least squares estimates have a tendency to be unbiased in the presence of multicollinearity, their enormous variances mean that they may be significantly off from the true value. By adding some bias to the regression estimates, ridge regression lowers the standard errors. In essence, it seeks to obtain more accurate estimates. Learn the complexities of this via the Executive PG Program in Data Science & Machine Learning from the University of Maryland.
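
As an illustration of this bias-variance trade-off, the following sketch (again with synthetic data of our own making) refits ordinary least squares and ridge on fresh samples of two nearly identical predictors; the OLS coefficients swing wildly between runs, while the ridge coefficients stay small and stable:

import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(7)
for trial in range(3):  # refit on fresh samples to expose the variance
    x1 = rng.normal(size=100)
    x2 = x1 + 0.01 * rng.normal(size=100)  # almost perfectly collinear with x1
    X = np.column_stack([x1, x2])
    y = 2.0 * x1 + rng.normal(size=100)

    ols = LinearRegression().fit(X, y)
    ridge = Ridge(alpha=1.0).fit(X, y)
    print(f"trial {trial}: OLS={np.round(ols.coef_, 2)}, ridge={np.round(ridge.coef_, 2)}")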

Implementing Ridge Regression in Python 

In Python, there are many ridge regression implementations available, including the Ridge and RidgeCV classes from scikit-learn's sklearn.linear_model module. A quick scikit-learn example is shown first, followed by code that implements ridge regression from scratch using only NumPy.

In contrast to ordinary linear regression, which minimizes only the sum of squared errors, ridge regression adds a penalty term on the sum of squared coefficients. The strength of this penalty term is controlled by the alpha value.

In scikit-learn, ridge regression is performed with the Ridge class; the alpha argument passed to the Ridge class determines how much regularization will be used, as the short sketch below shows.
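
Here is a minimal sketch, assuming synthetic data in place of a real dataset, of the typical scikit-learn workflow: split the data, fit Ridge with alpha=0.1, and evaluate on the held-out test set:

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = 2 * rng.random((100, 3))
y = 4 + X @ np.array([3.0, 1.5, -2.0]) + rng.normal(size=100)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = Ridge(alpha=0.1).fit(X_train, y_train)  # alpha sets the penalty weight
print("Test MSE:", mean_squared_error(y_test, model.predict(X_test)))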

The longer example below implements the same idea from scratch. It generates synthetic data, fits a ridge regression model using the closed-form solution, and then uses the fitted model to make predictions on new data.

When a RidgeRegression instance is created, alpha is set to 0.1. The alpha value determines the weight of the penalty term: a higher alpha means the penalty term is weighted more heavily, whereas a lower alpha means it is weighted less heavily. In this case, the alpha value is set to 0.1, so the penalty term is given a weight of 0.1.

import numpy as np

class RidgeRegression:
    def __init__(self, alpha=1.0):
        self.alpha = alpha  # Regularization strength (lambda)

    def fit(self, X, y):
        # Add a column of ones to the feature matrix X for the intercept term
        X = np.c_[np.ones(X.shape[0]), X]
        n_features = X.shape[1]

        # Calculate the ridge matrix (X^T X + alpha * I), where I is the identity matrix.
        # (For simplicity this also penalizes the intercept; scikit-learn's Ridge does not.)
        ridge_matrix = np.dot(X.T, X) + self.alpha * np.eye(n_features)

        # Calculate the ridge coefficients from the closed-form solution
        ridge_coefficients = np.linalg.solve(ridge_matrix, X.T.dot(y))

        # Extract the intercept and coefficients
        self.intercept_ = ridge_coefficients[0]
        self.coef_ = ridge_coefficients[1:]

    def predict(self, X):
        # Add a column of ones to the feature matrix X for the intercept term
        X = np.c_[np.ones(X.shape[0]), X]
        return X.dot(np.concatenate(([self.intercept_], self.coef_)))

# Example usage:
if __name__ == "__main__":
    # Generate some example data
    np.random.seed(42)
    X = 2 * np.random.rand(100, 3)
    y = 4 + np.dot(X, np.array([3, 1.5, -2])) + np.random.randn(100)

    # Make and apply the Ridge Regression model
    alpha = 0.1  # Regularization strength
    ridge_model = RidgeRegression(alpha=alpha)
    ridge_model.fit(X, y)

    # Make predictions on new data
    new_data = np.array([[1, 2, 3], [4, 5, 6]])
    predictions = ridge_model.predict(new_data)
    print("Predictions:", predictions)


Advantages of Ridge Regression


Following are the advantages of Ridge Regression: 

  • Effective at dealing with multicollinearity: Ridge Regression lessens the effect of correlated features, making it appropriate for datasets with high collinearity.
  • Resistance to noisy data: the penalty term limits how strongly extreme values can pull the coefficients, making predictions more stable.
  • Consistent outcomes: Ridge Regression offers more reliable findings than simple least squares, particularly when working with noisy data.

Enroll for the Machine Learning Course from the World’s top Universities. Earn Masters, Executive PGP, or Advanced Certificate Programs to fast-track your career.

Disadvantages of Ridge Regression 

Following are the disadvantages of Ridge Regression:

  • Limited feature selection: Ridge Regression keeps every feature in the model, which might not be ideal in situations where feature selection is essential.
  • Sensitivity to the regularization parameter: the effectiveness of Ridge Regression depends on the regularization parameter that is used. To get the best results, this parameter needs to be tuned carefully, as sketched below.
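
One practical way to handle this tuning is cross-validation. Here is a minimal sketch using scikit-learn’s RidgeCV, with an arbitrary candidate grid of alphas and synthetic data:

import numpy as np
from sklearn.linear_model import RidgeCV

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 4))
y = X @ np.array([2.0, 0.0, -1.0, 0.5]) + rng.normal(size=150)

# RidgeCV picks the best alpha from the grid via cross-validation.
model = RidgeCV(alphas=[0.01, 0.1, 1.0, 10.0, 100.0]).fit(X, y)
print("Best alpha:", model.alpha_)
print("Coefficients:", np.round(model.coef_, 3))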

Conclusion

In this thorough article, we looked into Ridge Regression, a potent machine learning method that helps improve the performance of linear regression models by preventing overfitting. By including a penalty term, Ridge Regression produces reliable models that generalize well to fresh, untested data. We also contrasted Ridge Regression with Lasso Regression and outlined each method’s unique use cases. With this knowledge, you can use Ridge Regression to enhance your machine learning models and produce more precise predictions. Understand the intricacies of regression models via the Advanced Certificate Program in Generative AI.


Pavan Vadapalli

Blog Author
Director of Engineering @ upGrad. Motivated to leverage technology to solve problems. Seasoned leader for startups and fast-moving orgs. Working on solving problems of scale and long-term technology strategy.

Frequently Asked Questions (FAQs)

1. What sets Ridge Regression and Lasso Regression apart from one another?

The main distinction is in the kind of penalty they impose. While Lasso Regression uses the sum of the absolute values of the coefficients, Ridge Regression uses the sum of the squares of the coefficients.

2. How is multicollinearity handled by Ridge Regression?

Ridge Regression reduces the impact of multicollinearity by shrinking the coefficients of correlated variables towards zero.

3. Can I use Ridge Regression to perform variable selection?

Ridge Regression does not exclude any variables from the model. Lasso Regression can be a better option if you need feature selection.
