Lasso Regression

Just like Ridge Regression, it adds a penalty / Regularisation term to regularise the slope of the Regression line, and it also performs Feature Selection. The penalty is lambda times the absolute value of the slope, where lambda controls the strength of the Regularisation.

Minimize the sum of squared residuals plus a penalty or Regularisation term: lambda times the sum of the absolute values of the weights (the L1 norm).
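A minimal sketch of that objective, assuming a hypothetical helper `lasso_loss` over a design matrix `X`, targets `y`, weights `beta`, and penalty strength `lam`:

```python
import numpy as np

def lasso_loss(X, y, beta, lam):
    """Sum of squared residuals plus the L1 penalty (hypothetical helper)."""
    residuals = y - X @ beta
    return np.sum(residuals ** 2) + lam * np.sum(np.abs(beta))

# Perfect fit: the residual term is zero, only the penalty remains.
X = np.array([[1.0], [2.0]])
y = np.array([1.0, 2.0])
beta = np.array([1.0])
print(lasso_loss(X, y, beta, lam=0.5))  # → 0.5
```

With lambda = 0 this reduces to ordinary least squares; larger lambda pushes the weights towards zero.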

Used to prevent Overfitting.

It can also exclude useless variables/features completely by shrinking their weights (slopes) all the way down to zero. This allows us to do Feature Selection and also to infer some kind of Feature Importance from the weight plots:

In this case the windspeed and humidity don't seem to play such a huge role. To find the optimal value for lambda, use K-Fold Cross Validation.
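A sketch of both points with scikit-learn's `LassoCV`, which picks lambda (called `alpha` there) via K-Fold Cross Validation. The data here is synthetic, standing in for the bike-sharing-style features from the plot; the last two columns play the role of windspeed and humidity and carry no signal:

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))  # columns 2 and 3 stand in for windspeed / humidity
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=100)  # only first two matter

model = LassoCV(cv=5).fit(X, y)  # 5-fold CV searches over lambda (alpha) values
print(model.alpha_)              # the chosen Regularisation strength
print(model.coef_)               # weights of the useless features shrunk to (near) zero
```

The near-zero entries in `model.coef_` are exactly the features Lasso would drop.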

Classification

This method also works for Logistic Regression Classification.
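A sketch of the same idea for classification, using scikit-learn's `LogisticRegression` with an L1 penalty on a synthetic dataset (`C` is the inverse of lambda, so a small `C` means strong Regularisation):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic data: only 2 of the 6 features carry real signal.
X, y = make_classification(n_samples=200, n_features=6, n_informative=2,
                           n_redundant=0, random_state=0)

clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
print(clf.coef_)  # the L1 penalty zeroes out uninformative features
```

Note that only some solvers (e.g. `liblinear`, `saga`) support the L1 penalty.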

Explainability

Just like in Linear Regression, with the added benefit of automatic Feature Selection and Regularisation.

Resources