Ridge regression modifies the ordinary least squares (OLS) method by adding a penalty term proportional to the sum of the squares of the coefficients. This penalty, controlled by the regularization parameter λ, shrinks the coefficients toward zero without setting any of them exactly to zero (in contrast to Lasso regression, which can). The modified loss function can be expressed as:
Loss = RSS + λ * Σβ²
where RSS is the residual sum of squares and β represents the coefficients. By tuning λ, researchers can balance the trade-off between fitting the training data closely and keeping the coefficients small: larger values of λ reduce variance at the cost of some added bias.
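The loss above has a well-known closed-form minimizer, β = (XᵀX + λI)⁻¹ Xᵀy. The following is a minimal sketch of that solution in NumPy; the function name `ridge_fit` and the synthetic data are illustrative assumptions, and the intercept is omitted for simplicity.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Ridge coefficients via the closed form (X^T X + lam*I)^(-1) X^T y.

    Note: this sketch penalizes all columns of X equally and does not
    fit an intercept.
    """
    n_features = X.shape[1]
    # Solve the regularized normal equations instead of inverting explicitly
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# Synthetic example (illustrative): 100 samples, 3 features
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_beta = np.array([2.0, -1.0, 0.5])
y = X @ true_beta + rng.normal(scale=0.1, size=100)

beta_ols = ridge_fit(X, y, lam=0.0)     # lambda = 0 recovers OLS
beta_ridge = ridge_fit(X, y, lam=10.0)  # larger lambda shrinks coefficients
```

With λ = 0 the penalty vanishes and the formula reduces to the OLS solution; as λ grows, the L2 norm of the fitted coefficients shrinks, which is the behavior described above.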