Machine Learning: Understanding Regularization and Feature Scaling


In machine learning, regularization is a powerful technique used to tackle the common issue of overfitting. It works by adding a penalty term to the cost function of the model; this penalty term controls the complexity of the model, preventing it from becoming too intricate and thereby reducing the risk of overfitting.
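As a minimal sketch of this idea, the snippet below adds an L2 penalty to a mean-squared-error cost. The function name `regularized_mse` and the strength parameter `lam` are illustrative choices, not part of any particular library:

```python
import numpy as np

def regularized_mse(w, X, y, lam=0.1):
    """MSE cost plus an L2 penalty term.

    `lam` is the regularization strength (a hypothetical name here):
    larger values punish large weights more heavily.
    """
    residuals = X @ w - y
    mse = np.mean(residuals ** 2)        # data-fit term
    penalty = lam * np.sum(w ** 2)       # complexity penalty on the weights
    return mse + penalty

# Toy data: a model with larger weights incurs a larger penalized cost.
X = np.array([[1.0, 2.0], [3.0, 4.0]])
y = np.array([1.0, 2.0])
small_w = np.array([0.1, 0.1])
large_w = np.array([5.0, 5.0])
```

With `lam=0.0` the function reduces to the plain MSE, which is one quick way to sanity-check a penalized cost implementation.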

There are primarily two types of regularization: L1 regularization (Lasso), which adds the sum of the absolute values of the model's coefficients to the cost function, and L2 regularization (Ridge), which adds the sum of their squared values. Between the two, L2 generally outperforms L1 and is computationally more efficient, since its penalty is smooth and differentiable everywhere.
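The contrast between the two penalties can be seen directly from their values and gradients. In this small numpy sketch (the weight vector `w` is just example data), the L1 gradient has constant magnitude regardless of a weight's size, which is what lets it push weights to exactly zero, while the L2 gradient shrinks weights in proportion to their size:

```python
import numpy as np

w = np.array([0.5, -2.0, 0.0, 1.5])  # example model weights

l1_penalty = np.sum(np.abs(w))   # Lasso penalty: sum of absolute values
l2_penalty = np.sum(w ** 2)      # Ridge penalty: sum of squares

# (Sub)gradients of each penalty with respect to the weights:
l1_grad = np.sign(w)   # constant-magnitude pull toward zero -> sparse solutions
l2_grad = 2 * w        # proportional shrinkage -> smooth, never exactly zero
```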

Feature scaling standardizes the range of features in the data during the preprocessing stage. Many machine learning algorithms, such as gradient-descent-based and distance-based methods, are sensitive to the relative scales of their input features. Although feature scaling and normalization seem similar, they serve different purposes in data preprocessing. Data normalization is a linear scaling technique that transforms data values (Xi) into a range of 0 to 1.
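A minimal sketch of this min-max normalization, using numpy (the function name `min_max_normalize` and the sample data are illustrative):

```python
import numpy as np

def min_max_normalize(x):
    """Linearly rescale values so the minimum maps to 0 and the maximum to 1."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

# Example: heights in centimeters rescaled to the [0, 1] range.
heights_cm = np.array([150.0, 160.0, 170.0, 180.0, 190.0])
scaled = min_max_normalize(heights_cm)
```

After scaling, the smallest value becomes 0, the largest becomes 1, and every other value lands proportionally in between.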
