Hi folks,
Feature scaling is one of the most important steps in data preparation. Whether to use it depends on the algorithm you are using.
Many of us still wonder: why is feature scaling required? Why do we need to scale the variables?
1. Having features on the same scale lets them contribute equally to the result, which can enhance the performance of machine learning algorithms.
2. If you don't scale features, large-scale variables will dominate the small-scale ones.
Example: Suppose the dataset contains an X variable (a 2-digit number) and a Y variable (a 5-6 digit number). There is a huge gap in scale, and we don't want our algorithm to be biased towards one feature. That bias can hurt the accuracy of the model, degrade the performance of the algorithm, or lead to wrong predictions.
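A minimal sketch of this dominance effect, using plain Python and made-up values (the two points, and the min-max ranges used for scaling, are all hypothetical):

```python
import math

# Hypothetical rows: X is a 2-digit feature, Y is a 6-digit feature
a = (25, 150000)
b = (90, 155000)  # X changes drastically, Y changes only ~3%

def euclidean(p, q):
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

# Raw distance is almost entirely Y's difference (5000 vs X's 65)
print(euclidean(a, b))  # ~5000.4

# After min-max scaling (assumed ranges: X in [10, 99], Y in [100000, 999999])
def minmax(v, lo, hi):
    return (v - lo) / (hi - lo)

a_s = (minmax(25, 10, 99), minmax(150000, 100000, 999999))
b_s = (minmax(90, 10, 99), minmax(155000, 100000, 999999))

# Now X's large relative change drives the distance instead
print(euclidean(a_s, b_s))  # ~0.73
```

Before scaling, Y's absolute difference swamps X even though X changed far more in relative terms; after scaling, the distance reflects how much each feature actually moved within its own range.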
Certain machine learning algorithms are sensitive to feature scaling (standardization and normalization of numerical variables): distance-based algorithms, curve-based algorithms, matrix factorization, decomposition and dimensionality reduction methods, and gradient-descent-based algorithms.
On the other hand, certain tree-based algorithms are insensitive to feature scaling because they are rule-based: Classification and Regression Trees, Random Forests, and Gradient Boosted Decision Trees.
Cons of feature scaling: you lose the original values when transforming them, so some interpretability of the values is lost.
Standardization v/s Normalization
Standardization: the idea behind standardizing your data before applying a machine learning algorithm is to transform it so that its distribution has mean 0 and standard deviation 1:
Mu = 0
Sd = 1
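The transformation is the z-score, (x - Mu) / Sd. A small sketch with made-up values, using only the Python standard library (scikit-learn's StandardScaler does the same thing at scale):

```python
from statistics import mean, pstdev

# Hypothetical feature values
values = [12.0, 15.0, 20.0, 18.0, 35.0]

mu = mean(values)    # Mu of the raw data
sd = pstdev(values)  # Sd of the raw data (population standard deviation)

# z-score: subtract the mean, divide by the standard deviation
standardized = [(x - mu) / sd for x in values]

print(round(mean(standardized), 10))    # mean is now 0
print(round(pstdev(standardized), 10))  # standard deviation is now 1
```

Note that standardization does not bound the values to a fixed range; outliers stay visible as large z-scores.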
Normalization:
This method rescales the data into the range 0 to 1, so it is also called Min-Max scaling.
Cons: this method loses some information in the data, such as outliers.
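A sketch of Min-Max scaling on a hypothetical feature containing one outlier, showing the information-loss cost mentioned above (scikit-learn's MinMaxScaler implements the same formula):

```python
# Hypothetical feature with an outlier (100.0)
values = [10.0, 12.0, 11.0, 13.0, 100.0]

lo, hi = min(values), max(values)

# Min-Max scaling: (x - min) / (max - min), maps the data into [0, 1]
normalized = [(x - lo) / (hi - lo) for x in values]

# The outlier maps to 1.0 and squeezes the typical values (10-13)
# into a tiny band near 0, hiding their differences
print(normalized)
```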
For most applications, standardization performs better than normalization.
**Note: for the best possible results, fit the model three ways (on the raw data as-is, on the normalized data, and on the standardized data) and compare the results.
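The three-way comparison can be sketched with a tiny 1-nearest-neighbour classifier in plain Python. Everything here is a made-up toy example: the training points, the query, and the fact that the true label depends only on the small feature are all assumptions chosen to make the contrast visible.

```python
import math
from statistics import mean, pstdev

# Hypothetical toy data: column 0 is a 2-digit feature, column 1 is a
# 6-digit feature; the label really depends only on the small feature
train_X = [(20, 100000), (25, 900000), (80, 110000), (85, 910000)]
train_y = [0, 0, 1, 1]
query = (22, 115000)  # its true label is 0

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def nn_predict(X, y, q):
    # 1-nearest-neighbour: return the label of the closest training point
    return min(zip(X, y), key=lambda pair: euclidean(pair[0], q))[1]

def columns(X):
    return list(zip(*X))

def minmax_transform(X, q):
    # Normalization: rescale each column into [0, 1]
    ranges = [(min(c), max(c)) for c in columns(X)]
    scale = lambda p: tuple((v - lo) / (hi - lo) for v, (lo, hi) in zip(p, ranges))
    return [scale(p) for p in X], scale(q)

def zscore_transform(X, q):
    # Standardization: rescale each column to mean 0, standard deviation 1
    params = [(mean(c), pstdev(c)) for c in columns(X)]
    scale = lambda p: tuple((v - mu) / sd for v, (mu, sd) in zip(p, params))
    return [scale(p) for p in X], scale(q)

# Raw data: the 6-digit feature dominates the distance, prediction is wrong
raw_pred = nn_predict(train_X, train_y, query)

# Normalized and standardized data: both features contribute,
# and the prediction matches the true label
X_norm, q_norm = minmax_transform(train_X, query)
X_std, q_std = zscore_transform(train_X, query)

print(raw_pred, nn_predict(X_norm, train_y, q_norm),
      nn_predict(X_std, train_y, q_std))  # raw is wrong, both scaled are right
```

On this toy data both scaled variants agree; on real data they can differ, which is exactly why the note suggests fitting and comparing all three.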
Hope this is useful.
Happy Learning!😊🙋