Normalization (Min-Max)
Description
The Min-Max normalization technique scales each feature's values to a specified range, typically \([0, 1]\) for classical machine learning models (and sometimes \([-1, 1]\) for deep learning models).
Formula
Standard version:
\[ x' = \frac{x - \min(x)}{\max(x) - \min(x)} \]
where \(x\) is an original value and \(x'\) is the normalized value.
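To make the formula concrete, here is a minimal NumPy sketch (the array values are made up for illustration):
import numpy as np

x = np.array([2.0, 4.0, 6.0, 10.0])            # made-up example values
x_scaled = (x - x.min()) / (x.max() - x.min())  # min-max formula above
print(x_scaled)                                 # [0.   0.25 0.5  1.  ]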
Generalized version (rescales values to an arbitrary range \([a, b]\)):
\[ x' = a + \frac{(x - \min(x))(b - a)}{\max(x) - \min(x)} \]
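The generalized formula is a one-line function. A minimal sketch, assuming NumPy (min_max_scale is an illustrative name, not a library function):
import numpy as np

def min_max_scale(x, a=0.0, b=1.0):
    # Linearly rescale x so min(x) maps to a and max(x) maps to b.
    x = np.asarray(x, dtype=float)
    return a + (x - x.min()) * (b - a) / (x.max() - x.min())

print(min_max_scale([2.0, 4.0, 6.0, 10.0], a=-1.0, b=1.0))  # [-1.  -0.5  0.   1. ]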
Example
from sklearn.preprocessing import MinMaxScaler

# housing_num is assumed to be a DataFrame of numeric features.
min_max_scaler = MinMaxScaler(feature_range=(-1, 1))  # scale each feature to [-1, 1]
housing_num_min_max_scaled = min_max_scaler.fit_transform(housing_num)
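Note that fit_transform learns the per-feature minimum and maximum from the data it is given. With a train/test split, fit the scaler on the training set only and reuse it on the test set via transform, so test-set statistics do not leak into the scaling parameters. A sketch, assuming hypothetical housing_num_train / housing_num_test splits:
from sklearn.preprocessing import MinMaxScaler

# housing_num_train / housing_num_test: assumed train/test splits of numeric features.
scaler = MinMaxScaler(feature_range=(-1, 1))
train_scaled = scaler.fit_transform(housing_num_train)  # learns per-feature min/max
test_scaled = scaler.transform(housing_num_test)        # applies the same scaling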

