Feature scaling techniques (rescaling, standardization, mean normalization, etc.) are useful across machine learning and *critical* for methods such as k-NN, neural networks, and anything trained with stochastic gradient descent (SGD), as well as many text-processing systems.
Included examples: rescaling, standardization, scaling to unit length, using scikit-learn.
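A minimal sketch of those three techniques with scikit-learn's `preprocessing` module; the small example matrix is illustrative only:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler, Normalizer

# Toy data: two features on very different scales
X = np.array([[1.0, 200.0],
              [2.0, 300.0],
              [3.0, 400.0]])

# Rescaling (min-max): maps each feature into [0, 1]
X_rescaled = MinMaxScaler().fit_transform(X)

# Standardization: zero mean and unit variance per feature
X_standardized = StandardScaler().fit_transform(X)

# Scaling to unit length: each sample divided by its L2 norm
X_unit = Normalizer().fit_transform(X)
```

Note that each scaler is fit on the training data; at prediction time the same fitted scaler should be applied to new samples via `transform`, not re-fit.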