Hands-On Natural Language Processing with Python

Regularization techniques

Overfitting is a common problem in machine learning (ML), where a model blindly learns all of the patterns in the training data, including noise. A neural network can easily be overfitted during training, due to its large number of parameters. Theoretically, given input data of any size, a large enough Artificial Neural Network (ANN) can memorize all of the patterns in it, along with the noise. Therefore, the weights of a model have to be regularized to avoid overfitting the data.

We will look at three types of regularization:

  • Dropout
  • Batch normalization
  • L1 and L2 regularization
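As a concrete illustration of two of these techniques, the following is a minimal NumPy sketch (the function names and the `rate`/`lam` parameters are illustrative, not from the text). It implements inverted dropout, which zeroes a random fraction `rate` of activations during training and scales the survivors by `1/(1 - rate)` so the expected activation is unchanged at inference time, and an L2 penalty term that would be added to the loss to keep weights small:

```python
import numpy as np

def dropout(activations, rate=0.5, training=True, rng=None):
    """Inverted dropout: zero a fraction `rate` of units at random and
    scale the survivors by 1/(1 - rate). At inference (training=False)
    the input passes through unchanged."""
    if not training or rate == 0.0:
        return activations
    rng = rng or np.random.default_rng(0)  # fixed seed for reproducibility
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

def l2_penalty(weights, lam=1e-4):
    """L2 regularization term added to the loss: lam * sum of squared weights."""
    return lam * np.sum(weights ** 2)

x = np.ones((4, 8))
out = dropout(x, rate=0.5)
# Roughly half the units in `out` are zeroed; the survivors become 2.0,
# so the expected value of each unit remains 1.0.
```

L1 regularization works analogously, penalizing `lam * sum(|w|)` instead of the squared weights, which pushes many weights exactly to zero. Dropout and batch normalization are covered in more detail in the following sections.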