Dropout

Dropout is a regularization technique for neural networks that helps prevent overfitting; the fraction of units to drop (the dropout rate) is a tunable hyperparameter.

It prevents the network from overfitting by removing a random selection of units from a network layer for a single gradient step, with each unit dropped according to a fixed probability. Dropout can be interpreted in various ways, for example as randomly sampling from an exponential number of different thinned networks that share weights.
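As a minimal sketch of this mechanism, the NumPy helper below is illustrative (the function name `dropout_forward` and the default rate are assumptions, not from the source); it uses the common "inverted dropout" formulation, which rescales the surviving units during training so that no rescaling is needed at test time:

```python
import numpy as np

def dropout_forward(x, rate=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability `rate` and
    rescale the survivors by 1 / (1 - rate) so the expected
    activation is unchanged at test time."""
    if not training or rate == 0.0:
        return x  # at test time, dropout is a no-op
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= rate  # keep each unit with prob. (1 - rate)
    return x * mask / (1.0 - rate)

# Example: apply dropout to one layer's activations during training.
activations = np.ones((2, 4))
print(dropout_forward(activations, rate=0.5, training=True))
```

A fresh random mask is drawn on every gradient step, which is what makes each step train a different "thinned" sub-network.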

  1. Srivastava et al. (2014), Dropout: A Simple Way to Prevent Neural Networks from Overfitting (paper and summary)