Dropout technique
Dropout is one of the simplest and most powerful regularization techniques. It prevents complex co-adaptation of units by randomly dropping units (along with their connections) from the network during training [N. Srivastava et al., 2014]. The table below lists famous deep networks that use dropout; a minimal code sketch follows the table.
Dropout usage in famous deep networks
| Model | Dropout layers | Remark |
|---|---|---|
| AlexNet [Alex Krizhevsky et al., 2012] | Used in two fully-connected layers | Won the 2012 ILSVRC (ImageNet Large-Scale Visual Recognition Challenge) |
| ZFNet [Matthew D. Zeiler et al., 2013] | Used in two fully-connected layers | Won the 2013 ILSVRC |
| VGG Net [Karen Simonyan et al., 2014] | Used in two fully-connected layers | Showed that simple, very deep CNNs perform well |
| GoogLeNet [Christian Szegedy et al., 2015] | Used in one fully-connected layer | Won the 2014 ILSVRC |
| Generative Adversarial Networks [Ian J. Goodfellow et al., 2014] | Applied in training the discriminator net | Various uses, such as feature extraction and generating artificial images |
| Generating Image Descriptions [Andrej Karpathy et al., 2014] | Used in all layers except the recurrent layers | Combination of CNNs and RNNs |
| Spatial Transformer Networks [Max Jaderberg et al., 2015] | Used in all layers except the first convolutional layer | Introduced the Spatial Transformer module |
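
To make the mechanism concrete, here is a minimal NumPy sketch of dropout. It implements the widely used "inverted dropout" variant, which rescales the surviving activations during training so that no adjustment is needed at test time (the original paper instead scales the weights at test time); the function name and parameters are illustrative, not taken from any of the papers above.

```python
import numpy as np

def dropout(x, p_drop=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each unit with probability
    p_drop and rescale the survivors by 1 / (1 - p_drop) so the expected
    activation matches test time; at test time the layer is the identity."""
    if not training or p_drop == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(x.shape) >= p_drop  # keep each unit with probability 1 - p_drop
    return x * mask / (1.0 - p_drop)

# Usage: a batch of hidden activations, e.g. from a fully-connected layer.
h = np.ones((2, 4))
print(dropout(h, p_drop=0.5, training=True))   # roughly half the units zeroed, survivors scaled to 2.0
print(dropout(h, p_drop=0.5, training=False))  # unchanged at inference
```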
Notes
This list will be kept updated.