Asked on Cross Validated, January 3, 2022
For some context, I shall outline my current understanding:
Considering a Neural Network, for a Binary Classification problem, the Cross-entropy cost function, J, is defined as:
$ J = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log(a^{(i)}) + (1 - y^{(i)}) \log(1 - a^{(i)}) \right] $
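To make the formula concrete, here is a minimal NumPy sketch of that cost (the function name `binary_cross_entropy` and the clipping constant `eps` are my own illustration, not part of the question):

```python
import numpy as np

def binary_cross_entropy(y, a, eps=1e-12):
    """Mean binary cross-entropy over m examples.

    y: true labels in {0, 1}, shape (m,)
    a: predicted probabilities, shape (m,)
    eps clips predictions away from 0 and 1 to avoid log(0).
    """
    a = np.clip(a, eps, 1 - eps)
    return -np.mean(y * np.log(a) + (1 - y) * np.log(1 - a))

y = np.array([1, 0, 1, 1])
a = np.array([0.9, 0.2, 0.7, 0.6])
print(binary_cross_entropy(y, a))  # ≈ 0.299
```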
Dropout regularisation works as follows: for a given training example, we randomly shut down some nodes in a layer with some probability. This prevents the network from relying too heavily on any particular node, keeps the weights small and spread out during training, and hence regularises the network and prevents overfitting.
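For reference, a minimal sketch of the standard "inverted dropout" forward step (the helper name `dropout_forward` is hypothetical; the 1/keep_prob rescaling keeps the expected activation unchanged so nothing needs to change at test time):

```python
import numpy as np

def dropout_forward(a, keep_prob=0.8, training=True):
    """Inverted dropout applied to a layer's activations a.

    During training, each unit is kept with probability keep_prob and
    the survivors are scaled by 1/keep_prob; at test time the layer is
    left untouched.
    """
    if not training:
        return a
    mask = np.random.rand(*a.shape) < keep_prob  # random thinning mask
    return a * mask / keep_prob
```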
I have learnt that if we apply dropout regularisation, the cross-entropy cost function is no longer easy to define, because of all the intermediate probabilities introduced by randomly dropping nodes. Why is this the case? Why doesn't the old definition still hold? As long as the network learns better parameters, won't the cross-entropy cost decrease on every iteration of Gradient Descent? Thanks in advance.
Dropout does not change the cost function, and you do not need to make changes to the cost function when using dropout.
The reasoning is that dropout is a way to average over an ensemble of each of the exponentially-many "thinned" networks resulting from dropping units randomly. In this light, each time you apply dropout and compute the loss, you're computing the loss that corresponds to a randomly-selected thinned network; collecting together many of these losses reflects a distribution of losses over these networks. Of course, the loss surface is noisier as a result, so model training takes longer. The goal of training the network in this way is to obtain a model that is averaged over all of these different "thinned" networks.
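A small sketch of this point, under assumed toy weights (all the values below are hypothetical, chosen only to illustrate that the loss becomes a random variable over thinned networks):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer binary classifier with fixed weights: each dropout
# mask yields a different "thinned" network, and hence a different
# cross-entropy loss on the very same data point.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=4), 0.0
x, y = rng.normal(size=3), 1.0
keep_prob = 0.5

def loss_with_random_mask():
    h = np.maximum(W1 @ x + b1, 0)           # ReLU hidden layer
    mask = rng.random(h.shape) < keep_prob   # random thinning
    h = h * mask / keep_prob                 # inverted dropout
    a = 1 / (1 + np.exp(-(W2 @ h + b2)))     # sigmoid output probability
    a = np.clip(a, 1e-12, 1 - 1e-12)
    return float(-(y * np.log(a) + (1 - y) * np.log(1 - a)))

losses = [loss_with_random_mask() for _ in range(5)]
print(losses)  # five different values for the same (x, y): the loss is noisy
```

The cost function itself is unchanged throughout; only the network it is evaluated on varies from mask to mask.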
For more information, see "How to explain dropout regularization in simple terms?" or the original paper: Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, and Ruslan Salakhutdinov, "Dropout: A Simple Way to Prevent Neural Networks from Overfitting", Journal of Machine Learning Research, 15(56):1929–1958, 2014.
Answered by Sycorax on January 3, 2022