
Should I set a higher dropout probability if there is plenty of data?

Data Science Asked on August 28, 2021

I have far more data than the neural network I can train in a reasonable time seems to need.

If I feed all the data into the network, it stops learning at some point and the resulting model shows all the signs of overfitting.

Intuitively, if I increase the dropout probability, the model should learn less aggressively from each example and benefit more from the extra data being fed into it.

Is my logic sound?

One Answer

You are right: increasing the dropout probability will help.
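As a minimal sketch of what that change looks like (assuming PyTorch; the layer widths and the specific dropout values are illustrative assumptions, not values from the question):

```python
import torch.nn as nn

def make_net(p_drop: float) -> nn.Sequential:
    """Small MLP whose only varied hyperparameter here is the dropout probability."""
    return nn.Sequential(
        nn.Linear(100, 256),
        nn.ReLU(),
        nn.Dropout(p=p_drop),   # randomly zeroes activations during training
        nn.Linear(256, 256),
        nn.ReLU(),
        nn.Dropout(p=p_drop),
        nn.Linear(256, 10),
    )

baseline = make_net(p_drop=0.2)   # a common default
stronger = make_net(p_drop=0.5)   # heavier regularization when data is plentiful
```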

However, this looks like a setting where early stopping is a very good choice. Although dropout, weight decay, and batch normalization can all work properly here, the fact that you can easily overfit your training set makes it an appropriate scenario for early stopping.
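Here is a sketch of a basic early-stopping loop (assuming PyTorch and a held-out validation set; `model`, `train_loader`, and `val_loader` are hypothetical placeholders):

```python
import copy
import torch

def train_with_early_stopping(model, train_loader, val_loader,
                              max_epochs=100, patience=5):
    opt = torch.optim.Adam(model.parameters())
    loss_fn = torch.nn.CrossEntropyLoss()
    best_val, best_state, bad_epochs = float("inf"), None, 0

    for epoch in range(max_epochs):
        model.train()
        for x, y in train_loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

        # Evaluate on validation data; stop once it stops improving.
        model.eval()
        with torch.no_grad():
            val = sum(loss_fn(model(x), y).item() for x, y in val_loader)
        if val < best_val:
            best_val = val
            best_state = copy.deepcopy(model.state_dict())
            bad_epochs = 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                break   # no improvement for `patience` consecutive epochs

    model.load_state_dict(best_state)  # restore the best checkpoint
    return model
```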

In addition, since each network trains quickly, you can train many of them, each on a different subset of the training set (making them weak learners), and combine them into an ensemble for the final predictions.
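A sketch of that bagging-style ensemble (again assuming PyTorch; `net_factory`, the subset fraction, and the training budget are illustrative assumptions):

```python
import torch
from torch.utils.data import DataLoader, Subset

def train_ensemble(dataset, net_factory, n_members=5, subset_frac=0.3, epochs=3):
    """Train several weak learners, each on a random subset of the data."""
    members = []
    n = len(dataset)
    for _ in range(n_members):
        idx = torch.randperm(n)[: int(subset_frac * n)].tolist()  # random subset
        loader = DataLoader(Subset(dataset, idx), batch_size=64, shuffle=True)
        net = net_factory()          # hypothetical factory returning a fresh model
        opt = torch.optim.Adam(net.parameters())
        loss_fn = torch.nn.CrossEntropyLoss()
        net.train()
        for _ in range(epochs):
            for x, y in loader:
                opt.zero_grad()
                loss_fn(net(x), y).backward()
                opt.step()
        members.append(net)
    return members

def ensemble_predict(members, x):
    """Average the softmax outputs of all ensemble members."""
    with torch.no_grad():
        probs = torch.stack([m(x).softmax(dim=-1) for m in members])
    return probs.mean(dim=0)
```

Averaging the members' predicted probabilities typically reduces variance, which is exactly the failure mode you are seeing when a single network overfits.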

Answered by David Masip on August 28, 2021
