TransWikia.com

Tuning a convolutional neural net: sample size

Data Science Asked on October 4, 2021

I keep reading that convolutional neural networks (CNNs) perform best with lots and lots of data (100k+ examples). Is there any rule of thumb, or lower limit, for the data size during the grid-search phase?

For example, if I run a CNN on 100 data points, vary just one parameter (say, add an extra layer or increase a filter size) and get better results, can I reasonably expect better results with those parameters during the actual training phase?

One Answer

If you use pre-trained weights, you need significantly less data, since the initial layers have already learned from a ton of data and you only need to fine-tune the later ones.

What you said is not entirely true: you can train on CIFAR-10 and get 90%+ accuracy, and that dataset is far smaller than 100k+ examples. It depends on the complexity of the data and how similar the classes are. If they are easily separable, you need less data; if the distinctions are harder, the model needs a lot of examples to figure out which features separate them.

As for your second question: I would say you could, if your sample is representative of the population.
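To make the "representative sample" caveat concrete, here is a toy experiment (a pure-Python sketch; the classifier, the `shrink` hyperparameter, and the synthetic data are all hypothetical stand-ins, not anything from the thread): we compare two settings of one hyperparameter on a 100-point subsample and on a much larger sample, to see whether the ranking from the small run carries over.

```python
import random

random.seed(0)

def make_data(n):
    """Synthetic 2D binary data: class 0 centered near (0, 0),
    class 1 centered near (2, 2), unit noise in each dimension."""
    data = []
    for _ in range(n):
        y = random.randint(0, 1)
        x = (random.gauss(2 * y, 1.0), random.gauss(2 * y, 1.0))
        data.append((x, y))
    return data

def centroid_classifier(train, shrink):
    """Nearest-centroid classifier. `shrink` is the toy hyperparameter:
    it pulls class centroids toward the origin (a crude regularizer)."""
    sums = {0: [0.0, 0.0], 1: [0.0, 0.0]}
    counts = {0: 0, 1: 0}
    for (x1, x2), y in train:
        sums[y][0] += x1
        sums[y][1] += x2
        counts[y] += 1
    cents = {
        y: (shrink * sums[y][0] / max(counts[y], 1),
            shrink * sums[y][1] / max(counts[y], 1))
        for y in (0, 1)
    }
    def predict(x):
        dists = {y: (x[0] - cents[y][0]) ** 2 + (x[1] - cents[y][1]) ** 2
                 for y in (0, 1)}
        return min(dists, key=dists.get)
    return predict

def accuracy(predict, test):
    return sum(predict(x) == y for x, y in test) / len(test)

test_set = make_data(2000)
# Run the same one-parameter "grid search" on a small and a large sample
# and check whether the winning setting is the same in both cases.
for n in (100, 2000):
    train = make_data(n)
    for shrink in (1.0, 0.5):
        acc = accuracy(centroid_classifier(train, shrink), test_set)
        print(f"n={n} shrink={shrink} acc={acc:.3f}")
```

If the 100-point subsample is drawn from the same distribution (i.e., is representative), the better `shrink` value usually wins at both sample sizes; with a skewed or tiny subsample, the ranking can flip, which is exactly the risk the answer warns about.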

Answered by Rahul Deora on October 4, 2021
