
Is finetuning from a pretrained model always better than training from scratch?

Data Science: Asked by lee kwot sin on April 10, 2021

In the worst-case scenario, we could treat the pretrained weights as nothing more than a random initialization, the same as we would use for training from scratch, right? If that is the case, then wouldn't it be better to always start with a pretrained model, since its lower layers have probably already learned general patterns of images that are transferable across data sets?

My worry is: what if the dataset I want to use for fine-tuning is highly specialized, highly unnatural, and very different from the dataset the pretrained model was trained on? Would that mean fine-tuning from a pretrained model is no longer the best idea? (e.g. training on X-ray images instead of natural images with a CIFAR-1000 pretrained model.)
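
For concreteness, here is a minimal sketch of the two starting points being compared, assuming PyTorch and torchvision; the architecture (ResNet-18), class count, and learning rates are illustrative placeholders, not a recommendation:

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 2  # placeholder, e.g. a small, specialized X-ray classification task

# Option A: start from ImageNet-pretrained weights and fine-tune.
finetune_model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
finetune_model.fc = nn.Linear(finetune_model.fc.in_features, NUM_CLASSES)

# Option B: same architecture, random initialization (training from scratch).
scratch_model = models.resnet18(weights=None)
scratch_model.fc = nn.Linear(scratch_model.fc.in_features, NUM_CLASSES)

# A common fine-tuning trick: a smaller learning rate for the pretrained
# backbone than for the freshly initialized classification head.
optimizer = torch.optim.SGD(
    [
        {"params": [p for n, p in finetune_model.named_parameters()
                    if not n.startswith("fc.")], "lr": 1e-4},
        {"params": finetune_model.fc.parameters(), "lr": 1e-2},
    ],
    momentum=0.9,
)
```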

One Answer

One limitation of using a pretrained model is that you are forced to use the same architecture and weights. There are many scenarios where the pretrained architecture is limiting. If you train from scratch, you can define a custom architecture for the specific problem.
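
As an illustration of that flexibility, here is a minimal sketch (assuming PyTorch; all layer sizes are placeholders) of a from-scratch architecture tailored to the problem, e.g. taking single-channel X-ray inputs directly rather than the 3-channel RGB inputs most ImageNet-pretrained models expect:

```python
import torch.nn as nn

class SmallXRayNet(nn.Module):
    """Hypothetical custom CNN for single-channel X-ray images."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1 input channel, not 3
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # works for arbitrary input resolutions
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))
```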

Answered by Brian Spiering on April 10, 2021
