Data Science Asked by william007 on September 20, 2020
Say we have a trained image classification model. Theoretically, is it possible to update the model with only a single sample, without retraining from scratch? If not, are there any kinds of image classification DL models that allow incremental updates?
Yes, that is possible. In transfer learning you have at least three options for handling transferred weights: leave them frozen, use them only as an initialization and retrain fully, or fine-tune them on the new data. The third one is what you intend to do, and of course different options can be applied to different layers. Also see this paper (the quote below is from Yosinski et al., 2014, "How transferable are features in deep neural networks?"):
The usual transfer learning approach is to train a base network and then copy its first n layers to the first n layers of a target network. The remaining layers of the target network are then randomly initialized and trained toward the target task. One can choose to backpropagate the errors from the new task into the base (copied) features to fine-tune them to the new task, or the transferred feature layers can be left frozen, meaning that they do not change during training on the new task. The choice of whether or not to fine-tune the first n layers of the target network depends on the size of the target dataset and the number of parameters in the first n layers. If the target dataset is small and the number of parameters is large, fine-tuning may result in overfitting, so the features are often left frozen. On the other hand, if the target dataset is large or the number of parameters is small, so that overfitting is not a problem, then the base features can be fine-tuned to the new task to improve performance. Of course, if the target dataset is very large, there would be little need to transfer because the lower level filters could just be learned from scratch on the target dataset.
The sentence addressing your question is the one beginning "One can choose to backpropagate the errors from the new task into the base (copied) features to fine-tune them to the new task, or the transferred feature layers can be left frozen".
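As a minimal sketch of the fine-tuning option, here is a toy two-layer NumPy network (a stand-in for a real pretrained model, so all names and sizes are illustrative): the first layer plays the role of transferred features and stays frozen, while the task head is updated with plain SGD steps on a single new sample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "base network" weights, as if copied from a pretrained model.
W1 = rng.normal(size=(4, 8))   # transferred feature layer: kept frozen
W2 = rng.normal(size=(8, 3))   # task head: the only part we update
W1_orig = W1.copy()            # kept to verify the frozen layer never changes

def softmax(z):
    z = z - z.max()            # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def forward(x):
    h = np.maximum(0.0, x @ W1)        # frozen ReLU features
    return h, softmax(h @ W2)          # class probabilities

def update_on_sample(x, y, lr=0.1):
    """One SGD step of cross-entropy loss on a single sample.

    Only the head W2 receives a gradient; W1 is treated as frozen."""
    global W2
    h, p = forward(x)
    p[y] -= 1.0                        # gradient of cross-entropy w.r.t. logits
    W2 -= lr * np.outer(h, p)          # head update; W1 untouched

# Incrementally adapt the model to one new labeled sample.
x_new, y_new = rng.normal(size=4), 2
before = forward(x_new)[1][y_new]
for _ in range(20):
    update_on_sample(x_new, y_new)
after = forward(x_new)[1][y_new]
print(before, after)  # probability of the true class before vs. after adaptation
```

With a real model the same idea is usually expressed by marking the copied layers as non-trainable (e.g. `requires_grad=False` in PyTorch or `layer.trainable = False` in Keras) and taking a few low-learning-rate steps on the new data, which avoids overfitting a large parameter count to a tiny sample.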
Answered by Sammy on September 20, 2020