Data Science question asked by DriggyBoy on March 29, 2021
My dataset has about 17,000 images belonging to 5 classes. I am using 16,000+ images for training (about 3,000 per class) and 500 for validation (100 per class). Training accuracy is very good, but validation accuracy doesn't seem to go beyond 60%. My model has about 140k parameters and I train for 100 epochs. Could there be any overfitting happening? Is the dataset too small to train any sort of neural network? I have seen people use transfer learning on smaller datasets, so I gave it a try. I am totally new to this; it is my first project. Any sort of help would be appreciated. I am sorry if I sound like a complete noob here 🙁
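Since transfer learning on a modest dataset is mentioned above, here is a minimal sketch of that approach in Keras (the MobileNetV2 backbone, input size, dropout rate, and head layers are illustrative assumptions, not the setup actually used in the question):

from keras.applications import MobileNetV2
from keras import layers, models

# Assumed image size and class count, for illustration only.
img_width, img_height, num_classes = 224, 224, 5

# ImageNet-pretrained backbone without its classification head.
base_model = MobileNetV2(weights="imagenet",
                         include_top=False,
                         input_shape=(img_width, img_height, 3))
base_model.trainable = False  # freeze the pretrained weights

# Small trainable head on top of the frozen backbone.
model = models.Sequential([
    base_model,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(num_classes, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

Freezing the pretrained weights keeps the number of trainable parameters small, which usually helps against overfitting when there are only a few thousand images per class.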
This is how I preprocessed the images for training:
from keras.preprocessing.image import ImageDataGenerator

# this is the augmentation configuration we will use for training
train_datagen = ImageDataGenerator(
    rescale=1. / 255,
    data_format="channels_last")

train_generator = train_datagen.flow_from_directory(
    train_data_dir,
    target_size=(img_width, img_height),
    batch_size=batch_size,
    class_mode='categorical',
    shuffle=True)

validation_generator = train_datagen.flow_from_directory(
    validation_data_dir,
    target_size=(img_width, img_height),
    batch_size=batch_size,
    class_mode='categorical',
    shuffle=True)
Here is the output of the above code:
Found 16389 images belonging to 5 classes.
Found 500 images belonging to 5 classes.
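The comment in the code above calls train_datagen an augmentation configuration, but only rescaling is applied, so no augmentation actually takes place. A sketch of what a genuine augmentation setup could look like, with a separate rescale-only generator for the validation data, is shown below (the specific parameter values are illustrative guesses, not tuned settings):

from keras.preprocessing.image import ImageDataGenerator

# Augmented generator for the training data only.
train_datagen = ImageDataGenerator(
    rescale=1. / 255,
    rotation_range=15,       # small random rotations
    width_shift_range=0.1,   # random horizontal shifts
    height_shift_range=0.1,  # random vertical shifts
    zoom_range=0.1,          # random zoom in/out
    horizontal_flip=True,    # random left/right mirroring
    data_format="channels_last")

# Validation data is only rescaled, never augmented.
val_datagen = ImageDataGenerator(rescale=1. / 255,
                                 data_format="channels_last")

validation_generator = val_datagen.flow_from_directory(
    validation_data_dir,
    target_size=(img_width, img_height),
    batch_size=batch_size,
    class_mode='categorical',
    shuffle=False)  # shuffling is not needed for validation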