TransWikia.com

Negative loss, 100% accuracy

Data Science Asked by khair on March 12, 2021

I have a very small dataset of 567 images. I've used a pretrained ResNet50 model for transfer learning. But whenever I fit my model, it gives 100% training accuracy and 100% validation accuracy, while the loss and validation loss go negative each epoch. Evaluation also reports 100% accuracy with a loss of -5915788.088888888. This is a pretty unexpected result. What's going on? Is this possible, or is my model producing bad output? I have defined the model like below:

from tensorflow.keras import optimizers
from tensorflow.keras.applications import ResNet50
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.models import Sequential

def Transfer():
    model = Sequential()
    model.add(ResNet50(include_top = False, pooling = 'avg', weights = 'imagenet'))
    model.add(Dropout(0.4))
    model.add(Dense(3, activation = 'softmax'))

    model.layers[0].trainable = False  # freeze the pretrained base
    sgd = optimizers.SGD(lr = 0.01, decay = 1e-6, momentum = 0.9, nesterov = True)
    model.compile(optimizer = sgd, loss = 'categorical_crossentropy', metrics = ['accuracy'])
    return model
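The question does not show the data pipeline, which is where this kind of negative loss usually originates. As a hedged sketch (the array shapes and the divide-by-255 scaling are assumptions for illustration; Keras' own `preprocess_input` for ResNet50 uses a different, mean-subtraction scheme), the preprocessing that keeps `categorical_crossentropy` well-defined looks like this in plain NumPy:

```python
import numpy as np

# Hypothetical raw data: 567 images, 3 classes (small 32x32 size for illustration)
images = np.random.randint(0, 256, size=(567, 32, 32, 3), dtype=np.uint8)
labels = np.random.randint(0, 3, size=(567,))

# Scale raw uint8 pixels into [0, 1]
x = images.astype("float32") / 255.0

# One-hot encode the integer labels (equivalent to keras.utils.to_categorical)
y = np.eye(3, dtype="float32")[labels]

# Sanity checks: inputs in range, each target a valid probability distribution
assert x.min() >= 0.0 and x.max() <= 1.0
assert np.all(y.sum(axis=1) == 1.0)
```

If targets are fed in any other encoding (raw pixel values, labels mapped to -1/+1, etc.), the cross-entropy sum is no longer bounded below by zero.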

[Accuracy graph]
[Loss graph]

One Answer

categorical_crossentropy should never be negative. The most common reason it becomes negative is incorrect data encoding: the loss assumes the targets form a valid probability distribution (e.g. one-hot vectors), so targets containing values outside [0, 1] can drive the loss below zero. It is best if all features are scaled and all targets are one-hot encoded between 0 and 1.
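To see why, here is a minimal NumPy illustration of the cross-entropy formula (not the Keras implementation itself): with a one-hot target the loss is non-negative, but with an invalid encoding, such as labels mapped to -1/+1, it goes negative.

```python
import numpy as np

def categorical_crossentropy(y_true, y_pred, eps=1e-7):
    """Cross-entropy -sum(t * log(p)). Since log(p) <= 0 for p in (0, 1],
    the result is non-negative whenever all target entries are >= 0."""
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.sum(y_true * np.log(y_pred))

p = np.array([0.1, 0.8, 0.1])          # softmax output over 3 classes

one_hot = np.array([0.0, 1.0, 0.0])    # correctly encoded target
loss_ok = categorical_crossentropy(one_hot, p)   # -log(0.8), about 0.223

bad = np.array([-1.0, 1.0, -1.0])      # invalid target encoding (e.g. ±1 labels)
loss_bad = categorical_crossentropy(bad, p)      # negative, about -4.38
```

Checking `y.min()` and `y.sum(axis=1)` on your target array will usually reveal the problem immediately.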

Answered by Brian Spiering on March 12, 2021

