
What is the relationship between the accuracy and the loss in deep learning?

Data Science: asked by N.IT on January 19, 2021

I have created three different deep learning models for multi-class classification, and each model gave me a different accuracy and loss value. The results on the test set are as follows:

  • First Model: Accuracy: 98.1% Loss: 0.1882

  • Second Model: Accuracy: 98.5% Loss: 0.0997

  • Third Model: Accuracy: 99.1% Loss: 0.2544

My questions are:

  • What is the relationship between the loss and accuracy values?

  • Why is the loss of the third model the highest even though its accuracy is also the highest?

3 Answers

There is no strict relationship between these two metrics.
Loss can be seen as a distance between the true values of the problem and the values predicted by the model. The greater the loss, the larger the errors the model made on the data.

Accuracy can be seen as the proportion of examples the model classifies correctly: it counts the errors, but ignores how large each error is.

That means:
- a low accuracy and a huge loss means you made large errors on a lot of the data;
- a low accuracy but a low loss means you made small errors on a lot of the data;
- a high accuracy with a low loss means you made small errors on a few of the data (best case);
- your situation: a high accuracy but a huge loss means you made large errors on a few of the data.

In your case, the third model correctly predicts more examples, but on the ones it gets wrong it makes larger errors (the distance between the true values and the predicted values is greater).
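
To make this concrete, here is a minimal sketch (not part of the original answer) using NumPy with invented toy probabilities. Model B mirrors your third model: it classifies more examples correctly than Model A, but its single mistake is a very confident one, so its cross-entropy loss ends up higher.

    import numpy as np

    # True class indices for 5 examples, 3 classes.
    y_true = np.array([0, 1, 2, 1, 0])

    # Model A: lower accuracy (3/5 correct), but its two mistakes are mild
    # (the true class still receives a probability of 0.35).
    probs_a = np.array([
        [0.80, 0.10, 0.10],   # correct
        [0.10, 0.80, 0.10],   # correct
        [0.25, 0.40, 0.35],   # wrong, mild
        [0.40, 0.35, 0.25],   # wrong, mild
        [0.80, 0.10, 0.10],   # correct
    ])

    # Model B: higher accuracy (4/5 correct), but its one mistake is confident
    # (the true class receives a probability of only 0.01).
    probs_b = np.array([
        [0.90, 0.05, 0.05],   # correct
        [0.05, 0.90, 0.05],   # correct
        [0.05, 0.05, 0.90],   # correct
        [0.05, 0.90, 0.05],   # correct
        [0.01, 0.01, 0.98],   # wrong, very confident
    ])

    def accuracy(y, p):
        # Fraction of examples whose highest-probability class is the true class.
        return np.mean(np.argmax(p, axis=1) == y)

    def cross_entropy(y, p):
        # Mean negative log-probability assigned to the true class.
        return -np.mean(np.log(p[np.arange(len(y)), y]))

    for name, p in [("A", probs_a), ("B", probs_b)]:
        print(name, "accuracy:", accuracy(y_true, p),
              "loss:", round(cross_entropy(y_true, p), 3))

    # Prints roughly: A accuracy 0.6, loss 0.554; B accuracy 0.8, loss 1.005.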

NOTE :

Don't forget that a "low" or "huge" loss is a relative judgement, which depends on the problem and the data. It is a distance between the true value and the prediction made by the model, and it also depends on the loss function you use.

Consider:
- If your targets are between 0 and 1, an error of 0.5 is huge, but if your targets are between 0 and 255, an error of 0.5 is small.
- Think of cancer detection and the predicted probability of a cancer: an error of 0.1 may be huge for that problem, whereas an error of 0.1 for image classification may be fine.
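
As a rough illustration (not from the original answer, with made-up numbers), the same absolute error produces the same mean squared error, but it means very different things depending on the scale of the targets:

    import numpy as np

    # The same absolute error of 0.5 at two different target scales.
    y_true_small = np.array([0.2, 0.7])        # targets in [0, 1]
    y_pred_small = y_true_small + 0.5          # off by 0.5: huge relative to the range

    y_true_large = np.array([120.0, 200.0])    # targets in [0, 255]
    y_pred_large = y_true_large + 0.5          # off by 0.5: tiny relative to the range

    mse = lambda t, p: np.mean((t - p) ** 2)
    print(mse(y_true_small, y_pred_small))     # 0.25, large for a 0-1 scale
    print(mse(y_true_large, y_pred_large))     # 0.25, negligible for a 0-255 scale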

Answered by Jérémy Blain on January 19, 2021

Accuracy is a metric that applies to classification tasks only. It describes what percentage of your test data is classified correctly. For example, take binary classification into cat or non-cat. If 95 out of 100 test samples are classified correctly (i.e. it is correctly determined whether there is a cat in the picture), then your accuracy is 95%. By the way, a confusion matrix describes your model much better than accuracy alone.

Loss depends on how your model predicts classes for your classification problem. For example, suppose your model outputs a probability between 0 and 1 for the binary classes cat and non-cat. If the probability of cat is 0.6, then the probability of non-cat is 0.4 and the picture is classified as cat. The loss can then be thought of as the sum, over the test pictures, of the gap between 1 and the predicted probability of the true class. In reality, log loss is used for binary classification; this just gives the idea of what loss measures.
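
A minimal sketch of these three quantities (not part of the original answer), assuming scikit-learn is available and using invented cat / non-cat labels and probabilities:

    import numpy as np
    from sklearn.metrics import accuracy_score, confusion_matrix, log_loss

    # Hypothetical labels (1 = cat, 0 = non-cat) and predicted cat probabilities.
    y_true = np.array([1, 0, 1, 1, 0, 0])
    p_cat  = np.array([0.6, 0.2, 0.9, 0.4, 0.1, 0.7])

    # Threshold the probabilities at 0.5 to get hard class predictions.
    y_pred = (p_cat >= 0.5).astype(int)

    print(accuracy_score(y_true, y_pred))     # fraction classified correctly
    print(confusion_matrix(y_true, y_pred))   # rows: true class, columns: predicted class
    print(log_loss(y_true, p_cat))            # binary cross-entropy on the probabilities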

Answered by DmytroSytro on January 19, 2021

The other answers give good definitions of accuracy and loss. To answer your second question, consider this example:

We have a problem of classifying images from a balanced dataset as containing either cats or dogs. Classifier 1 gives the right answer in 80/100 of cases, whereas classifier 2 gets it right in 95/100. Here, classifier 2 obviously has the higher accuracy.

However, on the 80 images classifier 1 gets right, it is extremely confident (for instance, when it thinks an image shows a cat, it is 100% sure that is the case), and on the 20 it gets wrong it is not at all confident (e.g. when it said a cat image contained a dog, it was only 51% sure). In comparison, classifier 2 is extremely confident in its 5 wrong answers (it is 100% convinced that an image which actually shows a dog is a cat) and not very confident about the 95 it gets right. In this case, classifier 2 would have the worse loss.
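
Translating that story into numbers (a sketch, with the confidences from the example above filled in as assumptions):

    import numpy as np

    def mean_log_loss(p_true_class, eps=1e-7):
        # Mean negative log-probability assigned to the true class,
        # clipped so a "100% confident" mistake stays finite.
        p = np.clip(p_true_class, eps, 1 - eps)
        return -np.mean(np.log(p))

    # Classifier 1: 80/100 correct and fully confident,
    # 20 wrong but only barely (the true class still got probability 0.49).
    p1 = np.concatenate([np.full(80, 1.0), np.full(20, 0.49)])

    # Classifier 2: 95/100 correct but not very confident (0.60 for the true class),
    # 5 wrong and fully confident (the true class got probability ~0).
    p2 = np.concatenate([np.full(95, 0.60), np.full(5, 0.0)])

    print("classifier 1: accuracy 0.80, loss", round(mean_log_loss(p1), 3))  # ~0.143
    print("classifier 2: accuracy 0.95, loss", round(mean_log_loss(p2), 3))  # ~1.291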

Answered by rlms on January 19, 2021
