TransWikia.com

How to calculate accuracy, precision and recall, and F1 score for a keras sequential model?

Data Science Asked by user85181 on August 12, 2020

I want to calculate accuracy, precision, recall, and F1 score for a multi-class classification problem, using the code below.

from keras import backend as K
def precision(y_true, y_pred, average=None):
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
    precision = true_positives / (predicted_positives + K.epsilon())
    return precision

def recall(y_true, y_pred, average='micro'):
    true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
    possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
    recall = true_positives / (possible_positives + K.epsilon())
    return recall

def f1(y_true, y_pred, average='weighted'):
    def recall(y_true, y_pred):
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
        recall = true_positives / (possible_positives + K.epsilon())
        return recall

    def precision(y_true, y_pred):
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
        precision = true_positives / (predicted_positives + K.epsilon())
        return precision

    precision = precision(y_true, y_pred)
    recall = recall(y_true, y_pred)
    return 2 * ((precision * recall) / (precision + recall + K.epsilon()))
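
For reference, the micro-averaged formulas used above can be checked with plain NumPy on a toy batch (an illustrative sketch, not part of the original code):

```python
import numpy as np

# Toy batch: 3 samples, 3 classes, one-hot labels and rounded predictions
y_true = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [0, 0, 1]], dtype=float)
y_pred = np.array([[1, 0, 0],   # correct
                   [1, 0, 0],   # wrong: class 0 predicted instead of 1
                   [0, 0, 1]], dtype=float)

# NumPy equivalents of the K.sum(K.round(K.clip(...))) expressions above
true_positives = np.sum(np.round(np.clip(y_true * y_pred, 0, 1)))  # 2.0
predicted_positives = np.sum(np.round(np.clip(y_pred, 0, 1)))      # 3.0
possible_positives = np.sum(np.round(np.clip(y_true, 0, 1)))       # 3.0

precision = true_positives / predicted_positives  # 2/3
recall = true_positives / possible_positives      # 2/3
f1 = 2 * precision * recall / (precision + recall)
```

Note that when labels are one-hot and exactly one class is predicted per sample, predicted positives and possible positives both equal the number of samples, so micro precision, recall, and F1 all collapse to plain accuracy; a per-class breakdown is usually more informative.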

I don't know why the value of recall is 1 for both training and testing.
Please help me calculate accuracy, precision, recall, and F1 score for multi-class classification with a Keras model.
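
If batchwise Keras metrics are not strictly required, a common alternative is to compute everything once on held-out predictions. A minimal sketch, assuming scikit-learn is available and `y_prob` stands in for the output of `model.predict(x_test)`:

```python
import numpy as np
from sklearn.metrics import accuracy_score, classification_report

y_true = np.array([0, 1, 2, 2])          # integer class labels
y_prob = np.array([[0.8, 0.1, 0.1],      # stand-in for model.predict(x_test)
                   [0.2, 0.7, 0.1],
                   [0.1, 0.2, 0.7],
                   [0.6, 0.3, 0.1]])
y_pred = np.argmax(y_prob, axis=1)       # predicted classes: [0, 1, 2, 0]

acc = accuracy_score(y_true, y_pred)                      # 0.75
report = classification_report(y_true, y_pred, digits=3)  # per-class P/R/F1
print(report)
```

`classification_report` prints precision, recall, and F1 per class plus macro and weighted averages, which avoids averaging metric values over batches during training.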

One Answer

If you want to compute these metrics for every class separately:

def recall(y_true, y_pred, class_to_analyse):
    pred = K.argmax(y_pred)
    true = K.argmax(y_true)
    p = K.cast(K.equal(pred, class_to_analyse), 'float32')
    t = K.cast(K.equal(true, class_to_analyse), 'float32')
    # true positives: samples of this class predicted as this class
    true_positives = K.sum(t * p)
    # divide by all actual positives for the class
    recall = true_positives / (K.sum(t) + K.epsilon())
    return recall

def precision(y_true, y_pred, class_to_analyse):
    pred = K.argmax(y_pred)
    true = K.argmax(y_true)
    p = K.cast(K.equal(pred, class_to_analyse), 'float32')
    t = K.cast(K.equal(true, class_to_analyse), 'float32')
    # true positives: samples of this class predicted as this class
    true_positives = K.sum(t * p)
    # divide by all predicted positives for the class
    precision = true_positives / (K.sum(p) + K.epsilon())
    return precision

def fbeta(y_true, y_pred, class_to_analyse):
    beta = 1  # beta = 1 gives the F1 score
    # use distinct local names so the precision/recall
    # functions above are not shadowed
    p = precision(y_true, y_pred, class_to_analyse)
    r = recall(y_true, y_pred, class_to_analyse)

    beta_squared = beta ** 2
    return (beta_squared + 1) * (p * r) / (beta_squared * p + r + K.epsilon())

To make this work with Keras, you need to create one function per class with the class argument fixed (e.g. `recall_1(y_true, y_pred) = recall(y_true, y_pred, class_to_analyse=1)`), since Keras metrics must have the signature `(y_true, y_pred)`.
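
One way to build that list of fixed-class metric functions is a small factory that also sets `__name__`, so each metric appears under a distinct name in the training logs. A sketch, with the per-class recall reimplemented in NumPy purely for illustration (in practice you would wrap the Keras-backend version above):

```python
import numpy as np

def recall(y_true, y_pred, class_to_analyse):
    # NumPy stand-in for the per-class recall above
    pred = np.argmax(y_pred, axis=-1)
    true = np.argmax(y_true, axis=-1)
    t = (true == class_to_analyse)
    p = (pred == class_to_analyse)
    return np.sum(t & p) / (np.sum(t) + 1e-7)

def make_recall(c):
    # Fix the class argument so the metric has the
    # (y_true, y_pred) signature that Keras expects
    def metric(y_true, y_pred):
        return recall(y_true, y_pred, class_to_analyse=c)
    metric.__name__ = f'recall_{c}'
    return metric

per_class_recalls = [make_recall(c) for c in range(3)]  # assuming 3 classes
# model.compile(..., metrics=per_class_recalls)  # hypothetical usage
```

The closure factory avoids the classic loop-variable pitfall (`c` is bound per call, not shared) and keeps each metric's name readable in Keras progress output.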

Answered by Frayal on August 12, 2020
