
Loss function returns x whereas tensorflow shows validation loss as (x+0.0567)

Data Science Asked on December 2, 2020

I have a custom loss function. To experiment with how the loss is calculated during validation, I updated the loss function as follows:

def custom_loss(y_true, y_pred):
    # The data-dependent cross-entropy term is multiplied by 0,
    # so the function should always return the constant 1.
    return 0 * tf.reduce_mean(input_tensor=-tf.reduce_sum(input_tensor=y_true * tf.math.log(y_pred), axis=-1)) + 1
  • Multiply by 0, so that the data-dependent part of the loss is always 0.
  • Add a constant like 1, so that the loss function always returns 1.

Now when I run model.evaluate on my data, TensorFlow shows the loss value as 1.0567.
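For reference, the experiment is wired up roughly like this (a minimal sketch; model, x_val, and y_val are placeholders for my actual model and validation data):

model.compile(optimizer='adam', loss=custom_loss)

# Evaluate on the validation set; I expect the reported loss to be 1.0.
val_loss = model.evaluate(x_val, y_val, verbose=0)
print(val_loss)  # prints 1.0567 instead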

To experiment further, I changed the constant returned by the loss function from 1 to other values. The results:

  • constant 0 => loss 0.0567
  • constant 1 => loss 1.0567
  • constant 2 => loss 2.0567

I was expecting the final calculated loss to be the same as the constant I returned, but that’s not the case. What is the reason for the constant difference of 0.0567?

I have verified this across three TensorFlow versions: 1.14.0, 1.15.0, and 2.1.0.

One Answer

On further investigation, I was able to narrow it down.

The reported loss is not just the value returned by the loss function; the regularizers added in intermediate layers also contribute to it.

And one of the fully connected layers in my model did have an L2 regularization term:

model.add(layers.Dense(1024, activation='relu', kernel_regularizer=regularizers.l2(0.1)))

If I remove the L2 regularization, the loss indeed goes to zero when the loss function returns zero.
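One way to check this (a sketch, assuming TF 2.x, where regularization penalties are computed eagerly): Keras collects every regularization penalty in model.losses, and their sum should match the constant offset.

import tensorflow as tf

# Sum of all regularization penalties registered on the model,
# here just the L2 term from the Dense layer above.
reg_total = tf.add_n(model.losses)
print(float(reg_total))  # ~0.0567 for the model's current weights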

Answered by Vishal on December 2, 2020
