TransWikia.com

Why does it have a constant val_loss?

Data Science Asked by DarvinismExplained on May 28, 2021

[Screenshot of the training log showing the constant val_loss]

I am working on some dataset and am implementing a deep neural network. It is producing output I am not familiar with.

2 Answers

Change the last layer's neuron count to 2 (with one-hot encoded labels):

model.add(keras.layers.Dense(2, activation="softmax"))

OR

Change your last layer's activation to sigmoid and keep y_train as a single column:

model.add(keras.layers.Dense(1, activation="sigmoid"))

Correct answer by 10xAI on May 28, 2021

Since your loss is binary_crossentropy, I presume that you are working with binary labels. In that case there are 2 classes, and using only 1 neuron in the last layer with softmax as the activation is not a good idea: softmax normalizes across the output neurons, so a single softmax neuron always outputs 1.0 no matter what the network learns, which is exactly why your loss stays constant.
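To see this concretely, here is a small sketch (plain Python, no Keras needed) of what softmax does when the output layer has only one neuron:

```python
import math

def softmax(logits):
    """Standard softmax: normalizes a list of logits into probabilities."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

# With only one output neuron, softmax divides the single exponentiated
# logit by itself, so the predicted "probability" is always exactly 1.0:
for logit in (-5.0, 0.0, 3.2):
    print(softmax([logit]))  # -> [1.0] every time
```

Because the prediction is identical for every sample and every weight update, binary_crossentropy never changes, and neither does val_loss.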

This is because the softmax activation produces a probability distribution over the classes. For example, if your task is to predict whether an image is a dog or a cat, and your dataset labels are one-hot encoded, then the target distribution for a cat would look like this:

[1. 0.]

If the output is a dog, it would look like this: [0. 1.]

This just means that [1, 0] is cat[True], dog[False], and [0, 1] is cat[False], dog[True].

This depends on your labels and the order of the one-hot encoding.
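One-hot encoding itself is simple; a minimal sketch (the helper name and the 0 = cat, 1 = dog convention are mine, chosen for illustration):

```python
def one_hot(label, num_classes=2):
    """Turn an integer class label into a one-hot vector."""
    vec = [0.0] * num_classes
    vec[label] = 1.0  # set the position matching the label
    return vec

# Assuming the (hypothetical) convention 0 = cat, 1 = dog:
print(one_hot(0))  # -> [1.0, 0.0]  (cat)
print(one_hot(1))  # -> [0.0, 1.0]  (dog)
```

Keras's to_categorical, used in the code below, does the same thing for a whole array of labels at once.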

In order to fix your issue, there are two methods:

1) One-hot encode your labels and change your last layer to 2 neurons with `activation='softmax'`.

2) Keep the last layer at 1 neuron but change its activation to `sigmoid` (and keep the labels as a single 0/1 column).

Code for the first method:

from keras.models import Sequential
from keras.layers import Dense
from keras.utils import to_categorical

y_train = to_categorical(y_train)  # One-hot encode your train labels
y_test = to_categorical(y_test)    # One-hot encode your test labels

# Now build the model
model = Sequential()
model.add(Dense(11, input_shape=(num_of_features,), activation='relu'))
model.add(Dense(18, activation='relu'))
model.add(Dense(2, activation='softmax'))

# One-hot labels with a softmax output pair with categorical_crossentropy
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

Code for the second method:

model = Sequential()
model.add(Dense(11, input_shape=(num_of_features,), activation='relu'))
model.add(Dense(18, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# A single sigmoid output pairs with 0/1 labels and binary_crossentropy
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
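The two methods are interchangeable for binary classification: a single sigmoid output gives the same probability as the second neuron of a 2-way softmax with one logit pinned at 0. A quick sketch verifying this (the helper names are mine):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def softmax2_class1(z0, z1):
    """Softmax over exactly two logits; returns the probability of class 1."""
    m = max(z0, z1)
    e0, e1 = math.exp(z0 - m), math.exp(z1 - m)
    return e1 / (e0 + e1)

# sigmoid(z) equals softmax([0, z]) on the second class, so one sigmoid
# neuron carries the same information as two softmax neurons:
for z in (-2.0, 0.5, 3.0):
    assert abs(sigmoid(z) - softmax2_class1(0.0, z)) < 1e-12
print("sigmoid(z) matches softmax([0, z]) on class 1 for all tested z")
```

So the choice mainly comes down to label format: 0/1 column with sigmoid + binary_crossentropy, or one-hot with softmax + categorical_crossentropy.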

I hope this answer helps. If it does, please upvote.

Answered by Fortfanop on May 28, 2021
