
Neural network output is 0 for test data (using RELU for activation)

Data Science Asked by Pedro Pablo Severin Honorato on December 8, 2020

Maybe this is a naive question, but I have a NN that uses ReLU for all layers. On the training data there is no problem, but on the test (or validation) data the outputs are all 0. I used MinMaxScaler with a feature range of (0, 1), and the test data actually contains negative values that are fed as inputs to the NN. This is because I fit the MinMaxScaler only on the training data and then applied the scaler's transform function to the test data, which produced negative values.

  1. Are those negative values the reason why I'm getting 0 as outputs? How can I correct this?
  2. Is this the correct way to apply the MinMaxScaler: fit_transform only on the training data, and just transform on the validation data?
  3. Is it best practice to use StandardScaler, or should MinMaxScaler be the way to go?
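As a sketch of the scaling procedure described in question 2 (with made-up toy data), fitting MinMaxScaler on the training data alone means any test value below the training minimum is mapped to a negative number:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Hypothetical toy data: one test value falls below the training minimum
X_train = np.array([[1.0], [2.0], [3.0]])
X_test = np.array([[0.5], [2.5]])

scaler = MinMaxScaler(feature_range=(0, 1))
X_train_scaled = scaler.fit_transform(X_train)  # fit statistics on train only
X_test_scaled = scaler.transform(X_test)        # reuse the train statistics

# Train values land in [0, 1], but the test value 0.5 (below the training
# minimum of 1.0) maps to (0.5 - 1.0) / (3.0 - 1.0) = -0.25
print(X_test_scaled)  # [[-0.25], [0.75]]
```

This fit-on-train / transform-on-test pattern is the standard way to avoid data leakage, so negative scaled values on held-out data are expected whenever test values fall outside the training range.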

Thank you!

EDIT: In case it is useful: the output I'm trying to get is a number greater than 0. It is a regression problem.
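To illustrate the mechanism behind question 1 (a minimal NumPy sketch, not the actual network): whenever a unit's pre-activation is negative, ReLU clamps it to exactly 0, so a ReLU on the output layer of a regression network can yield all-zero predictions if the out-of-range negative inputs push the final pre-activations negative:

```python
import numpy as np

def relu(x):
    """Elementwise ReLU: max(0, x)."""
    return np.maximum(0.0, x)

# Hypothetical final-layer pre-activations: the negative ones
# (made more likely by inputs outside the training range) are
# clamped to exactly 0 by ReLU.
pre_activation = np.array([-0.7, -0.1, 0.3])
print(relu(pre_activation))  # [0.  0.  0.3]
```

A common remedy in this situation is to use a linear (identity) activation on the output layer for regression, keeping ReLU only in the hidden layers.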
