Cross Validated question, asked by 3michelin on August 5, 2020
Is it OK to combine categorical and continuous features into the same input vector for training a deep neural network? Say I have a categorical feature and a continuous feature that I want to feed into a deep neural net at the same time. Is this the way to do it?
categorical feature (one-hot encoded) = [0,0,0,1,0]
continuous feature (number) = 8
final feature vector passed into neural network = categorical feature vector CONCATENATE continuous feature = [0,0,0,1,0,8]
Basically, the question is: is it OK to have a one-hot encoding and a continuous feature together in one feature vector?
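For concreteness, a minimal sketch of the concatenation described above, using numpy (the specific values are just the ones from the example):

```python
import numpy as np

# One-hot encoded categorical feature (5 categories, 4th category active)
categorical = np.array([0, 0, 0, 1, 0], dtype=float)

# Continuous feature as a single value
continuous = np.array([8.0])

# Final feature vector: concatenate along the feature axis
features = np.concatenate([categorical, continuous])
print(features)  # [0. 0. 0. 1. 0. 8.]
```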
Yes, that is one typical way of doing it. However, you should standardize your features so that gradient descent behaves well and regularization treats the weights on comparable scales. One approach is to standardize the numerical features and then concatenate them with the raw one-hot vectors; another is to standardize everything together after concatenation. As far as I can tell, there is no consensus between the two.
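As a rough illustration of the first option (standardize the numeric column, then concatenate the untouched one-hot columns), here is a minimal sketch using pandas and scikit-learn; the column names and values are made up for the example:

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler

# Toy data with one categorical and one continuous column (illustrative values)
df = pd.DataFrame({
    "color": ["red", "blue", "green", "blue", "red"],
    "size": [8.0, 3.0, 5.0, 10.0, 7.0],
})

# Standardize the numeric feature on its own ...
scaled_size = StandardScaler().fit_transform(df[["size"]])

# ... then concatenate it with the raw one-hot columns
one_hot = pd.get_dummies(df["color"]).to_numpy(dtype=float)
X = np.hstack([one_hot, scaled_size])

print(X.shape)  # (5, 4): 3 one-hot columns + 1 standardized numeric column
```

The second option would instead apply the scaler to the already-concatenated matrix, which also rescales the one-hot columns.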
Correct answer by gunes on August 5, 2020
Yes, this is absolutely standard.
Answered by Sycorax on August 5, 2020