Data Science Asked by Arwen on March 10, 2021
In the TensorFlow documentation, it is shown how to tune several hyperparameters, but not the learning rate. I have searched for how to tune the learning rate using the HParams dashboard but could not find much. The only example is another question on GitHub, but it does not work. Can you please give me some suggestions on this? Should I use a callback function? Or provide different learning rates in hp_optimizer, as in the question on GitHub? Or something else?
Part of my code is below:
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

HP_NUM_UNITS = hp.HParam('num_units', hp.Discrete([16, 32]))
HP_OPTIMIZER = hp.HParam('optimizer', hp.Discrete(['adam', 'sgd']))
HP_L_RATE = hp.HParam('learning_rate', hp.Discrete([0.0005, 0.001]))

def train_model(hparams):
    model = tf.keras.models.Sequential([
        tf.keras.layers.InputLayer(input_shape=(None, 43), dtype=tf.float64),
        tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(hparams[HP_NUM_UNITS], return_sequences=True)),
        tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(hparams[HP_NUM_UNITS], activation='relu')),
        tf.keras.layers.Dense(1)])
    model.compile(optimizer=hparams[HP_OPTIMIZER], loss='mae', metrics=['accuracy'])
    # This is where I tried to apply the learning rate, but it does not work:
    callback = tf.keras.callbacks.LearningRateScheduler(hparams[HP_L_RATE])
    model.fit(train_dataset, epochs=100, callbacks=[callback])  # , validation_data=test_dataset
    _, loss = model.evaluate(test_dataset)
    return loss
The problem is that I cannot figure out how and where to insert hparams[HP_L_RATE] in train_model(). As you can see, I have tried to use a callback function to apply hparams[HP_L_RATE], but it does not work.
Thank you,
I guess the error in the GitHub example is that objects were passed to HParams instead of numbers. You should handle the learning rate just like any other float hyperparameter and simply pass it to the optimizer's constructor. It is hard to tell exactly what you want without a fuller code example.
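For instance, here is a minimal sketch of train_model() that reuses the HP_OPTIMIZER, HP_L_RATE, and model definition from the question (train_dataset and test_dataset are assumed to already exist). The key change is building the optimizer object from the sampled learning rate instead of passing the optimizer name as a string, which would use the default rate:

def train_model(hparams):
    model = tf.keras.models.Sequential([
        tf.keras.layers.InputLayer(input_shape=(None, 43), dtype=tf.float64),
        tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(hparams[HP_NUM_UNITS], return_sequences=True)),
        tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(hparams[HP_NUM_UNITS], activation='relu')),
        tf.keras.layers.Dense(1)])
    # Construct the optimizer with the learning rate drawn from HParams,
    # rather than compiling with the string 'adam' or 'sgd':
    if hparams[HP_OPTIMIZER] == 'adam':
        optimizer = tf.keras.optimizers.Adam(learning_rate=hparams[HP_L_RATE])
    else:
        optimizer = tf.keras.optimizers.SGD(learning_rate=hparams[HP_L_RATE])
    model.compile(optimizer=optimizer, loss='mae', metrics=['accuracy'])
    # No LearningRateScheduler callback is needed; that callback expects a
    # function of the epoch index, not a plain float.
    model.fit(train_dataset, epochs=100)
    _, loss = model.evaluate(test_dataset)
    return loss

With this setup, the learning rate is logged to the HParams dashboard like any other hyperparameter, and each trial trains with the rate it was assigned.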
Answered by MichaelRazum on March 10, 2021