
Share your experiences with regularizing an LSTM model

Data Science, asked on April 23, 2021

There are six parameters in a Keras LSTM layer related to regularization, if I am correct.

To deal with overfitting, I would start with

  1. reducing the number of layers
  2. reducing the number of hidden units
  3. applying dropout or regularizers

They are kernel_regularizer, recurrent_regularizer, bias_regularizer, activity_regularizer, dropout, and recurrent_dropout.

They are defined in the Keras documentation, but can anyone share more experience on how to reduce overfitting?

And how are these six parameters used? For example, which ones are used most often, and what range of values should be passed?
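
For context, this is roughly how I understand the arguments plug into a Keras LSTM layer; the values below are arbitrary placeholders I picked, not recommendations:

```python
import tensorflow as tf
from tensorflow.keras import layers, regularizers

# Sketch of an LSTM layer with all six regularization-related arguments.
# The specific coefficients and rates are placeholders, not tuned values.
model = tf.keras.Sequential([
    layers.LSTM(
        64,
        kernel_regularizer=regularizers.l2(1e-4),     # penalizes the input weights
        recurrent_regularizer=regularizers.l2(1e-4),  # penalizes the recurrent weights
        bias_regularizer=regularizers.l2(1e-4),       # penalizes the bias terms
        activity_regularizer=regularizers.l2(1e-5),   # penalizes the layer's output activations
        dropout=0.2,            # fraction of input units dropped
        recurrent_dropout=0.2,  # fraction of recurrent-state units dropped
        input_shape=(None, 8),  # (timesteps, features); placeholder shape
    ),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```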

Are there other effective ways to regularize an LSTM model?
