Cross Validated Asked on December 18, 2021
For instance, the default activation function of `tf.keras.layers.SimpleRNN` is tanh.
My doubt arises because tanh activations, like sigmoids, can also cause the vanishing gradient problem.
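To make the concern concrete, here is a minimal numeric sketch (plain Python, no TensorFlow) of the saturation behavior the question refers to: the gradient of tanh is 1 − tanh(x)², which is near 1 around the origin but shrinks rapidly once the unit saturates. For comparison, the sigmoid's gradient peaks at only 0.25, which is one common reason tanh is considered the milder choice of the two.

```python
import math

def tanh_grad(x):
    # Derivative of tanh: 1 - tanh(x)^2
    t = math.tanh(x)
    return 1.0 - t * t

def sigmoid_grad(x):
    # Derivative of the logistic sigmoid: s * (1 - s)
    s = 1.0 / (1.0 + math.exp(-x))
    return s * (1.0 - s)

# Near zero, tanh's gradient is at its maximum of 1.
print(tanh_grad(0.0))     # 1.0
# Once the unit saturates, the gradient shrinks sharply,
# which compounds across time steps in an RNN.
print(tanh_grad(3.0))     # ~0.0099
# Sigmoid's gradient never exceeds 0.25, even at its peak.
print(sigmoid_grad(0.0))  # 0.25
```

So tanh does not eliminate vanishing gradients, but per step it passes a larger gradient than sigmoid, and it is zero-centered, which helps optimization.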