Data Science Asked by molo32 on January 6, 2021
I was reading about gradient descent.
How does gradient descent know which weights to adjust?
Does it adjust all of the network's weights at the same time?
Does each weight have an associated error?
In general (or at least in the basic implementation of gradient descent), on each iteration you apply the update rule to every weight, using the partial derivative of the loss function with respect to that weight, as follows:

w_i ← w_i − η · ∂L/∂w_i

where η is the learning rate and L is the loss function.
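As a minimal sketch of that update rule (the toy data and learning rate here are made up for illustration), the loop below computes the partial derivative of a mean-squared-error loss with respect to every weight of a linear model and updates all of them in the same step:

```python
import numpy as np

# Hypothetical toy data: 3 features, target y = 2*x0 - x1 + 0.5*x2
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([2.0, -1.0, 0.5])

w = np.zeros(3)   # all weights start at zero
lr = 0.1          # learning rate (eta)

for _ in range(200):
    # MSE loss: L = mean((Xw - y)^2)
    # Partial derivative of L with respect to EACH weight, all at once:
    grad = 2 * X.T @ (X @ w - y) / len(y)
    # Every weight is updated simultaneously, each by its own partial derivative
    w -= lr * grad

print(np.round(w, 2))  # converges toward the true weights [2, -1, 0.5]
```

Note that `grad` has one entry per weight: gradient descent does not single out one weight to adjust, it moves all of them at once, each along its own partial derivative.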
Answered by German C M on January 6, 2021
All the weights are updated during backpropagation in the gradient descent algorithm. The amount by which each weight is updated is determined by its gradient and the learning rate.
Answered by mujjiga on January 6, 2021