Data Science Asked on August 28, 2021
I have a problem with a task that uses the gradient descent formula:
Perform two steps of the gradient descent towards a local minimum for the function given below, using a step size of 0.1 and an initial value of [1, 1]
As the result of the derivative I only get 0.4*x1, because x2 does not appear in the function.
Is this correct, or should the result for the derivative (the gradient) be (0.4, 0)?
Note: Sorry if my equation for the derivative is badly written; I’m not a mathematician. Please correct me if my equation is completely wrong.
Gradient: g(x) = 0.4*x
At x1 = 1, the gradient is g(1) = 0.4
Update rule: x_new = x_old - step*gradient
=> x2 = 1 - 0.1*0.4 = 0.96
At x2 = 0.96, the gradient is g(0.96) = 0.4*0.96 = 0.384
=> x3 = 0.96 - 0.1*0.384 = 0.9216
Continue following the same steps and you will get closer and closer to the minimum.
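A minimal sketch of the two updates above in Python. The objective itself is not shown in the question, so f(x) = 0.2*x^2 is assumed here purely because it is consistent with the gradient g(x) = 0.4*x used in the answer; the x2 coordinate has zero gradient and would never change.

```python
def grad(x):
    """Gradient of the assumed objective f(x) = 0.2 * x**2."""
    return 0.4 * x

step = 0.1   # step size (learning rate)
x = 1.0      # initial value of the x1 coordinate

for i in range(2):          # perform two gradient descent steps
    x = x - step * grad(x)  # x <- x - step * g(x)
    print(f"step {i + 1}: x = {x:.4f}")

# Output:
# step 1: x = 0.9600
# step 2: x = 0.9216
```

This reproduces the hand calculation: 0.96 after the first step and 0.9216 after the second.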
Correct answer by 10xAI on August 28, 2021