Trouble finding the derivative of an optimization function

Asked on Data Science by Carlos Vázquez Losada on February 22, 2021

How can I differentiate the following optimization function?

$$L=\sum_{u,i}\left(y_{u,i}-v_i x_u\right)^2+\lambda\left(\sum_i \|v_i\|_2^2+\sum_u \|x_u\|_2^2\right)$$

I just want to obtain the equations for the gradient descent method, so I need the partial derivative of $L$ with respect to $v$ and the partial derivative of $L$ with respect to $x$. This function is similar to the ridge regression objective.
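
For reference, a minimal sketch of the update rules being asked for, assuming a learning rate $\eta$ (my notation, not from the post):

$$v_i^k \leftarrow v_i^k - \eta\,\frac{\partial L}{\partial v_i^k},\qquad x_u^k \leftarrow x_u^k - \eta\,\frac{\partial L}{\partial x_u^k}$$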

I only know how to differentiate the first term of the function, but I have trouble deriving the second addend (the regularization term):

$$\frac{\partial L}{\partial v_i^k}=-2\sum_u\left(y_{u,i}-v_i x_u\right)x_u^k + \text{?}$$
$$\frac{\partial L}{\partial x_u^k}=-2\sum_i\left(y_{u,i}-v_i x_u\right)v_i^k + \text{?}$$
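
For the missing pieces, a standard result: since $\|v_i\|_2^2=\sum_k (v_i^k)^2$, only one coordinate of one term of the regularizer depends on $v_i^k$, so (and symmetrically for $x_u^k$):

$$\frac{\partial}{\partial v_i^k}\,\lambda\sum_j \|v_j\|_2^2 = 2\lambda v_i^k,\qquad \frac{\partial}{\partial x_u^k}\,\lambda\sum_w \|x_w\|_2^2 = 2\lambda x_u^k$$

Below is a small NumPy sketch (not from the original post; the array names `Y`, `V`, `X` and the finite-difference check are my own illustration) that verifies the combined gradients numerically, assuming each $v_i$ and $x_u$ is a $k$-dimensional vector:

```python
import numpy as np

# Hypothetical shapes: Y is (n_users, n_items), rows of X are x_u, rows of V are v_i.
rng = np.random.default_rng(0)
n_users, n_items, k, lam = 4, 5, 3, 0.1
Y = rng.normal(size=(n_users, n_items))
V = rng.normal(size=(n_items, k))
X = rng.normal(size=(n_users, k))

def loss(V, X):
    # L = sum_{u,i} (y_{u,i} - v_i . x_u)^2 + lambda * (sum_i ||v_i||^2 + sum_u ||x_u||^2)
    residual = Y - X @ V.T
    return np.sum(residual ** 2) + lam * (np.sum(V ** 2) + np.sum(X ** 2))

# Analytic gradients: squared-error part plus 2*lambda times the parameter itself.
residual = Y - X @ V.T
grad_V = -2 * residual.T @ X + 2 * lam * V   # entry [i, k] = dL/dv_i^k
grad_X = -2 * residual @ V + 2 * lam * X     # entry [u, k] = dL/dx_u^k

# Finite-difference check on one coordinate of V: both printed values should agree.
eps = 1e-6
V_pert = V.copy()
V_pert[0, 0] += eps
print(grad_V[0, 0], (loss(V_pert, X) - loss(V, X)) / eps)
```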
