Data Science · Asked by a n on August 3, 2021
I am a beginner in data science. I am trying to understand this PyTorch code, which computes gradients using a custom autograd function:
import torch

class MyReLU(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)       # stash the input so backward can use it
        return x.clamp(min=0)          # ReLU: max(x, 0)

    @staticmethod
    def backward(ctx, grad_output):
        x, = ctx.saved_tensors
        grad_x = grad_output.clone()   # copy so the incoming gradient is not modified in place
        grad_x[x < 0] = 0              # zero the gradient wherever the input was negative
        return grad_x
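For context, a custom Function like this is called through its .apply method rather than directly; a minimal usage sketch, assuming the class definition above (the input values are illustrative):

x = torch.randn(4, requires_grad=True)   # random input, some entries negative
y = MyReLU.apply(x)                      # forward pass through the custom ReLU
y.sum().backward()                       # backpropagate a gradient of ones
print(x.grad)                            # 1.0 where x > 0, 0.0 where x < 0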
However, I don’t understand this line: grad_x[x < 0] = 0. Can anyone explain this part?
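For concreteness, here is a small standalone sketch of what that boolean-mask assignment does, with arbitrarily chosen values:

import torch

grad = torch.tensor([0.5, -1.0, 2.0])
x = torch.tensor([-3.0, 1.0, -2.0])
grad[x < 0] = 0     # x < 0 gives the mask [True, False, True]; those entries are set to 0
print(grad)         # tensor([ 0., -1.,  0.])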