TransWikia.com

What would happen in a max-pool layer if backprop added the gradient to all inputs of a neuron, but only when the gradient is positive?

Data Science Asked on May 21, 2021

In a max-pool layer, the network performs the operation max([in1, in2, ..., inN]). If the gradient coming back to this layer is negative, then sure, it should affect only the max value. But if the gradient is positive, shouldn't it affect all the inputs? After all, this layer performs a max operation, so the gradient's behavior shouldn't be symmetrical: if the max value is too low, then all the other input values for this neuron are also too low.
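For context, the standard backward rule used by common frameworks routes the entire upstream gradient to the argmax input alone, regardless of the gradient's sign (the subgradient of max). A minimal NumPy sketch of that conventional behavior, with hypothetical helper names:

```python
import numpy as np

def maxpool_forward(x):
    """Forward pass over one pooling window: return the max and its index."""
    idx = int(np.argmax(x))
    return x[idx], idx

def maxpool_backward(grad_out, idx, n):
    """Standard rule: the whole upstream gradient goes to the single
    input that achieved the max; every other input receives zero,
    whether grad_out is positive or negative."""
    grad_in = np.zeros(n)
    grad_in[idx] = grad_out
    return grad_in

window = np.array([0.2, 1.5, -0.3, 0.9])
_, idx = maxpool_forward(window)          # idx == 1, the max element

# Positive upstream gradient: only the max input is pushed up.
print(maxpool_backward(1.0, idx, len(window)))
# Negative upstream gradient: likewise, only the max input is pushed down.
print(maxpool_backward(-1.0, idx, len(window)))
```

The proposal in the question would change the positive-gradient branch to spread the gradient across all inputs in the window, making the backward pass asymmetric in the sign of the gradient.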

