TransWikia.com

Sharing parameters of an activation across layers of a neural network

Data Science Asked by Astro on November 19, 2020

Keras now provides parametric activation layers such as LeakyReLU and PReLU.
Each time I add one of these layers to a sequential model, an additional trainable parameter is added to the graph.

How can I make sure the trainable parameter of the activation is shared across all layers?
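One way to get this behavior in tf.keras is to create a single activation-layer instance and call it after every hidden layer: in Keras, reusing the same layer object shares its weights. A minimal sketch (assuming TensorFlow 2.x; `shared_axes=[1]` collapses the PReLU slope to one scalar so the same instance can follow layers of different widths):

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# One PReLU instance with a single alpha parameter (shared_axes=[1]
# collapses the feature axis), reused after each Dense layer so the
# same trainable slope is used everywhere.
shared_prelu = layers.PReLU(shared_axes=[1])

inputs = keras.Input(shape=(16,))
x = layers.Dense(32)(inputs)
x = shared_prelu(x)   # first call: builds the single alpha weight
x = layers.Dense(64)(x)
x = shared_prelu(x)   # reuse: no new parameters are created
outputs = layers.Dense(1)(x)

model = keras.Model(inputs, outputs)

# Only one trainable alpha exists for both activation sites.
print(shared_prelu.count_params())  # 1
```

Note that this uses the functional API rather than `Sequential`, since the functional API makes the layer reuse explicit; without `shared_axes=[1]`, the alpha tensor's shape is tied to the first input it sees, which would prevent reuse across layers of different widths.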

Thanks
