Cross Validated: Asked on November 12, 2021
Is it possible to define a function or layer such that there exist two sets of weights and biases? For instance, where a normal sigmoid activation function is 1 / (1 + e^(-x)), is it possible to have some layer where the activation function would be, e.g., 1 / (1 + y * e^(-x)), with y being the result of a second set of input weights and biases?
The layer (or activation function) you described can be implemented easily. It takes two input tensors, x and y, which must have compatible shapes, and returns 1 / (1 + y * e^(-x)) as output. Using the documentation on custom layers as our guide, we can implement it like this:
import tensorflow as tf

class MultSigmoid(tf.keras.layers.Layer):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)

    def call(self, inputs):
        # Unpack the two input tensors and apply the modified sigmoid element-wise.
        x, y = inputs
        return 1.0 / (1.0 + y * tf.math.exp(-x))
And now, just for demonstration, we can use it like this:
inp1 = tf.keras.layers.Input(shape=(3,))
inp2 = tf.keras.layers.Input(shape=(5,))
d1 = tf.keras.layers.Dense(6)(inp1)
d2 = tf.keras.layers.Dense(6)(inp2)
out = MultSigmoid()([d1, d2])
model = tf.keras.Model(inputs=[inp1, inp2], outputs=out)
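As a quick smoke test (the batch size of 2 and the random inputs below are just illustrative assumptions), we can run two random tensors through the model and check the output shape:

# Random inputs with an arbitrary batch size of 2.
a = tf.random.normal((2, 3))
b = tf.random.normal((2, 5))
print(model([a, b]).shape)  # (2, 6)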
Note that in this implementation we assumed that y * e^(-x) is an element-wise multiplication. If instead we are interested in a tensor (or matrix) multiplication, we can use tf.matmul, as in tf.matmul(y, tf.math.exp(-x)). In both cases, however, the shapes must be compatible so that the multiplication is defined; otherwise an error is raised.
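For concreteness, here is a minimal sketch of the matmul variant; the shapes below are only an assumption, chosen so the batched matrix product is defined:

# Hypothetical shapes: y is (batch, 4, 6) and exp(-x) is (batch, 6, 2),
# so tf.matmul produces a (batch, 4, 2) result.
x = tf.random.normal((2, 6, 2))
y = tf.random.normal((2, 4, 6))
out = 1.0 / (1.0 + tf.matmul(y, tf.math.exp(-x)))
print(out.shape)  # (2, 4, 2)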
Answered by today on November 12, 2021