I'm having problems trying to use activations with the Keras functional API. My initial goal was to have a choice between relu and leaky relu, so I came up with the following piece of code:
def activation(x, activation_type):
    if activation_type == 'leaky_relu':
        return activations.relu(x, alpha=0.3)
    else:
        return activations.get(activation_type)(x)

# building the model
inputs = keras.Input(input_shape, dtype='float32')
x = Conv2D(filters, (3, 3), padding='same')(inputs)
x = activation(x, 'relu')
but something like this gives an error: AttributeError: 'Tensor' object has no attribute '_keras_history'. I found out that this may indicate that my inputs and outputs in the Model are not connected. Is keras.advanced_activations the only way to achieve functionality like this in the functional API?
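For reference, this is roughly what the layer-based alternative would look like; a minimal sketch assuming tf.keras, with input_shape and filters as placeholder values:

from tensorflow import keras
from tensorflow.keras.layers import Conv2D, LeakyReLU

input_shape = (32, 32, 3)   # placeholder shape
filters = 16                # placeholder filter count

inputs = keras.Input(input_shape, dtype='float32')
x = Conv2D(filters, (3, 3), padding='same')(inputs)
# LeakyReLU is a layer, so the functional graph stays connected
x = LeakyReLU(alpha=0.3)(x)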
EDIT: here's the version of the activation function that worked:
def activation(self, x):
    if self.activation_type == 'leaky_relu':
        act = lambda x: activations.relu(x, alpha=0.3)
    else:
        act = activations.get(self.activation_type)
    return layers.Activation(act)(x)
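Put together, a minimal end-to-end sketch of how this helper can be used (assuming tf.keras, with the helper written as a standalone function rather than a method; input_shape and filters below are placeholder values):

from tensorflow import keras
from tensorflow.keras import activations, layers
from tensorflow.keras.layers import Conv2D

def activation(x, activation_type):
    # wrapping the callable in an Activation layer keeps the graph connected
    if activation_type == 'leaky_relu':
        act = lambda t: activations.relu(t, alpha=0.3)
    else:
        act = activations.get(activation_type)
    return layers.Activation(act)(x)

input_shape = (32, 32, 3)   # placeholder shape
filters = 16                # placeholder filter count

inputs = keras.Input(input_shape, dtype='float32')
x = Conv2D(filters, (3, 3), padding='same')(inputs)
x = activation(x, 'leaky_relu')
model = keras.Model(inputs, x)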