
PReLU Attempt to convert a value (None) with an unsupported type (<class 'NoneType'>) to a Tensor

Open Arthaj-Octopus opened this issue 2 years ago • 3 comments

Hello everyone, there seems to be an issue with the PReLU activation layer, as it gives the error:

Attempt to convert a value (None) with an unsupported type (<class 'NoneType'>) to a Tensor.

Call arguments received by layer "activation" (type Activation):
  • inputs=tf.Tensor(shape=(None, None, 64), dtype=float32)

whenever it is called. I have tried several networks and always get the same error; if I replace PReLU with any other activation layer, such as ReLU or ELU, the error does not occur.

For example:

import tensorflow as tf
from tensorflow.keras.layers import Conv1D, BatchNormalization, Activation, Add
from tensorflow.keras.layers import ReLU, PReLU
from tensorflow.keras.optimizers import Adam

def resnet_block(inputs, filters, kernel_size, stride):
    # Shortcut connection
    shortcut = inputs
    
    # First convolutional layer
    x = Conv1D(filters, kernel_size, strides=stride, padding='same')(inputs)
    x = BatchNormalization()(x)
    x = Activation(activation=PReLU())(x)  # passing a PReLU instance as the activation -- triggers the error
    
    # Second convolutional layer
    x = Conv1D(filters, kernel_size, padding='same')(x)
    x = BatchNormalization()(x)
    
    # Shortcut connection for identity mapping
    if stride > 1 or inputs.shape[-1] != filters:
        shortcut = Conv1D(filters, 1, strides=stride, padding='same')(inputs)
        shortcut = BatchNormalization()(shortcut)
    
    # Add shortcut connection to the main path
    x = Add()([x, shortcut])
    x = PReLU()(x)  # calling PReLU directly as a layer also triggers the error
    
    return x

# Example usage
input_shape = (None, 1)  # variable-length sequences with 1 channel
inputs = tf.keras.layers.Input(shape=input_shape)
x = resnet_block(inputs, filters=64, kernel_size=3, stride=1)
model = tf.keras.Model(inputs=inputs, outputs=x)
model.summary()
model.compile(
    optimizer=Adam(),
    loss="mse",
)
# x_data and y_data stand in for your own training arrays
model.fit(x_data, y_data, epochs=5, batch_size=16, validation_split=0.11, shuffle=True, verbose=1)

Note that the error occurs whether you call it as a layer, PReLU()(x), or set it as the activation function of an Activation layer. I am using TensorFlow 2.14.0.
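
As a possible workaround: the failure looks like it comes from PReLU building its per-element alpha weights against the undefined (None) timestep dimension, since alpha defaults to one parameter per input element. Below is a minimal sketch, assuming that diagnosis is correct; shared_axes=[1] ties the learned slope across the variable-length axis so no weight shape contains None:

import tensorflow as tf
from tensorflow.keras.layers import Conv1D, PReLU

inputs = tf.keras.layers.Input(shape=(None, 1))
x = Conv1D(64, 3, padding='same')(inputs)
# shared_axes=[1] shares the slope across the time axis,
# so alpha is built with shape (1, 64) instead of (None, 64)
x = PReLU(shared_axes=[1])(x)
model = tf.keras.Model(inputs=inputs, outputs=x)
model.summary()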

Arthaj-Octopus · Sep 28 '23 15:09