Credit-Card-Fraud-Detection-using-Autoencoders-in-Keras

Why are you using relu on the last layer?

qbx2 opened this issue 7 years ago • 1 comment

from keras.layers import Input, Dense
from keras.models import Model
from keras import regularizers

input_layer = Input(shape=(input_dim, ))

# Encoder: tanh bottleneck with an L1 activity penalty, then relu
encoder = Dense(encoding_dim, activation="tanh",
                activity_regularizer=regularizers.l1(10e-5))(input_layer)
encoder = Dense(int(encoding_dim / 2), activation="relu")(encoder)

# Decoder: mirrors the encoder; note the relu on the output layer
decoder = Dense(int(encoding_dim / 2), activation='tanh')(encoder)
decoder = Dense(input_dim, activation='relu')(decoder)

autoencoder = Model(inputs=input_layer, outputs=decoder)

In fraud_detection.ipynb, the model uses relu as the activation of the last layer. However, the CSV file contains negative values, which relu cannot represent. Since the last layer of the decoder should reproduce the input values, wouldn't this be an issue?
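To make the concern concrete, here is a minimal sketch (not from the repo, plain numpy) showing that a relu output layer can never emit a negative value, so for negative inputs the reconstruction error has an unavoidable floor of |input| regardless of what the rest of the decoder learns:

```python
import numpy as np

def relu(x):
    """Standard relu: negative inputs are clipped to zero."""
    return np.maximum(0.0, x)

# Example feature values; the credit-card dataset's PCA features V1..V28
# take both signs, so negatives like these do occur.
inputs = np.array([-2.3, -0.7, 0.0, 1.5, 3.1])

# The best a relu output layer can do is pass non-negative targets through
# and clip negative targets to zero.
best_possible_output = relu(inputs)
min_abs_error = np.abs(inputs - best_possible_output)

print(best_possible_output)  # negatives become 0.0
print(min_abs_error)         # nonzero exactly where inputs were negative
```

A linear (no-activation) output layer has no such floor, which is why autoencoders over unbounded real-valued inputs usually end with `activation='linear'` (or standardize the inputs into relu's range first).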

Thanks.

qbx2 avatar Jan 23 '18 13:01 qbx2

I have the same issue. The change in the construction may only be related to the regularization method; it may not be related to the choice of activation.

svjack avatar Jan 11 '19 09:01 svjack