
Running the script gives an error: 'list' object is not callable

Open · tarik-amaterasu opened this issue · 1 comment

The script used to run correctly, but when I ran it again recently I got TypeError: 'list' object is not callable on this line

ae_loss.append(autoencoder_model.train_on_batch(x=[batch_S, batch_C],
                                                y=np.concatenate((batch_S, batch_C), axis=3)))

in the training section:

NB_EPOCHS = 1000
BATCH_SIZE = 32

m = input_S.shape[0]
loss_history = []
for epoch in range(NB_EPOCHS):
    np.random.shuffle(input_S)
    np.random.shuffle(input_C)
    
    t = tqdm(range(0, input_S.shape[0], BATCH_SIZE),mininterval=0)
    ae_loss = []
    rev_loss = []
    for idx in t:
        
        batch_S = input_S[idx:min(idx + BATCH_SIZE, m)]
        batch_C = input_C[idx:min(idx + BATCH_SIZE, m)]
        
        C_prime = encoder_model.predict([batch_S, batch_C])
        
        ae_loss.append(autoencoder_model.train_on_batch(x=[batch_S, batch_C],
                                                        y=np.concatenate((batch_S, batch_C), axis=3)))
        rev_loss.append(reveal_model.train_on_batch(x=C_prime,
                                                    y=batch_S))
        
        # Update learning rate
        K.set_value(autoencoder_model.optimizer.lr, lr_schedule(epoch))
        K.set_value(reveal_model.optimizer.lr, lr_schedule(epoch))
        
        t.set_description('Epoch {} | Batch: {:3} of {}. Loss AE {:10.2f} | Loss Rev {:10.2f}'.format(epoch + 1, idx, m, np.mean(ae_loss), np.mean(rev_loss)))
    loss_history.append(np.mean(ae_loss))

I'm not sure what's wrong with it.

tarik-amaterasu · Aug 29 '20

Even after going through the entire code of autoencoder_model I wasn't able to figure out the problem, but the error log showed that the failure happened when rev_loss was called from inside the full_loss definition, so I tried the following:

  1. In the full_loss function, change s_true, c_true = y_true[...,0:3], y_true[...,3:6] to s_true, c_true = y_true[:,:,:,0:3], y_true[:,:,:,3:6], and likewise change s_pred, c_pred = y_pred[...,0:3], y_pred[...,3:6] to s_pred, c_pred = y_pred[:,:,:,0:3], y_pred[:,:,:,3:6].
  2. Also in full_loss, replace s_loss = rev_loss(s_true, s_pred) with s_loss = beta * K.sum(K.square(s_true - s_pred)) (a rough sketch of the resulting full_loss is below).
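
My guess is that the clash comes from the training loop reusing the name rev_loss for the list of per-batch losses, which shadows the rev_loss loss function, so full_loss ends up calling a list. With both changes applied, full_loss would look roughly like this (just a sketch: beta and the cover-image term are assumed to match the notebook, so keep whatever your version uses):

from keras import backend as K  # or tensorflow.keras, depending on your install

beta = 1.0  # assumed weighting factor; use the value from your notebook

def full_loss(y_true, y_pred):
    # Change 1: explicit 4-D slicing instead of Ellipsis.
    s_true, c_true = y_true[:, :, :, 0:3], y_true[:, :, :, 3:6]
    s_pred, c_pred = y_pred[:, :, :, 0:3], y_pred[:, :, :, 3:6]

    # Change 2: inline the secret-image term instead of calling rev_loss,
    # so the list named rev_loss in the training loop is never looked up here.
    s_loss = beta * K.sum(K.square(s_true - s_pred))
    c_loss = K.sum(K.square(c_true - c_pred))  # cover term, assumed unchanged

    return s_loss + c_loss

Renaming the list in the training loop instead (for example rev_loss_batch = []) should also avoid the error, since the loss function would no longer be hidden.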

Hope that helps 😄

tash149 · Dec 20 '20