keras-tqdm
AttributeError: 'TQDMCallback' object has no attribute 'on_train_batch_begin'
I get
AttributeError: 'TQDMCallback' object has no attribute 'on_train_batch_begin'
when trying to use keras-tqdm with LSTM.
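For reference, a minimal sketch of the kind of call that triggers this (model and data names are illustrative, not an actual repro):

from keras_tqdm import TQDMCallback

# Assumed setup: `model` is a compiled Keras LSTM model, X_train/Y_train are prepared arrays.
# Recent Keras/tf.keras versions invoke on_train_batch_begin/on_train_batch_end on every
# callback, and keras-tqdm's TQDMCallback never defined them, hence the AttributeError.
model.fit(X_train, Y_train, verbose=0, callbacks=[TQDMCallback()])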
Getting that too. Any luck?
Nope, no luck. Sorry, misclicked.
It seems that the library is simply outdated. Even after I tried to fix it, it still didn't work. I'll look for an updated solution to the problem that brought me here.
Solved-ish by creating my own progress bar callback, though I'd love to get an actual fix. Here's my code if you're still looking:
import time
import sys
import tensorflow as tf
import math

class progBarCallback(tf.keras.callbacks.Callback):
    # Tell the callback how many samples one epoch contains.
    def setEpochSize(self, es):
        self.epoch_size = es

    def on_epoch_begin(self, epoch, logs={}):
        self.done = 0
        self.cur_epoch = epoch
        printProgBar(epoch, 0, self.epoch_size)

    def on_batch_end(self, batch, logs={}):
        # Count processed samples using the batch size Keras reports in params.
        self.done += self.params["batch_size"]
        printProgBar(self.cur_epoch, self.done, self.epoch_size)

# Redraw a 50-character progress bar in place, e.g. "Epoch 1 [====>____] 5000/60000".
def printProgBar(batch, amount, biggest, args=[]):
    fraction = amount / biggest
    num = math.floor(fraction * 50)
    print("\rEpoch " + str(batch + 1) + " [", end="", flush=True)
    for i in range(num - 1):
        print("=", end="")
    if num > 0:
        print(">", end="")
    for i in range(50 - num):
        print("_", end="")
    print("]", end="")
    for arg in args:
        print(" " + str(arg), end="")
    print(" " + str(amount) + "/" + str(biggest), end="")
Example Usage:

progBar = progBarCallback()
progBar.setEpochSize(len(x_train))
model.fit(x_train, y_train,
          epochs=1,
          batch_size=1000,
          verbose=0,
          callbacks=[progBar])
Unfortunately, it seems that Keras removed the callback argument from the evaluate function, so this works only for training.
Thanks, very kind of you to share!
Dude, I dunno if you're still into this, but the cause is very silly. Keras decides whether it allows updating the progress bar by checking sys.stdout.isatty(), which is false in PyCharm (I hope you are using PyCharm, that's how I got the error). So all I had to do was go into \Lib\site-packages\keras\utils and add or True to the if statement (line 355 for me: if self._dynamic_display or True).
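A less invasive alternative, based on the same observation but not taken from the Keras codebase, is to wrap sys.stdout in a small proxy whose isatty() returns True, rather than editing site-packages (a sketch, names are my own):

import sys

class TTYStdoutProxy:
    # Report being a TTY so Keras keeps redrawing the progress bar in place;
    # every other attribute access is forwarded to the real stdout.
    def __init__(self, wrapped):
        self._wrapped = wrapped

    def isatty(self):
        return True

    def __getattr__(self, name):
        return getattr(self._wrapped, name)

sys.stdout = TTYStdoutProxy(sys.stdout)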
No, I'm actually using Jupyter Notebook. Maybe this could help. Thanks a lot!
This is a simple workaround to use until fixed:
from keras_tqdm import TQDMNotebookCallback
# keras, model definition...
cb = TQDMNotebookCallback()
setattr(cb, 'on_train_batch_begin', lambda x, y: None)
setattr(cb, 'on_train_batch_end', lambda x, y: None)
model.fit(X_train, Y_train, verbose=0, callbacks=[cb])
I also ran into the same problem; @gsportelli's method helps as a temporary fix, thanks!
That temporary fix does not update the widget for me. A workaround is:
from keras_tqdm import TQDMNotebookCallback
cb = TQDMNotebookCallback()
cb.on_train_batch_begin = cb.on_batch_begin
cb.on_train_batch_end = cb.on_batch_end
model.fit(X_train, Y_train, verbose=0, callbacks=[cb])
Note that for more modern versions of TensorFlow, you may have to patch more attributes. I needed to do this when using TensorFlow 1.14 with tf.keras version 2.2.4.
tqdm_cb = TQDMCallback()
tqdm_cb.on_train_batch_begin = tqdm_cb.on_batch_begin
tqdm_cb.on_train_batch_end = tqdm_cb.on_batch_end
tqdm_cb.on_test_begin = lambda y: None
tqdm_cb.on_test_end = lambda y: None
tqdm_cb.on_test_batch_begin = lambda x, y: None
tqdm_cb.on_test_batch_end = lambda x, y: None
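With those attributes patched, usage is the same as in the earlier examples (assuming the same model and training data):

model.fit(X_train, Y_train, verbose=0, callbacks=[tqdm_cb])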