Training and validation loss plot uses wrong history.history keys
Notebook 3.5-classifying-movie-reviews
The code that is supposed to generate the training and validation loss plots side by side uses the wrong history.history keys:
acc = history.history['acc']
val_acc = history.history['val_acc']
loss = history.history['loss']
val_loss = history.history['val_loss']
which results in the following errors:
KeyError: 'acc'
KeyError: 'val_acc'
If you examine the history.history keys, you can see that they have different names:
history_dict = history.history
history_dict.keys()
dict_keys(['loss', 'val_loss', 'binary_accuracy', 'val_binary_accuracy'])
This is probably due to the use of metrics.binary_accuracy as the metric for evaluating the model. The following change of keys fixes the error:
acc = history.history['binary_accuracy']
val_acc = history.history['val_binary_accuracy']
This happens if you call model.compile with metric function objects (e.g. metrics.binary_accuracy) rather than with strings. Adding a note telling readers to update the keys in the plotting code would probably suffice.
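Since the key names depend on how the metric was specified (and on the Keras/TF version in use), a defensive lookup avoids the KeyError entirely. This is just a sketch, not code from the notebook: find_metric is a hypothetical helper, and the dictionary below merely simulates what history.history can look like after a run where metrics were given as strings.

```python
def find_metric(history_dict, *candidates):
    """Return the first metric series whose key exists in history_dict."""
    for key in candidates:
        if key in history_dict:
            return history_dict[key]
    raise KeyError(f"none of {candidates} found; available: {sorted(history_dict)}")

# Simulated history.history contents (made-up values, not a real training run):
hist = {'loss': [0.6, 0.4], 'val_loss': [0.7, 0.5],
        'accuracy': [0.70, 0.85], 'val_accuracy': [0.65, 0.80]}

acc = find_metric(hist, 'acc', 'accuracy', 'binary_accuracy')
val_acc = find_metric(hist, 'val_acc', 'val_accuracy', 'val_binary_accuracy')
print(acc, val_acc)  # [0.7, 0.85] [0.65, 0.8]
```

The same two lines then work regardless of whether the run produced 'acc', 'accuracy', or 'binary_accuracy' keys.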
Thanks. I was pretty confused when this error came up.
Thanks for pointing it out. I had to add val_acc = history.history['val_acc'] just below line 3 ('val_acc_values = ...') in Listing 3.10, and acc = history.history['acc'] as line 5 in Listing 3.9, for the code to run.
Hi. I got the KeyError when I used history.history['acc'], so I changed it to history.history['binary_accuracy'] and even that gives me the same error. I am using TensorFlow 2.1.0; history.history.keys() returns dict_keys(['loss', 'accuracy']). How do I access val_acc and val_loss?
@techSash I have the same error; if you have a solution, please tell me.
Hey, for me it was solved by typing 'accuracy' instead of 'acc'. I was told it's a Keras version mismatch. @techSash @seifmahmoud
Thanks a lot for the pointers. Appreciated. Have a good day.
I have the same error, is there a fix for this?
@thisiseshan try ‘accuracy’ instead of ‘binary_accuracy’
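Worth noting for the dict_keys(['loss', 'accuracy']) case above: the val_loss/val_accuracy entries only appear in history.history when model.fit is given a validation set (via validation_data=... or validation_split=...). A small sketch of the two resulting dictionaries (the values are made up, not from a real run):

```python
# history.history when fit() was called WITHOUT validation data:
no_val = {'loss': [0.6, 0.5], 'accuracy': [0.70, 0.78]}

# history.history when fit() was called WITH validation_data / validation_split:
with_val = {'loss': [0.6, 0.5], 'accuracy': [0.70, 0.78],
            'val_loss': [0.65, 0.55], 'val_accuracy': [0.68, 0.74]}

# Each validation metric is just the training metric name prefixed with 'val_':
assert all('val_' + key in with_val for key in no_val)
assert 'val_accuracy' not in no_val  # hence the KeyError without a validation set
```

So if keys() shows only 'loss' and 'accuracy', the fix is to pass a validation set to fit(), not to rename the keys.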
Try this code - it worked for me
Y = labels[['short-term_memorability', 'long-term_memorability']].values
X = one_hot_res
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.2, random_state=42)

# build the model (with dropout; regularizers optional)
model = Sequential()
model.add(layers.Dense(200, activation='relu', kernel_regularizer=None, input_shape=(len_token,)))
model.add(layers.Dropout(0.1))
model.add(layers.Dense(2, activation='sigmoid'))

# compile the model
model.compile(optimizer='rmsprop', loss='mse', metrics=['accuracy'])

# train the model
history = model.fit(X_train, Y_train, epochs=20, validation_data=(X_test, Y_test))

# summarize history for accuracy
plt.subplot(211)
plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'])
plt.title('Model Accuracy')
plt.ylabel('Accuracy')
plt.xlabel('Epoch')
plt.legend(['Training', 'Validation'], loc='lower right')

# summarize history for loss
plt.subplot(212)
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('Model Loss')
plt.ylabel('Loss')
plt.xlabel('Epoch')
plt.legend(['Training', 'Validation'], loc='upper right')

plt.tight_layout()
plt.show()
Thanks a lot. This worked for me.
In model.compile(), you should specify the metrics as:
model.compile(loss="categorical_crossentropy", optimizer=sgd, metrics=["accuracy"])
And then the correct working code is:
H.history["accuracy"] and H.history["val_accuracy"]
(Previously I was using H.history["acc"] and H.history["val_acc"], which was giving KeyError: 'acc'.)
LOL, it is very simple: just use print(history.history) and use the output accordingly. I wonder why this issue is still open.
It is still open because there are lots of us and new students who are struggling with coding (unlike you). Do not follow the thread if it is a waste of your precious time.
Bro, the issue is solved above; that's why I was saying that. People start commenting without understanding the context, so watch out before you say anything. I myself have put a solution there, and numerous others have also shared solutions.
Watch out for what?
When I print it, I get an empty dict_keys([]). Help, please.