bert
module 'tensorflow_core._api.v2.train' has no attribute 'Optimizer'
Traceback (most recent call last):
File "run_classifier.py", line 25, in
I ran into the same problem with TensorFlow 1.15.0.
same issue here!
Same issue. My tensorflow-gpu version is 2.0.0
Same issue. My TensorFlow version is 1.11.0.
It became a warning instead with the 1.14 version.
Same issue with TensorFlow CPU 2.0.0 on Mac!
Same issue. My tensorflow cpu version is 2.0.0 .
Change your TensorFlow version to 1.14.
In optimization.py, change class AdamWeightDecayOptimizer(tf.train.Optimizer) to class AdamWeightDecayOptimizer(tf.compat.v1.train.Optimizer).
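For anyone applying that patch by hand, here is a minimal sketch of the changed class under TF 2.x. The constructor shown is abbreviated for illustration, not BERT's full optimizer:

```python
import tensorflow as tf

# Under TF 2.x the TF 1.x base class has moved: tf.train.Optimizer no longer
# exists, but the same class is still available as tf.compat.v1.train.Optimizer.
class AdamWeightDecayOptimizer(tf.compat.v1.train.Optimizer):
    """A basic Adam optimizer that includes "correct" L2 weight decay."""

    def __init__(self, learning_rate, name="AdamWeightDecayOptimizer"):
        # tf.compat.v1.train.Optimizer.__init__ takes (use_locking, name),
        # matching the super() call in the original BERT code.
        super(AdamWeightDecayOptimizer, self).__init__(False, name)
        self.learning_rate = learning_rate
```

The rest of the class body in optimization.py can stay unchanged; only the base class reference needs to move under tf.compat.v1.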
This is going to become more prevalent as more users switch to TensorFlow 2. I would suggest that the developers start a new branch for TensorFlow 2 development. As more people switch, that branch can eventually become master. Might want to tag a final release based on TensorFlow 1 or something before cutting over.
I am experiencing this while trying to execute a model in a new environment using TensorFlow 2.0 and a GPU. I probably need to figure out another migration and upgrade approach.
Francois Chollet: Keras v2.3.0 is the first release of Keras that brings keras in sync with tf.keras
It will be the last major release to support backends other than TensorFlow (i.e., Theano, CNTK, etc.)
And most importantly, deep learning practitioners should start moving to TensorFlow 2.0 and the tf.keras package
People starting new projects are going to use TensorFlow 2. Is there a place or way to get this working with TF 2? Perhaps a beta version that works with TF 2 without errors.
Does anyone have a solution?
I'm trying to follow this tutorial:
https://cloud.google.com/tpu/docs/tutorials/bert?authuser=1
There is a now BERT 2.x tutorial (runs TF 2.1) at https://cloud.google.com/tpu/docs/tutorials.
That's good I guess, but, unless I'm missing something (which I totally could be) it really doesn't solve the issue that the code in this repo won't work with TF 2 right?
Right, this code needs TF 1.15. I was responding to girottoma's comment above. In https://github.com/google-research/bert/blob/master/README.md there is a section "Fine-tuning with Cloud TPUs" that references the Cloud TPU BERT tutorials (there are now 2 of them) and a BERT colab at https://cloud.google.com/tpu/docs/tutorials/. The BERT 2.x tutorial uses TF 2.1 and the BERT 1.x tutorial uses TF 1.15 (this tutorial was broken, but will be fixed and republished next week).
Haha! My fault. I knew I was missing something. 🤤
Thanks for clarification.
There was a problem with the TF 1.15 BERT tutorial (https://cloud.google.com/tpu/docs/tutorials/bert) that was causing it to fail. It has been fixed and the BERT 1.x tutorial should run correctly now.
Hey guys, I think I have the same issue. I'm using tensorflow-gpu 1.15.0 and tensorflow 1.15.0, but when I train on the SQuAD dataset I get INFO:tensorflow:impossible example.
Does anyone have a clue?
Same issue using Google Colab. Any solution?
I fixed the issue in Google Colab by installing TensorFlow 1.15 instead of 2; now I only get a warning.
!pip install tensorflow-gpu==1.15.0
I am also facing the same issue --
AttributeError Traceback (most recent call last)
~\bert\run_classifier.py in
~\bert\optimization.py in
AttributeError: module 'tensorflow._api.v2.train' has no attribute 'Optimizer'
Same issue using Google Colab. Any solution?
Hi, did you get a reply on how to solve this issue?
I am also facing the same issue --
AttributeError                            Traceback (most recent call last)
 in
      1 import bert
----> 2 from bert import run_classifier
      3 from bert import optimization
      4 from bert import tokenization
      5 from tqdm import tqdm

~\bert\run_classifier.py in
     23 import os
     24 from bert import modeling
---> 25 from bert import optimization
     26 from bert import tokenization
     27 import tensorflow as tf

~\bert\optimization.py in
     85
     86
---> 87 class AdamWeightDecayOptimizer(tf.train.Optimizer):
     88   """A basic Adam optimizer that includes "correct" L2 weight decay."""
     89

AttributeError: module 'tensorflow._api.v2.train' has no attribute 'Optimizer'
This can be solved by changing tf.train.Optimizer to tf.compat.v1.train.Optimizer.
Just in case this helps someone: one thing that helped me solve this issue was to ensure python==2.x (note: not 3.x, which is the default these days) and tensorflow==1.15.0.
I am also facing the same issue --
AttributeError                            Traceback (most recent call last)
 in
      1 import bert
----> 2 from bert import run_classifier
      3 from bert import optimization
      4 from bert import tokenization
      5 from tqdm import tqdm

~\bert\run_classifier.py in
     23 import os
     24 from bert import modeling
---> 25 from bert import optimization
     26 from bert import tokenization
     27 import tensorflow as tf

~\bert\optimization.py in
     85
     86
---> 87 class AdamWeightDecayOptimizer(tf.train.Optimizer):
     88   """A basic Adam optimizer that includes "correct" L2 weight decay."""
     89

AttributeError: module 'tensorflow._api.v2.train' has no attribute 'Optimizer'
This can be solved by changing tf.train.Optimizer to tf.compat.v1.train.Optimizer.
Got the error below after making the above changes.
Traceback (most recent call last):
  File "run_classifier.py", line 29, in
    flags = tf.flags
AttributeError: module 'tensorflow' has no attribute 'flags'
Changing flags = tf.flags to tf.compat.v1.flags triggered a bunch of other function errors, so I went ahead and made the following global change, which is probably equivalent to pinning the TF version to 1.15.0:
tf = tensorflow.compat.v1
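The alias above assumes an existing import tensorflow; a sketch of the same whole-module fallback written the usual way (my wording, not the repo's code) is:

```python
# Run unmodified TF 1.x code under a TF 2.x install by aliasing the
# compat API and disabling v2 behaviors (eager execution, etc.).
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

flags = tf.flags                    # tf.flags exists again under compat.v1
OptimizerBase = tf.train.Optimizer  # and so does tf.train.Optimizer
```

With this single import change at the top of each script, the rest of the TF 1.x code can usually stay as written.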
Then the following error occurred. (The same error happens when setting tf version to 1.15.)
AttributeError: module 'bert.tokenization' has no attribute 'validate_case_matches_checkpoint'
bert.tokenization does seem to have this function defined, though. Any thoughts on how to fix this?
change class AdamWeightDecayOptimizer(tf.train.Optimizer) to class AdamWeightDecayOptimizer(tf.compat.v1.train.Optimizer)
This could work!
I am also facing the same issue --
AttributeError                            Traceback (most recent call last)
 in
      1 import bert
----> 2 from bert import run_classifier
      3 from bert import optimization
      4 from bert import tokenization
      5 from tqdm import tqdm

~\bert\run_classifier.py in
     23 import os
     24 from bert import modeling
---> 25 from bert import optimization
     26 from bert import tokenization
     27 import tensorflow as tf

~\bert\optimization.py in
     85
     86
---> 87 class AdamWeightDecayOptimizer(tf.train.Optimizer):
     88   """A basic Adam optimizer that includes "correct" L2 weight decay."""
     89

AttributeError: module 'tensorflow._api.v2.train' has no attribute 'Optimizer'
This can be solved by changing tf.train.Optimizer to tf.compat.v1.train.Optimizer.
Got the error below after making the above changes.
Traceback (most recent call last):
  File "run_classifier.py", line 29, in
    flags = tf.flags
AttributeError: module 'tensorflow' has no attribute 'flags'
Changing flags = tf.flags to tf.compat.v1.flags triggered a bunch of other function errors, so I went ahead and made the following global change, which is probably equivalent to pinning the TF version to 1.15.0:
tf = tensorflow.compat.v1
Then the following error occurred. (The same error happens when setting tf version to 1.15.)
AttributeError: module 'bert.tokenization' has no attribute 'validate_case_matches_checkpoint'
bert.tokenization does seem to have this function defined, though. Any thoughts on how to fix this?
I have the same problem.
class AdamWeightDecayOptimizer(tf.compat.v1.train.Optimizer):
Just change the code from
optimizer=tf.train.AdamOptimizer()
to
optimizer='adam'
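For the Keras-level route, a minimal sketch with a toy model (the model here is illustrative, not from this repo): Keras resolves the string 'adam' to tf.keras.optimizers.Adam, so no TF 1.x optimizer class is needed.

```python
import tensorflow as tf

# tf.train.AdamOptimizer is gone in TF 2.x; pass Keras the string
# identifier (or a tf.keras.optimizers.Adam instance) instead.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")
```

This only applies when training through model.compile/model.fit; the BERT scripts in this repo use a custom TF 1.x optimizer and still need one of the compat.v1 fixes above.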