Traceback (most recent call last):
File "headline.py", line 255, in
tf.app.run()
File "/home/lyy/anaconda3/lib/python3.6/site-packages/tensorflow/python/platform/app.py", line 125, in run
_sys.exit(main(argv))
File "headline.py", line 252, in main
train()
File "headline.py", line 173, in train
model = create_model(sess, False)
File "headline.py", line 147, in create_model
forward_only=forward_only)
File "/data/NLP_projects/textsum_projects/seq2seq-chinese-textsum/seq2seq_model.py", line 180, in init
softmax_loss_function=softmax_loss_function)
File "/home/lyy/anaconda3/lib/python3.6/site-packages/tensorflow/contrib/legacy_seq2seq/python/ops/seq2seq.py", line 1215, in model_with_buckets
decoder_inputs[:bucket[1]])
File "/data/NLP_projects/textsum_projects/seq2seq-chinese-textsum/seq2seq_model.py", line 179, in
lambda x, y: seq2seq_f(x, y, False),
File "/data/NLP_projects/textsum_projects/seq2seq-chinese-textsum/seq2seq_model.py", line 140, in seq2seq_f
dtype=dtype)
File "/home/lyy/anaconda3/lib/python3.6/site-packages/tensorflow/contrib/legacy_seq2seq/python/ops/seq2seq.py", line 857, in embedding_attention_seq2seq
encoder_cell = copy.deepcopy(cell)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 161, in deepcopy
y = copier(memo)
File "/home/lyy/anaconda3/lib/python3.6/site-packages/tensorflow/python/layers/base.py", line 385, in deepcopy
setattr(result, k, copy.deepcopy(v, memo))
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 150, in deepcopy
y = copier(x, memo)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 215, in _deepcopy_list
append(deepcopy(a, memo))
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 180, in deepcopy
y = _reconstruct(x, memo, *rv)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 280, in _reconstruct
state = deepcopy(state, memo)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 150, in deepcopy
y = copier(x, memo)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 180, in deepcopy
y = _reconstruct(x, memo, *rv)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 280, in _reconstruct
state = deepcopy(state, memo)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 150, in deepcopy
y = copier(x, memo)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 180, in deepcopy
y = _reconstruct(x, memo, *rv)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 280, in _reconstruct
state = deepcopy(state, memo)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 150, in deepcopy
y = copier(x, memo)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 180, in deepcopy
y = _reconstruct(x, memo, *rv)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 280, in _reconstruct
state = deepcopy(state, memo)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 150, in deepcopy
y = copier(x, memo)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 240, in _deepcopy_dict
y[deepcopy(key, memo)] = deepcopy(value, memo)
File "/home/lyy/anaconda3/lib/python3.6/copy.py", line 169, in deepcopy
rv = reductor(4)
TypeError: can't pickle _thread.RLock objects
embedding_attention_seq2seq changed somewhat between tf 0.10.0 and the later releases; the problem does not appear in version 1.10.0, but later versions do have it.
If you want to use multiple buckets, model_with_buckets gets called, and its repeated calls make deepcopy(cell) raise this error.
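For context, the final TypeError is plain Python behavior rather than anything specific to this repo: copy.deepcopy falls back to pickling any object it has no copier for, and a threading lock (which the cell's state ends up containing) cannot be pickled. A minimal sketch that reproduces just that error, with a made-up FakeCell standing in for the RNN cell:

import copy
import threading

class FakeCell:
    """Stand-in for an RNN cell whose state holds an unpicklable lock."""
    def __init__(self):
        self._lock = threading.RLock()

try:
    copy.deepcopy(FakeCell())   # deepcopy recurses into _lock and falls back to pickling it
except TypeError as err:
    print(err)                  # -> can't pickle _thread.RLock objects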
Modify \yourpath\tensorflow\contrib\legacy_seq2seq\python\ops\seq2seq.py:
I hit the same error. In the end I modified the file
packages/tensorflow/contrib/legacy_seq2seq/python/ops/seq2seq.py:
at the very top, add one line: import tensorflow as tf
then (at the spot where the error is raised) add:
setattr(tf.contrib.rnn.GRUCell, '__deepcopy__', lambda self, _: self)
setattr(tf.contrib.rnn.BasicLSTMCell, '__deepcopy__', lambda self, _: self)
setattr(tf.contrib.rnn.MultiRNNCell, '__deepcopy__', lambda self, _: self)
Reference: https://blog.csdn.net/thormas1996/article/details/80743254
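If you prefer not to edit the file under site-packages, a variant that should be equivalent (a sketch, assuming the patch runs before the model graph is built) is to monkey-patch the cell classes from your own script, e.g. near the top of headline.py before create_model() runs. copy.deepcopy looks __deepcopy__ up on the class, so patching it anywhere before model_with_buckets is reached has the same effect:

import tensorflow as tf

# Make copy.deepcopy(cell) return the cell itself instead of trying to
# pickle its internals (same idea as the seq2seq.py patch above).
for _cell_cls in (tf.contrib.rnn.GRUCell,
                  tf.contrib.rnn.BasicLSTMCell,
                  tf.contrib.rnn.MultiRNNCell):
    setattr(_cell_cls, '__deepcopy__', lambda self, _: self)

# ... then build the model as usual, e.g. model = create_model(sess, False)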
I ran into a similar problem; switching tensorflow to version 1.10 didn't help either. Any other ideas??


Modify \yourpath\tensorflow\contrib\legacy_seq2seq\python\ops\seq2seq.py:
setattr(tf.contrib.rnn.GRUCell, '__deepcopy__', lambda self, _: self)
setattr(tf.contrib.rnn.BasicLSTMCell, '__deepcopy__', lambda self, _: self)
setattr(tf.contrib.rnn.MultiRNNCell, '__deepcopy__', lambda self, _: self)
I also searched around a lot and found that this method worked for me; hope it helps you.
Source of the solution:
https://blog.csdn.net/thormas1996/article/details/80743254