
TypeError: add_weight() got multiple values for argument 'name'

Open ouening opened this issue 4 years ago • 3 comments

The Attention class implementation seems to have a problem. I'm running the TextBiRNN model and get the following error:

Traceback (most recent call last):
  File "train.py", line 86, in <module>
    last_activation='softmax').get_model()
  File "/media/gaoya/disk/Applications/keras/短文本分类/model.py", line 282, in get_model
    x_word = Attention(self.maxlen_word)(x_word)
  File "/media/gaoya/disk/Applications/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py", line 75, in symbolic_fn_wrapper
    return func(*args, **kwargs)
  File "/media/gaoya/disk/Applications/anaconda3/lib/python3.7/site-packages/keras/engine/base_layer.py", line 463, in __call__
    self.build(unpack_singleton(input_shapes))
  File "/media/gaoya/disk/Applications/keras/短文本分类/model.py", line 231, in build
    constraint=self.b_constraint)
TypeError: add_weight() got multiple values for argument 'name'

How can I fix this?

ouening avatar Dec 16 '19 07:12 ouening

This is probably caused by a Keras API upgrade. Check the latest API and adjust the arguments accordingly.

ShawnyXiao avatar Feb 07 '20 09:02 ShawnyXiao
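A minimal stand-in (not the real Keras function) that illustrates the signature change ShawnyXiao refers to: in Keras 2, `add_weight` takes `name` as its first parameter, so an old-style call that passes the shape tuple positionally makes it bind to `name`, which then collides with the explicit `name=` keyword:

```python
# Hypothetical stand-in with a Keras-2-style signature, where `name` is
# the first parameter: add_weight(name=None, shape=None, ...).
def add_weight(name=None, shape=None, initializer=None):
    return {'name': name, 'shape': shape}

# Old-style call: the shape tuple is positional, so it binds to `name`,
# and the explicit name= keyword then collides with it.
try:
    add_weight((128, 128), initializer='uniform', name='attention_W')
except TypeError as e:
    err = str(e)
print(err)   # add_weight() got multiple values for argument 'name'

# Fixed call: pass the shape as a keyword and nothing collides.
w = add_weight(shape=(128, 128), initializer='uniform', name='attention_W')
```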


Add `shape=` as a keyword on the first argument:

    self.W = self.add_weight(shape=(input_shape[-1], input_shape[-1],),
                             initializer=self.init,
                             name='{}_W'.format(self.name),
                             regularizer=self.W_regularizer,
                             constraint=self.W_constraint)
    if self.bias:
        self.b = self.add_weight(shape=(input_shape[-1],),
                                 initializer='zero',
                                 name='{}_b'.format(self.name),
                                 regularizer=self.b_regularizer,
                                 constraint=self.b_constraint)

zhouzuguang avatar Mar 04 '20 12:03 zhouzuguang

I tried printing the values of the attention weights `a` that get multiplied with `x`, and found that each weight is just the reciprocal of my sequence length. For example, each sentence in my input has 23 words, and every weight comes out as 1/23. Has anyone run into something similar, or what might cause it? I'm just getting started, so any help would be appreciated, thanks!

Trevor4Y avatar May 18 '20 06:05 Trevor4Y
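One possible explanation for the uniform 1/23 weights in the question above (a sketch, assuming the attention score layer is untrained or producing near-constant outputs): softmax over equal scores is exactly the uniform distribution, i.e. 1/seq_len for every position:

```python
import numpy as np

# Sketch (assumption: the pre-softmax attention scores are identical,
# e.g. from an untrained or near-zero projection): softmax over equal
# scores yields exactly 1/seq_len for each position, which would make
# every attention weight 1/23 for 23-word sentences.
def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

scores = np.zeros(23)      # identical scores for all 23 words
weights = softmax(scores)
print(weights[0])          # 1/23 ~= 0.04347826
```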