transformer-word-segmenter

There's no padding_mask in TransformerBlock

nzinfo opened this issue on Apr 10 '19 · 1 comment

When I repeated the training process, I got this error:

Using TensorFlow backend.
mask is  Tensor("lambda_1/MatMul:0", shape=(?, 150, 150), dtype=float32)
Traceback (most recent call last):
  File "/data/nlp/transformer-word-segmenter/tf_segmenter/__init__.py", line 321, in get_or_create
    TFSegmenter.__singleton = TFSegmenter(**config)
  File "/data/nlp/transformer-word-segmenter/tf_segmenter/__init__.py", line 118, in __init__
    self.model, self.parallel_model = self.__build_model()
  File "/data/nlp/transformer-word-segmenter/tf_segmenter/__init__.py", line 134, in __build_model
    enc_output = self.__encoder(emb_output, mask)
  File "/data/nlp/transformer-word-segmenter/tf_segmenter/__init__.py", line 178, in __encoder
    next_step_input = transformer_enc_layer(next_step_input, padding_mask=mask)
TypeError: __call__() got an unexpected keyword argument 'padding_mask'

I was using https://github.com/kpot/keras-transformer, but its TransformerBlock has no padding_mask keyword.

Is there an internal version of keras-transformer?
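For readers who hit the same traceback: the `Tensor("lambda_1/MatMul:0", shape=(?, 150, 150))` printed above is a per-batch attention padding mask, built so that attention scores involving padding tokens can be zeroed out. A minimal NumPy sketch of how such a `(batch, seq_len, seq_len)` mask is commonly constructed (the function and parameter names here are illustrative, not taken from this repository):

```python
import numpy as np

def attention_padding_mask(token_ids, pad_id=0):
    """Build a (batch, seq_len, seq_len) attention mask from token ids.

    Entry (b, i, j) is 1.0 only if both position i and position j in
    sequence b hold real (non-padding) tokens, else 0.0.
    """
    # 1.0 at real-token positions, 0.0 at padding positions
    keep = (token_ids != pad_id).astype(np.float32)    # (batch, seq_len)
    # per-row outer product, equivalent to the MatMul seen in the log
    return np.einsum("bi,bj->bij", keep, keep)         # (batch, seq_len, seq_len)

ids = np.array([[5, 9, 0, 0]])   # one sequence of length 4, two padding slots
mask = attention_padding_mask(ids)
print(mask.shape)                # (1, 4, 4)
```

Whether the transformer layer accepts this mask as a `padding_mask` keyword depends on which keras-transformer variant is installed, which is exactly the mismatch reported here.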

nzinfo · Apr 10 '19 09:04

Yeah, I forked it as https://github.com/GlassyWing/keras-transformer

GlassyWing · Apr 10 '19 12:04