transformer-word-segmenter
There's no padding_mask in TransformerBlock
When I repeated the training process, I got an error:
Using TensorFlow backend.
mask is Tensor("lambda_1/MatMul:0", shape=(?, 150, 150), dtype=float32)
Traceback (most recent call last):
File "/data/nlp/transformer-word-segmenter/tf_segmenter/__init__.py", line 321, in get_or_create
TFSegmenter.__singleton = TFSegmenter(**config)
File "/data/nlp/transformer-word-segmenter/tf_segmenter/__init__.py", line 118, in __init__
self.model, self.parallel_model = self.__build_model()
File "/data/nlp/transformer-word-segmenter/tf_segmenter/__init__.py", line 134, in __build_model
enc_output = self.__encoder(emb_output, mask)
File "/data/nlp/transformer-word-segmenter/tf_segmenter/__init__.py", line 178, in __encoder
next_step_input = transformer_enc_layer(next_step_input, padding_mask=mask)
TypeError: __call__() got an unexpected keyword argument 'padding_mask'
I was using https://github.com/kpot/keras-transformer, but its TransformerBlock has no padding_mask keyword.
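For anyone hitting the same TypeError: you can quickly confirm what the installed package exposes. A small check, assuming only upstream kpot/keras-transformer's module layout (keras_transformer.transformer):

```python
# Sanity check of the installed keras-transformer: list the parameters
# TransformerBlock.__call__ actually accepts. Import path follows the
# upstream kpot/keras-transformer module layout.
import inspect
from keras_transformer.transformer import TransformerBlock

sig = inspect.signature(TransformerBlock.__call__)
print(sig)
# Upstream has no 'padding_mask' parameter, which is exactly why the
# call in __encoder raises the TypeError above.
print('padding_mask' in sig.parameters)  # False on upstream
```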
Is there an internal version of keras-transformer?
Yeah, I forked it as https://github.com/GlassyWing/keras-transformer.
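For reference, a minimal sketch of the call that fails on upstream but is expected to work with the fork. It assumes the fork keeps upstream's TransformerBlock constructor arguments and only adds the padding_mask keyword to __call__; the Lambda reproduces the (?, 150, 150) MatMul mask shape from the log above, and all dimensions are illustrative:

```python
# Sketch only: assumes the fork at GlassyWing/keras-transformer adds a
# padding_mask keyword to TransformerBlock.__call__ while keeping
# upstream's constructor arguments. Install with e.g.:
#   pip install git+https://github.com/GlassyWing/keras-transformer
import keras.backend as K
from keras.layers import Input, Embedding, Lambda
from keras_transformer.transformer import TransformerBlock

seq_len, vocab_size, d_model = 150, 5000, 256  # illustrative sizes

tokens = Input(shape=(seq_len,), dtype='int32')
emb = Embedding(vocab_size, d_model)(tokens)

def make_padding_mask(t):
    # 1.0 for real tokens, 0.0 for padding (id 0), expanded to the
    # (batch, seq_len, seq_len) shape seen as lambda_1/MatMul above.
    m = K.cast(K.not_equal(t, 0), 'float32')            # (batch, L)
    return K.batch_dot(K.expand_dims(m, 2),             # (batch, L, 1)
                       K.expand_dims(m, 1),             # (batch, 1, L)
                       axes=(2, 1))                     # (batch, L, L)

mask = Lambda(make_padding_mask)(tokens)

block = TransformerBlock(name='enc_0', num_heads=8,
                         residual_dropout=0.1, attention_dropout=0.1)
# Raises TypeError on upstream kpot/keras-transformer; the fork is
# expected to accept the extra keyword.
enc_output = block(emb, padding_mask=mask)
```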