
There's no padding_mask in TransformerBlock #5

Open
nzinfo opened this issue Apr 10, 2019 · 1 comment

Comments


nzinfo commented Apr 10, 2019

When I repeat the training process, I get the following error:

```
Using TensorFlow backend.
mask is  Tensor("lambda_1/MatMul:0", shape=(?, 150, 150), dtype=float32)
Traceback (most recent call last):
  File "/data/nlp/transformer-word-segmenter/tf_segmenter/__init__.py", line 321, in get_or_create
    TFSegmenter.__singleton = TFSegmenter(**config)
  File "/data/nlp/transformer-word-segmenter/tf_segmenter/__init__.py", line 118, in __init__
    self.model, self.parallel_model = self.__build_model()
  File "/data/nlp/transformer-word-segmenter/tf_segmenter/__init__.py", line 134, in __build_model
    enc_output = self.__encoder(emb_output, mask)
  File "/data/nlp/transformer-word-segmenter/tf_segmenter/__init__.py", line 178, in __encoder
    next_step_input = transformer_enc_layer(next_step_input, padding_mask=mask)
TypeError: __call__() got an unexpected keyword argument 'padding_mask'
```

I was using https://github.com/kpot/keras-transformer, but its TransformerBlock has no padding_mask keyword.

Is there an internal version of keras-transformer?
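For reference, the tensor printed in the traceback (shape `(?, 150, 150)`, produced by a Lambda/MatMul layer) looks like a pairwise padding mask built from per-token indicators. A minimal numpy sketch of how such a mask might be computed (the function name and pad id are assumptions for illustration, not taken from the repo):

```python
import numpy as np

def build_padding_mask(token_ids, pad_id=0):
    """Turn (batch, seq_len) token ids into a (batch, seq_len, seq_len) mask.

    mask[b, i, j] is 1.0 only when positions i and j are both real tokens,
    mirroring the outer-product (MatMul) construction seen in the traceback.
    """
    # keep[b, i] is 1.0 where the token is real, 0.0 where it is padding
    keep = (token_ids != pad_id).astype(np.float32)   # (batch, seq_len)
    # per-batch outer product: mask[b, i, j] = keep[b, i] * keep[b, j]
    return np.einsum('bi,bj->bij', keep, keep)        # (batch, seq_len, seq_len)

ids = np.array([[5, 9, 3, 0, 0]])  # one sequence, last two positions padded
mask = build_padding_mask(ids)
print(mask.shape)  # (1, 5, 5)
```

In the segmenter the sequence length is 150, so the same construction yields the `(?, 150, 150)` shape shown in the log.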

@GlassyWing
Owner

Yeah, I forked it as https://github.com/GlassyWing/keras-transformer
