fairseq
Fix Torchscript typing in transformer_encoder.py
Before submitting
- [ ] Was this discussed/approved via a GitHub issue? (not needed for typos or doc improvements)
- [ ] Did you read the contributor guideline?
- [ ] Did you make sure to update the docs?
- [ ] Did you write any new necessary tests?
What does this PR do?
Fixes # (issue).
PR review
Anyone in the community is free to review the PR once the tests have passed. If your PR wasn't discussed in a GitHub issue, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃
@alexeib
I think this should fix the TorchScript compilation error introduced in https://github.com/facebookresearch/fairseq/pull/4818.
Meanwhile, is there a recommended test to run in addition to the CI tests?
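The actual diff isn't shown in this thread, but TorchScript compilation errors like the one referenced usually come from missing or imprecise type annotations on module methods. A minimal sketch of the fix pattern (the `Encoder` class below is illustrative, not fairseq's `transformer_encoder.py`):

```python
import torch
from torch import Tensor
from typing import Optional


class Encoder(torch.nn.Module):
    # TorchScript needs explicit annotations: an argument that may be None
    # must be typed Optional[Tensor], not left unannotated.
    def forward(self, x: Tensor, mask: Optional[Tensor] = None) -> Tensor:
        # TorchScript also requires narrowing the Optional with an
        # `is not None` check before the value is used as a Tensor.
        if mask is not None:
            x = x + mask
        return x


# Compiling the module is what surfaces (or confirms the absence of)
# these typing errors.
scripted = torch.jit.script(Encoder())
```

Running `torch.jit.script` on the touched module is a quick local smoke test for this class of error, independent of the full test suite.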
some tests are still failing
@alexeib ok, after some investigation, it seems these build failures were introduced by the PyTorch 1.13.0 release (on Oct 28).
After I installed PyTorch 1.12.1 locally and reran the tests with pytest, the errors from
tests/test_binaries.py::TestTranslation::test_transformer_pointer_generator
and tests/test_multihead_attention.py::test_xformers_single_forward_parity
are gone.
The failures under tests/test_plasma_utils.py::TestPlasmaView seem unrelated:
OSError: Could not connect to socket /tmp/tmphatl4hub
I guess we could proceed with this PR and fix other issues separately?
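For reference, the local repro described above might look like the following (a sketch, assuming a fairseq checkout with test dependencies installed; the exact pip invocation depends on your platform/CUDA setup):

```shell
# Pin PyTorch to the pre-1.13.0 release and rerun the two tests
# that failed under 1.13.0.
pip install "torch==1.12.1"
pytest tests/test_binaries.py::TestTranslation::test_transformer_pointer_generator \
       tests/test_multihead_attention.py::test_xformers_single_forward_parity
```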
thanks for investigating @zhxchen17 . Shall we open another issue for the pytorch test breakage and reference it here?
sure lemme open an issue.