Rectify the LayoutLM series model code and adjust it to use the amp_level mode
Thank you for your contribution to the MindOCR repo. Before submitting this PR, please make sure:
- [ ] You have read the Contributing Guidelines on pull requests
- [ ] Your code builds clean without any errors or warnings
- [ ] You are using approved terminology
- [ ] You have added unit tests
Motivation
Previously, due to a bug in the amp functionality, all LayoutLM series models implemented mixed-precision training acceleration by manually inserting cast operators. The issue has now been verified as fixed, so this PR rectifies the LayoutLM series model code and adjusts how mixed precision is controlled in the yaml configs; a sketch of the adjusted config follows the removed key below.
~~use_float16: True~~
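A minimal sketch of the adjusted configuration, assuming the LayoutLM yamls follow the common MindOCR layout where mixed precision is controlled by `amp_level` under the `system` section; the exact level value (`O2`) and neighboring keys here are illustrative, not taken from this PR:

```yaml
system:
  mode: 0             # graph mode
  distribute: False
  amp_level: O2       # replaces the per-model use_float16 flag; O0 disables mixed precision
```

With this, mixed precision is applied by the framework according to `amp_level` instead of hand-inserted cast operators in the model code.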
Test Plan
(How should this PR be tested? Do you require special setup to run the test or repro the fixed bug?)
Related Issues and PRs
(Is this PR part of a group of changes? Link the other relevant PRs and Issues here. Use https://help.github.com/en/articles/closing-issues-using-keywords for help on GitHub syntax)