Can fairseq save the model parameters as fp16?
❓ Questions and Help
Before asking:
- search the issues.
- search the docs.
What is your question?
I noticed that fairseq can train with fp16, but the saved model parameters are still fp32. However, the fp32 model seems to suffer a larger accuracy loss when using dynamic quantization from PyTorch. Can fairseq save the model parameters as fp16 directly?
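As a possible workaround (not a built-in fairseq flag, just a sketch): a saved checkpoint can be post-processed by casting its floating-point tensors to fp16. fairseq checkpoints keep the state dict under the `"model"` key; the helper below assumes that layout, and the toy `Linear` module only stands in for a real loaded checkpoint.

```python
import torch

def state_dict_to_fp16(state_dict):
    # Cast only floating-point tensors to half precision;
    # leave non-float entries (e.g. integer buffers) untouched.
    return {
        k: v.half() if torch.is_tensor(v) and v.is_floating_point() else v
        for k, v in state_dict.items()
    }

# Toy example standing in for torch.load("checkpoint_best.pt"):
toy = torch.nn.Linear(4, 4)
ckpt = {"model": toy.state_dict()}
ckpt["model"] = state_dict_to_fp16(ckpt["model"])
# torch.save(ckpt, "checkpoint_fp16.pt") would then write the halved file.
```

This roughly halves the checkpoint size on disk, though whether the fp16 weights quantize better than fp32 ones would need to be verified empirically.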
Code
What have you tried?
What's your environment?
- fairseq Version (e.g., 1.0 or main):
- PyTorch Version (e.g., 1.0)
- OS (e.g., Linux):
- How you installed fairseq (`pip`, source):
- Build command you used (if compiling from source):
- Python version:
- CUDA/cuDNN version:
- GPU models and configuration:
- Any other relevant information: