How to convert a fairseq transformer model into ONNX format?
Any documentation or tools? Thanks!
I would like to contribute if someone can guide me
Also interested in being able to do this..
I need it too, thanks! I wrote code to convert the model to ONNX, but it does not work:

```python
import torch
import onnx

# Load the pretrained WMT19 en-de model from torch.hub
en2de = torch.hub.load('pytorch/fairseq', 'transformer.wmt19.en-de.single_model')
output = en2de.translate('Hello world', beam=5)
model = en2de.eval()  # disable dropout

# Dummy inputs matching the TransformerModel forward signature
dummy_input1 = torch.ones(1, 4, device='cpu', dtype=torch.long)  # src_tokens
dummy_input2 = torch.ones(1, device='cpu', dtype=torch.long)     # src_lengths
dummy_input3 = torch.ones(5, 5, device='cpu', dtype=torch.long)  # prev_output_tokens
inputs = (dummy_input1, dummy_input2, dummy_input3)

input_names = ["src_tokens", "src_lengths", "prev_output_tokens"]
torch.onnx.export(
    model,
    inputs,
    "fairseq_transformer.onnx",
    verbose=True,
    input_names=input_names,
    output_names=["output"],
    export_params=True,
    opset_version=13,
    do_constant_folding=True,
)
# also tried: operator_export_type=torch.onnx.OperatorExportTypes.ONNX_ATEN_FALLBACK

# Check the exported model
onnx_file = onnx.load("fairseq_transformer.onnx")
onnx.checker.check_model(onnx_file)
```
The error screenshot is:
This issue has been automatically marked as stale. If this issue is still affecting you, please leave any comment (for example, "bump"), and we'll keep it open. We are sorry that we haven't been able to prioritize it yet. If you have any new additional information, please include it with your comment!
@rGitcy any comment?
@rGitcy Have you solved this issue? I ran into a problem very similar to yours.
No, I haven't. It feels like the official repo has some issues.
any progress?
This is much needed! Any progress?