Can I convert the model to ONNX for online serving?
We don't have an immediate plan for ONNX support. PRs are welcome.
Looking forward to an ONNX converter for serving.
I've tried converting to ONNX and TorchScript (to deploy on Triton/TensorRT) and hit quite a few issues. It seems other people have had trouble with the ONNX export too: https://github.com/microsoft/onnxruntime/issues/12594
To add to the list of issues: https://github.com/pytorch/pytorch/issues/94280
Has converting the model to ONNX for online serving been solved yet?
When I convert the model (blip2_opt) to ONNX, I get an ONNX file without any weights, and its size is only 1 KB. How can I fix this? Has anyone faced the same problem?