regisss
As for opening an issue for implementing all available ONNX models in the `ORTConfigManager`, why not? It could be a good way to track which models still have to be...
> > As for opening an issue for implementing all available ONNX models in the ORTConfigManager, why not. It could be a good way to track which models still have...
> @regisss oh sorry, I misunderstood it. I think that is a good idea! Both users and contributors could have a better idea about what has been integrated, what needs...
@yufenglee @chilo-ms any feedback on this PR?
@shaked571 Could you share the command/script you used to export the model please?
> hi , is optimum supports converting Llama (alpaca-lora) to onnx ? It would be great if i get some insights in this

Yes, this is supported and was introduced...
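As a rough sketch of what such an export looks like with the `optimum` CLI (the model ID, task name, and output directory below are placeholders, not a verified command for this exact checkpoint; check `optimum-cli export onnx --help` for the flags available in your version):

```shell
# Sketch, assuming optimum is installed with ONNX export support.
# <model-id> is a placeholder for the Hub checkpoint to export.
optimum-cli export onnx --model <model-id> --task text-generation llama_onnx/
```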
@aresa7796 As discussed with @hieupth in #1177, your file `/mnt/public/demo/m2m_100_demo/onnx.py` is called `onnx.py`, which shadows the `onnx` package when it is imported. Could you rename it and let me...
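To illustrate why the filename matters (a minimal, hypothetical reproduction, not the user's actual setup): Python puts the script's directory at the front of `sys.path`, so a local file named `onnx.py` is imported instead of the installed `onnx` package.

```python
import os
import sys
import tempfile

# Create a stand-in for a script directory containing a file named onnx.py.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "onnx.py"), "w") as f:
    f.write("SHADOWED = True\n")

# Mimic running a script from that directory: its path comes first.
sys.path.insert(0, tmpdir)

import onnx  # resolves to the local onnx.py, not the installed package

print(getattr(onnx, "SHADOWED", False))  # → True
```

Renaming the file (e.g. to `export_onnx.py`) removes the shadowing and lets `import onnx` find the real package again.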
There is a lot of overlap with #918, let's wait for it to be merged.
@jiminha Can you rebase on main and fix the merge conflicts?
Please share your script and the command to run it here; that makes investigation much easier and you'll get a solution much faster. My best guess without this information is...