"ValueError: Trying to export a codesage model" while trying to export codesage/codesage-large
System Info
optimum 1.23.2
MacOS 14.7
Python 3.9
Who can help?
@michaelbenayoun
Information
- [ ] The official example scripts
- [ ] My own modified scripts
Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [X] My own task or dataset (give details below)
Reproduction (minimal, reproducible, runnable)
This is a PyTorch embedding model released by AWS, as described here: https://www.linkedin.com/posts/changsha-ma-9ba7a485_yes-code-needs-its-own-embedding-models-activity-7163196644258226176-bFSW
I am hoping to use it for RAG-based code understanding under Ollama.
```shell
huggingface-cli download codesage/codesage-large
optimum-cli export onnx --model codesage/codesage-large codesage-large-onnx --task default --trust-remote-code
```
The error:

```
ValueError: Trying to export a codesage model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as custom_onnx_configs. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type codesage to be supported natively in the ONNX export.
```
I am grateful for any help you can provide!
Expected behavior
An exported ONNX file.