Trying to convert to ONNX
I'm trying to convert this model to ONNX for gaze detection with the following command:
optimum-cli export onnx --model vikhyatk/moondream2 --trust-remote-code --task text-generation ./onnx_moondream
But it gives me:
ValueError: Trying to export a moondream1 model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as `custom_onnx_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type moondream1 to be supported natively in the ONNX export.
Could you please share some more info on this? How can I convert it to ONNX?
Full log:
2025-02-09 21:46:43.761063: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:9261] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2025-02-09 21:46:43.761093: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:607] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2025-02-09 21:46:43.761688: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1515] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2025-02-09 21:46:43.765101: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2025-02-09 21:46:44.296031: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Traceback (most recent call last):
  File "/home/maifee/.local/bin/optimum-cli", line 8, in <module>
    sys.exit(main())
  File "/home/maifee/.local/lib/python3.10/site-packages/optimum/commands/optimum_cli.py", line 208, in main
    service.run()
  File "/home/maifee/.local/lib/python3.10/site-packages/optimum/commands/export/onnx.py", line 265, in run
    main_export(
  File "/home/maifee/.local/lib/python3.10/site-packages/optimum/exporters/onnx/__main__.py", line 375, in main_export
    onnx_export_from_model(
  File "/home/maifee/.local/lib/python3.10/site-packages/optimum/exporters/onnx/convert.py", line 1033, in onnx_export_from_model
    raise ValueError(
ValueError: Trying to export a moondream1 model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as `custom_onnx_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type moondream1 to be supported natively in the ONNX export.
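From the guide linked in the error, the expected fix seems to be to subclass an OnnxConfig and pass it to `main_export` via `custom_onnx_configs`. Below is a minimal sketch of that pattern; note that `MoondreamTextOnnxConfig` is a hypothetical name, the choice of `TextDecoderOnnxConfig` as the base class is an assumption, and the default attribute names expected by `NormalizedTextConfig` may not match moondream's config.json:

from transformers import AutoConfig
from optimum.exporters.onnx import main_export
from optimum.exporters.onnx.config import TextDecoderOnnxConfig
from optimum.utils import NormalizedTextConfig

model_id = "vikhyatk/moondream2"

# Hypothetical config for the text decoder only -- moondream is multimodal,
# and the vision encoder is not covered by this sketch.
class MoondreamTextOnnxConfig(TextDecoderOnnxConfig):
    DEFAULT_ONNX_OPSET = 14
    # NormalizedTextConfig expects num_layers / num_attention_heads /
    # hidden_size on the model config; if moondream names these differently,
    # remap them with NormalizedTextConfig.with_args(...) after checking
    # the model's config.json.
    NORMALIZED_CONFIG_CLASS = NormalizedTextConfig

config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)
onnx_config = MoondreamTextOnnxConfig(config=config, task="text-generation")

main_export(
    model_id,
    output="onnx_moondream",
    task="text-generation",
    trust_remote_code=True,
    custom_onnx_configs={"model": onnx_config},
)

Even if this runs, it would only export the language model; since moondream is multimodal, the vision encoder would need its own config, which may be why the project ships prebuilt ONNX weights in the *.mf archives mentioned below.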
@maifeeulasad Hi, maybe this helps: you can use this code to unpack the ONNX files from a *.mf file (don't forget to put /clients/python/moondream/moonfile.py in the same directory as this .py):
import os
import sys

# moonfile.py comes from clients/python/moondream/ in the moondream repo
from moonfile import unpack

# Expect the path to a .mf archive as the first command-line argument
if len(sys.argv) > 1 and os.path.isfile(sys.argv[1]):
    # unpack() yields (filename, contents) pairs; write each file to disk
    for filename, contents in unpack(sys.argv[1]):
        with open(filename, 'wb') as fp:
            fp.write(contents)
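Assuming you save the script as unpack_mf.py (a hypothetical name) next to moonfile.py, you would run it as:

python unpack_mf.py path/to/model.mf

and the unpacked files are written into the current directory.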
@betweenus I couldn't find any *.mf files. Could you please provide a bit more detail? It would be really helpful.
Links to these files are on the main page of the project: