
Request: Need models in onnx and .pt format (not just .bin and .config)

Open 311-code opened this issue 1 year ago • 6 comments

I am requesting the model in another format because I cannot convert it myself without the proper model configuration file (I've tried). I need it in ONNX or .pt format specifically for a Unity application called DepthViewer. Can we make this happen?

Here is a list of models and the formats they are available in. As you can see, depth-anything has ONNX; I was hoping Marigold could provide this as well: https://airtable.com/appjWiS91OlaXXtf0/shrchKmROzpsq0HFw/tblviBOLphAw5Befd

311-code avatar Feb 05 '24 05:02 311-code

I converted Marigold to ONNX and uploaded to HF in case it's useful to you: https://huggingface.co/julienkay/Marigold

julienkay avatar Mar 08 '24 10:03 julienkay

would love to have it in CoreML too!

alex-seville avatar Mar 30 '24 14:03 alex-seville

Please share the scripts that automate these steps here; we will consider including this harness in the repository at a later stage.

toshas avatar Jun 11 '24 05:06 toshas

@julienkay I see the folder structure is different: the VAE encoder and decoder were moved to the top level. Is there a way to keep the original structure? If this was done intentionally, how is this ONNX checkpoint used?

toshas avatar Jun 11 '24 05:06 toshas

Essentially, I just used the conversion script from diffusers.

Here is the code, to be run in a Jupyter/Colab notebook (probably not all of these packages are required):

!pip install wheel wget
!pip install git+https://github.com/huggingface/diffusers.git
!pip install transformers onnxruntime onnx torch ftfy spacy scipy accelerate
!pip install onnxruntime-directml --force-reinstall
!pip install protobuf==3.20.2
!python -m wget https://raw.githubusercontent.com/huggingface/diffusers/main/scripts/convert_stable_diffusion_checkpoint_to_onnx.py -o convert_stable_diffusion_checkpoint_to_onnx.py
!mkdir model

!python convert_stable_diffusion_checkpoint_to_onnx.py --model_path="Bingxin/Marigold" --output_path="model/marigold_onnx"

julienkay avatar Jun 11 '24 08:06 julienkay

@julienkay I see the folder structure is different: the VAE encoder and decoder were moved to the top level. Is there a way to keep the original structure? If this is done intentionally, how is this onnx checkpoint used?

AFAIK, separate encoder/decoder files seem to be the "standard" way for ONNX-based pipelines in diffusers. Like the OP, my interest was mostly in using the ONNX checkpoints in another inference framework (in this case Unity Sentis). I guess using them in Python would require adding a separate ONNX-specific pipeline, like the ones found in diffusers/optimum for the most common pipelines such as SD 1.5 / XL.
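To make the separate-component layout concrete: each exported subfolder (vae_encoder, unet, vae_decoder, ...) becomes its own session, and a pipeline simply chains them. The sketch below uses stand-in callables so it runs without the real checkpoints; the class name, the scheduler loop, and the 0.18215 latent scale are illustrative assumptions (the scale is the SD-style VAE factor, not a verified Marigold value). With onnxruntime, each callable would wrap `InferenceSession("<subfolder>/model.onnx").run(...)`.

```python
# Structural sketch: chaining separately exported ONNX components.
import numpy as np

class OnnxDepthPipeline:
    """Hypothetical pipeline wiring encoder -> denoiser -> decoder."""

    def __init__(self, vae_encoder, unet, vae_decoder):
        self.vae_encoder = vae_encoder
        self.unet = unet
        self.vae_decoder = vae_decoder

    def __call__(self, image, steps=2):
        latents = self.vae_encoder(image)
        for _ in range(steps):  # placeholder for the real scheduler loop
            latents = latents - self.unet(latents)
        return self.vae_decoder(latents)

# Stub "sessions" so the sketch runs without any model files.
pipe = OnnxDepthPipeline(
    vae_encoder=lambda img: img * 0.18215,   # SD-style latent scaling
    unet=lambda lat: np.zeros_like(lat),     # pretend the noise is zero
    vae_decoder=lambda lat: lat / 0.18215,
)
depth = pipe(np.ones((1, 3, 8, 8), dtype=np.float32))
print(depth.shape)  # (1, 3, 8, 8)
```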

julienkay avatar Jun 11 '24 08:06 julienkay