
Error: Could not detect model type when loading checkpoint

Open txhno opened this issue 1 year ago • 13 comments

Expected Behavior

The checkpoint "flux1-dev-fp8.safetensors" should load successfully using the CheckpointLoaderSimple node.

Actual Behavior

An error occurs when attempting to load the checkpoint, stating "ERROR: Could not detect model type of: [path]/models/checkpoints/flux1-dev-fp8.safetensors"

Steps to Reproduce

  1. Attempt to load the checkpoint "flux1-dev-fp8.safetensors" using CheckpointLoaderSimple node
  2. Queue the workflow

Debug Logs

Error occurred when executing CheckpointLoaderSimple: ERROR: Could not detect model type of: [path]/models/checkpoints/flux1-dev-fp8.safetensors
File "[path]/execution.py", line 152, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
File "[path]/execution.py", line 82, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "[path]/execution.py", line 75, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "[path]/nodes.py", line 518, in load_checkpoint
    out = comfy.sd.load_checkpoint_guess_config(ckpt_path, output_vae=True, output_clip=True, embedding_directory=folder_paths.get_folder_paths("embeddings"))
File "[path]/comfy/sd.py", line 513, in load_checkpoint_guess_config
    raise RuntimeError("ERROR: Could not detect model type of: {}".format(ckpt_path))

I'm on the newest version of ComfyUI; I cloned master a few minutes ago, copied over the models and custom_nodes dirs, and tried again. ComfyUI Manager also says I'm up to date.

txhno avatar Aug 12 '24 11:08 txhno
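For anyone debugging this: `load_checkpoint_guess_config` infers the model type from the checkpoint's tensor key names, so listing a file's keys usually explains the error. Below is a minimal sketch of such a check; the key prefixes are assumptions based on common SD/Flux layouts, not ComfyUI's exact detection code, and the keys themselves can be read with `safetensors.safe_open(path, framework="pt").keys()`:

```python
# Sketch: classify a .safetensors file from its tensor key names so you know
# which ComfyUI loader node to use. The prefixes checked here are assumptions
# based on common SD/Flux layouts, not ComfyUI's exact detection logic.
def guess_loader(keys):
    has_lora = any(".lora_up." in k or ".lora_down." in k or k.startswith("lora_")
                   for k in keys)
    has_unet = any(k.startswith("model.diffusion_model.") for k in keys)
    has_vae = any(k.startswith("first_stage_model.") or k.startswith("vae.")
                  for k in keys)
    if has_lora:
        return "LoraLoader (this file is a LoRA, not a checkpoint)"
    if has_unet and has_vae:
        return "CheckpointLoaderSimple (all-in-one checkpoint)"
    return "UNETLoader / Load Diffusion Model (bare diffusion model)"
```

If this reports a bare diffusion model, the file belongs under `models/unet` (or `models/diffusion_models` in newer installs) rather than `models/checkpoints`.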

same

cczw2010 avatar Aug 12 '24 14:08 cczw2010

What is the url of the model? And what is the size of the model?

ltdrdata avatar Aug 12 '24 15:08 ltdrdata

Same question; my model URL is https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/flux1-dev.safetensors (23.8 GB).

kane-le avatar Aug 12 '24 16:08 kane-le

Same question; my model URL is https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/flux1-dev.safetensors (23.8 GB).

That is not a checkpoint model. You cannot load it via the Checkpoint Loader node; you have to use Load Diffusion Model instead.

image

ltdrdata avatar Aug 12 '24 16:08 ltdrdata

Same question; my model URL is https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/flux1-dev.safetensors (23.8 GB).

That is not a checkpoint model. You cannot load it via the Checkpoint Loader node; you have to use Load Diffusion Model instead.

image

It works for me! Thank you! ^0^ [image] I was following this guide; it confused me.

kane-le avatar Aug 12 '24 17:08 kane-le

Same with me.

  1. macbook m1 max 32gb
  2. latest comfyui
  3. flux1-dev-fp8.sft and flux1-schnell-fp8.sft

The full version of flux1[^1] should be loaded by the diffusion loader, and the lite version[^2] by the checkpoint loader, right?

[^1]: fp16, size ~23 GB
[^2]: fp8, size ~12 GB

Komorebi-Nine avatar Aug 13 '24 08:08 Komorebi-Nine

Same with me.

  1. macbook m1 max 32gb
  2. latest comfyui
  3. flux1-dev-fp8.sft and flux1-schnell-fp8.sft

The full version of flux1[^1] should be loaded by the diffusion loader, and the lite version[^2] by the checkpoint loader, right?

[^1]: fp16, size ~23 GB
[^2]: fp8, size ~12 GB

The 17GB fp8 file is a checkpoint and can be loaded via the Checkpoint Loader. The other models should be loaded via the Diffusion Loader.

ltdrdata avatar Aug 13 '24 09:08 ltdrdata
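For reference, the file sizes quoted in this thread map roughly onto loaders. Here is a sketch of that mapping; the size bands come straight from the numbers mentioned here (~23.8 GB fp16 bare model, ~17 GB fp8 all-in-one checkpoint, ~12 GB fp8 bare model) and are an assumption rather than a reliable detector — inspecting the tensor keys is the robust check:

```python
import os

# Sketch: rough, size-based guess for which kind of Flux file you have.
# Size bands are taken from this thread and are NOT a reliable detector.
def guess_flux_file(size_gb):
    if size_gb > 20:
        return "bare fp16 model -> Load Diffusion Model"
    if size_gb > 15:
        return "fp8 all-in-one checkpoint -> CheckpointLoaderSimple"
    if size_gb > 10:
        return "fp8 bare model -> Load Diffusion Model"
    return "smaller file (NF4/GGUF quant or a LoRA?) -> dedicated loader"

def guess_from_path(path):
    # convert the on-disk size to GB before classifying
    return guess_flux_file(os.path.getsize(path) / 1024**3)
```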

With the regular Checkpoint Loader, I got this error. Only with NF4 did it work. https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4

Or GGUF + the DualCLIPLoader. Screenshot_1

dirgunchik2008 avatar Aug 21 '24 15:08 dirgunchik2008

The correct FP8 checkpoint model to load is available at: https://huggingface.co/Comfy-Org/flux1-dev/blob/main/flux1-dev-fp8.safetensors. There seems to have been some confusion leading many people to use the wrong files; this link points to the right one.

taiczhi avatar Aug 27 '24 07:08 taiczhi
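If it helps, here is a minimal stdlib-only sketch for fetching that file. The `resolve/main` URL is an assumed transformation of the `blob/main` link above (Hugging Face's raw-file URL scheme), and the destination path assumes a default ComfyUI folder layout:

```python
import urllib.request
from pathlib import Path

# Sketch: fetch the all-in-one fp8 checkpoint with the stdlib only.
# "resolve/main" is assumed from the "blob/main" page link above;
# the destination assumes a default ComfyUI folder layout.
URL = ("https://huggingface.co/Comfy-Org/flux1-dev/"
       "resolve/main/flux1-dev-fp8.safetensors")

def fetch_flux_fp8(dest_dir="ComfyUI/models/checkpoints"):
    dest = Path(dest_dir) / "flux1-dev-fp8.safetensors"
    dest.parent.mkdir(parents=True, exist_ok=True)
    urllib.request.urlretrieve(URL, dest)  # ~17 GB download
    return dest
```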

With the regular Checkpoint Loader, I got this error. Only with NF4 did it work. https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4

Or GGUF + the DualCLIPLoader. Screenshot_1

That is not a regular checkpoint loader, and that node is only compatible with NF4 checkpoints.

ltdrdata avatar Aug 27 '24 10:08 ltdrdata

The accurate FP8 checkpoint model available for loading can be accessed at: https://huggingface.co/Comfy-Org/flux1-dev/blob/main/flux1-dev-fp8.safetensors. There appears to have been some misunderstanding, leading many to use incorrect models; this link directs you to the precise resource.

I completely agree, but it is also pretty hard to guess the kind of checkpoint or model you downloaded in the past... my collection was multiple TBs when I completely started over and ran into the same problem again. ;(

Freighter avatar Sep 25 '24 13:09 Freighter

fp8 can only be used on an H100.

For V100s or A100s, which models should be used, and which nodes in the workflow should be replaced?

lucasjinreal avatar Nov 10 '24 07:11 lucasjinreal
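On the hardware point: as far as I understand, an H100 (or Ada-class card) is only needed for *native* fp8 compute; the fp8 checkpoint itself can still be loaded on older GPUs, with the math done in fp16/bf16 after casting. A tiny sketch of the capability check — the 8.9 threshold is an assumption from NVIDIA's published compute capabilities (Ada = 8.9, Hopper = 9.0; V100 = 7.0, A100 = 8.0):

```python
# Sketch: decide whether a GPU has native fp8 tensor-core support. The 8.9
# threshold is my reading of NVIDIA's specs (Ada = 8.9, Hopper = 9.0);
# V100 is 7.0 and A100 is 8.0. On those, fp8 weights can still be loaded,
# with computation done after casting up to fp16/bf16.
def has_native_fp8(capability):
    """capability: (major, minor), e.g. from torch.cuda.get_device_capability()."""
    return tuple(capability) >= (8, 9)
```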

A common loader node for all model types would be useful, regardless of whether it's a checkpoint, a Flux model, a Flux NF4 model, a diffusion model, or something else. An output should then indicate which type was loaded, so the connected logic could be switched to the correct handling nodes, as long as there is no single node that can handle it all (CLIP loader, noise generator, CFG guider, scheduler, sampler, and more; the same for LoRAs...).

It's really hard to create a node framework that can handle everything at once (SD1.5, SD2, SDXL, Flux, and others).

My point is that I wish I had just one basic template for any checkpoint/diffuser, so I only need to swap the model rather than the whole template. That would let me create direct comparisons in batch mode between different models of different types.

schoenid avatar Dec 19 '24 01:12 schoenid

Screenshot 2024-12-26 at 9 33 46 PM Hey, I can't find the Load Diffusion Model node. Was it removed? If the UI has been updated, what should I do? I need to run a Flux model like this:

Same question; my model URL is https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/flux1-dev.safetensors (23.8 GB).

yiboliu avatar Dec 27 '24 02:12 yiboliu

Screenshot 2024-12-26 at 9 33 46 PM Hey, I can't find the Load Diffusion Model node. Was it removed? If the UI has been updated, what should I do? I need to run a Flux model like this:

Same question; my model URL is https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/flux1-dev.safetensors (23.8 GB).

image You can find it under Advanced > Loaders.

ltdrdata avatar Dec 27 '24 04:12 ltdrdata

I have a similar problem. I started using ComfyUI today. I'm on Colab and was trying to get a model from Civitai. I tried lots of models, but I keep getting the error "ERROR: Could not detect model type of: /content/drive/MyDrive/ComfyUI/models/checkpoints/loras/Studio_Ghibli_Flux.safetensors". I tried making the dir "loras" and also "Flux.1D" (since the base model is Flux.1D), and I tried putting it directly under checkpoints. I don't know if any other configuration is needed. @ltdrdata I would appreciate it if you could help me today.

adayildizz avatar Feb 04 '25 08:02 adayildizz

I have a similar problem. I started using ComfyUI today. I'm on Colab and was trying to get a model from Civitai. I tried lots of models, but I keep getting the error "ERROR: Could not detect model type of: /content/drive/MyDrive/ComfyUI/models/checkpoints/loras/Studio_Ghibli_Flux.safetensors". I tried making the dir "loras" and also "Flux.1D" (since the base model is Flux.1D), and I tried putting it directly under checkpoints. I don't know if any other configuration is needed. @ltdrdata I would appreciate it if you could help me today.

You should not put LoRA files in the checkpoints dir.

ltdrdata avatar Feb 04 '25 09:02 ltdrdata

I have a similar problem. I started using ComfyUI today. I'm on Colab and was trying to get a model from Civitai. I tried lots of models, but I keep getting the error "ERROR: Could not detect model type of: /content/drive/MyDrive/ComfyUI/models/checkpoints/loras/Studio_Ghibli_Flux.safetensors". I tried making the dir "loras" and also "Flux.1D" (since the base model is Flux.1D), and I tried putting it directly under checkpoints. I don't know if any other configuration is needed. @ltdrdata I would appreciate it if you could help me today.

If you look in the directory /content/drive/MyDrive/ComfyUI/models/checkpoints, there should already be a file named put_checkpoints_here. The same goes for the directory /content/drive/MyDrive/ComfyUI/models/loras, where you should find a file named put_loras_here. Make sure you have the CLIP models in /content/drive/MyDrive/ComfyUI/models/clip.

You could then load your FLUX checkpoint with the Checkpoint Loader (Simple), load your CLIP models with the DualCLIPLoader, and load your LoRAs with the Power Lora Loader (renamed to FLUX Lora Loader in my picture below).

Example: Image

My workflow is a bit too large to add completely as a picture. I've got a framework to load different model and LoRA types, but basically, the outputs of the Checkpoint Loader and of the DualCLIPLoader go to the Lora Loader.

Image

I'm using SamplerCustomAdvanced to sample the images. This also requires the CFGGuider, the BasicScheduler, and an Empty Latent Image node. In principle, the MODEL output of the Lora Loader goes to the CFGGuider, and the CLIP output goes through the prompt conditioners (positive and negative), also to the CFGGuider.

schoenid avatar Feb 05 '25 14:02 schoenid
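The layout described above can be sanity-checked with a short script. This is a sketch assuming a default ComfyUI install (folder names as in the post, no extra_model_paths.yaml overrides):

```python
from pathlib import Path

# Sketch: sanity-check the model folders described above. Assumes a default
# ComfyUI layout (models/checkpoints, models/loras, models/clip) with no
# extra_model_paths.yaml overrides; "base" is your ComfyUI root directory.
def check_model_dirs(base):
    expected = {
        "checkpoints": "all-in-one checkpoints (e.g. flux1-dev-fp8.safetensors)",
        "loras": "LoRA files (e.g. Studio_Ghibli_Flux.safetensors)",
        "clip": "text encoders for the DualCLIPLoader",
    }
    report = {}
    for name, holds in expected.items():
        d = Path(base) / "models" / name
        files = sorted(p.name for p in d.glob("*.safetensors")) if d.is_dir() else []
        report[name] = {"exists": d.is_dir(), "files": files, "holds": holds}
    return report
```

Running it against a Colab install (`check_model_dirs("/content/drive/MyDrive/ComfyUI")`) quickly shows which folders are missing or hold files of the wrong kind.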

Thanks for all the answers.

adayildizz avatar Feb 11 '25 11:02 adayildizz