Error: Could not detect model type when loading checkpoint
Expected Behavior
The checkpoint "flux1-dev-fp8.safetensors" should load successfully using the CheckpointLoaderSimple node.
Actual Behavior
An error occurs when attempting to load the checkpoint, stating "ERROR: Could not detect model type of: [path]/models/checkpoints/flux1-dev-fp8.safetensors"
Steps to Reproduce
- Attempt to load the checkpoint "flux1-dev-fp8.safetensors" using the CheckpointLoaderSimple node
- Queue the workflow
Debug Logs
Error occurred when executing CheckpointLoaderSimple: ERROR: Could not detect model type of: [path]/models/checkpoints/flux1-dev-fp8.safetensors
File "[path]/execution.py", line 152, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "[path]/execution.py", line 82, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "[path]/execution.py", line 75, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "[path]/nodes.py", line 518, in load_checkpoint
out = comfy.sd.load_checkpoint_guess_config(ckpt_path, output_vae=True, output_clip=True, embedding_directory=folder_paths.get_folder_paths("embeddings"))
File "[path]/comfy/sd.py", line 513, in load_checkpoint_guess_config
raise RuntimeError("ERROR: Could not detect model type of: {}".format(ckpt_path))
I'm on the newest version of ComfyUI. I know this because I cloned master a few minutes ago, copied over the models and custom_nodes dirs, and tried again. ComfyUI Manager also says I'm up to date.
same
What is the URL of the model? And what is its size?
Same question. My model URL is https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/flux1-dev.safetensors (23.8 GB).
That is not a checkpoint model. You cannot load that model via the Checkpoint Loader node.
You have to use Load Diffusion Model.
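For anyone unsure which loader a given .safetensors file needs, one rough heuristic is to look at its tensor key prefixes. The prefixes below are assumptions based on common Flux/ComfyUI conventions (bundled checkpoints carry VAE and text-encoder weights alongside the diffusion model; the bare BFL release exposes raw transformer keys), so verify them against your own file. A minimal sketch:

```python
# Sketch: classify a state dict as a full checkpoint vs. a bare diffusion model.
# Key prefixes here are illustrative assumptions, not an official ComfyUI API;
# inspect your own file's keys before relying on them.

def classify_state_dict(keys):
    """Return 'checkpoint' if VAE/CLIP weights are bundled, else 'diffusion_model'."""
    has_unet = any(k.startswith(("model.diffusion_model.", "double_blocks.")) for k in keys)
    has_vae = any(k.startswith(("first_stage_model.", "vae.")) for k in keys)
    has_clip = any(k.startswith(("cond_stage_model.", "text_encoders.")) for k in keys)
    if has_unet and (has_vae or has_clip):
        return "checkpoint"       # load with CheckpointLoaderSimple
    if has_unet:
        return "diffusion_model"  # load with Load Diffusion Model
    return "unknown"

# The bare flux1-dev.safetensors exposes raw block keys, no VAE/CLIP:
bare = ["double_blocks.0.img_attn.qkv.weight", "final_layer.linear.weight"]
# A bundled fp8 checkpoint carries all three components:
bundled = ["model.diffusion_model.double_blocks.0.img_attn.qkv.weight",
           "vae.decoder.conv_in.weight",
           "text_encoders.clip_l.transformer.text_model.embeddings.token_embedding.weight"]

print(classify_state_dict(bare))     # diffusion_model
print(classify_state_dict(bundled))  # checkpoint
```

A file that classifies as "diffusion_model" is the case described above: it will fail in the Checkpoint Loader but load via Load Diffusion Model.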
That is not a checkpoint model. You cannot load that model via the Checkpoint Loader node. You have to use Load Diffusion Model.
It works for me! Thank you! ^0^
I followed this guide, and it confused me.
Same with me.
- macbook m1 max 32gb
- latest comfyui
- flux1-dev-fp8.sft and flux1-schnell-fp8.sft
The full version of flux1[^1] should be loaded with the diffusion loader, but the lite version[^2] should be loaded with the checkpoint loader, right?
[^1]: fp16, ~23 GB. [^2]: fp8, ~12 GB.
The 17 GB fp8 file is a checkpoint and can be loaded via the Checkpoint Loader. Other models should be loaded via the Diffusion Loader.
With the regular Checkpoint Loader, I got this error. Only with NF4 did it work. https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4
Or GGUF + DualCLIPLoader.
The correct FP8 checkpoint model is available at: https://huggingface.co/Comfy-Org/flux1-dev/blob/main/flux1-dev-fp8.safetensors. There seems to have been some confusion, leading many to use the wrong models; this link points to the right file.
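To check what a downloaded file actually contains before pointing a loader at it, you can read the safetensors header with nothing but the standard library: the format stores an 8-byte little-endian header length followed by a JSON table of tensor names. A small sketch (interpreting prefixes such as `vae.` or `text_encoders.` as markers of a bundled checkpoint is an assumption, as above):

```python
# Sketch: list tensor names in a .safetensors file using only the stdlib.
# The safetensors format begins with a u64 little-endian header size,
# followed by that many bytes of JSON mapping tensor names to metadata.
import json
import struct

def safetensors_keys(path):
    """Return the tensor names recorded in a .safetensors file's JSON header."""
    with open(path, "rb") as f:
        (header_len,) = struct.unpack("<Q", f.read(8))  # header size in bytes
        header = json.loads(f.read(header_len))
    return [k for k in header if k != "__metadata__"]
```

Run this on the downloaded file: if the keys include nothing like `vae.` or `text_encoders.`, it is likely not a bundled checkpoint, and the Checkpoint Loader will raise the error from this issue.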
With the regular Checkpoint Loader, I got this error. Only with NF4 did it work. https://github.com/comfyanonymous/ComfyUI_bitsandbytes_NF4
Or GGUF + DualCLIPLoader.
That is not a regular checkpoint loader, and that node is only compatible with NF4 checkpoints.
I completely agree, but it is also pretty hard to guess what kind of checkpoint or model you downloaded in the past. My collection was multiple TBs when I started completely over and ran into the same problem again. ;(
fp8 can only be used on an H100.
For V100s or A100s, which models should be used, and which node in the workflow should be replaced?
A common loader node for all model types would be useful, regardless of whether it's a checkpoint, a Flux model, a Flux NF4 model, a diffusion model, or something else. An output should then tell which type was loaded, so the connected logic could be switched to the correct handling nodes, as long as there is no node that can handle it all (CLIP loader, noise generator, sampler, CFG guider, scheduler, and more; the same for LoRAs).
It's really hard to create a node framework that can handle it all at once (SD1.5, SD2, SDXL, Flux, and others).
My point is that I wish to have only one basic template for any checkpoint/diffuser, so I only need to change the model rather than the whole template, to be able to create direct comparisons in batch mode between different models of different types.
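The "one template, many models" wish above can be sketched as a tiny dispatch layer. Everything here is illustrative and hypothetical, not a real ComfyUI API; a real implementation would call the actual loader nodes:

```python
# Sketch: route each model through the right (hypothetical) loader so batch
# comparisons only need the model list changed, never the template itself.

MODELS = [
    ("sd15.safetensors", "checkpoint"),
    ("flux1-dev.safetensors", "flux_diffusion"),
]

def load_model(path, kind):
    """Placeholder loaders; real code would invoke the matching ComfyUI nodes."""
    loaders = {
        "checkpoint": lambda p: f"CheckpointLoaderSimple -> {p}",
        "flux_diffusion": lambda p: f"UNETLoader -> {p}",
    }
    return loaders[kind](path)

def run_comparison(models=MODELS):
    # One template: the downstream graph stays fixed, only the loader branch differs.
    return [load_model(path, kind) for path, kind in models]

print(run_comparison())
```

The design point is that detection and dispatch live in one place, so swapping SD1.5 for Flux changes a table entry rather than the whole workflow.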
Hey, I can't find Load Diffusion Model. Was it deleted? If the UI has been updated, what should I do? I need to run a flux model like this -
Same question. My model URL is https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/flux1-dev.safetensors (23.8 GB).
You can find it under advanced > loaders.
I have a similar problem. I started using ComfyUI today. I am using Colab and was trying to get a model from civitai. I tried lots of models, but the error I get is again "ERROR: Could not detect model type of: /content/drive/MyDrive/ComfyUI/models/checkpoints/loras/Studio_Ghibli_Flux.safetensors". I tried making the dir loras and Flux.1D, since the base model is Flux.1D. I also tried adding it directly under checkpoints. I don't know if any other configuration is needed. @ltdrdata I would appreciate it if you could help me today.
You should not download LoRA files into the checkpoints dir.
If you look in the directory /content/drive/MyDrive/ComfyUI/models/checkpoints, there should already be a file put_checkpoints_here. The same goes for /content/drive/MyDrive/ComfyUI/models/loras, where you should find a file put_loras_here. Make sure you have the CLIP models in /content/drive/MyDrive/ComfyUI/models/clip.
You can then load your FLUX checkpoint with the Checkpoint Loader (Simple), your CLIP models with the DualCLIPLoader, and your LoRAs with the Power Lora Loader (in my picture below renamed to FLUX Lora Loader).
Example:
My workflow is a bit too large to add completely as a picture. I've got a framework to load different model and LoRA types, but basically, the output of the Checkpoint Loader and of the DualCLIPLoader goes to the Lora Loader.
I'm using SamplerCustomAdvanced to sample the images. This also requires the CFGGuider, the BasicScheduler, and an Empty Latent Image node. Principally, the MODEL output of the Lora Loader goes to the CFGGuider, and the CLIP output goes through the prompt conditioners (positive and negative), also to the CFGGuider.
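The directory advice above can be double-checked with a short script. The root path and folder names are taken from this thread's Colab setup and are assumptions; adjust them to your install:

```python
# Sketch: sanity-check a ComfyUI models directory layout.
# The subdirectory names mirror the ones discussed in this thread.
from pathlib import Path

def check_layout(root):
    """Report which .safetensors files sit in each expected subdirectory."""
    expected = ("checkpoints", "loras", "clip")
    report = {}
    for sub in expected:
        d = Path(root) / sub
        # None means the directory itself is missing.
        report[sub] = sorted(p.name for p in d.glob("*.safetensors")) if d.is_dir() else None
    return report

# Example (Colab path from this thread):
# check_layout("/content/drive/MyDrive/ComfyUI/models")
```

A LoRA file showing up under "checkpoints" (or nested inside it, as in the comment above) is exactly the kind of misplacement that produces the "Could not detect model type" error.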
Thanks for all the answers.

