ValueError: The checkpoint you are trying to load has model type `florence2` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
### System Info
Windows 10 x64, PyTorch 2.4.0+cu124, Python 3.11.8, transformers 4.46.0.dev0
### Who can help?
No response
### Information
- [X] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
ComfyUI Error Report
Error Details
- Node Type: DownloadAndLoadFlorence2Model
- Exception Type: ValueError
- Exception Message: The checkpoint you are trying to load has model type `florence2` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
Stack Trace
File "C:\pinokio\ComfyUI_windows_portable\ComfyUI\execution.py", line 323, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\pinokio\ComfyUI_windows_portable\ComfyUI\execution.py", line 198, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\pinokio\ComfyUI_windows_portable\ComfyUI\execution.py", line 169, in _map_node_over_list
process_inputs(input_dict, i)
File "C:\pinokio\ComfyUI_windows_portable\ComfyUI\execution.py", line 158, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\pinokio\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Florence2\nodes.py", line 97, in loadmodel
model = AutoModelForCausalLM.from_pretrained(model_path, attn_implementation=attention, device_map=device, torch_dtype=dtype,trust_remote_code=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\pinokio\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\auto\auto_factory.py", line 526, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\pinokio\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 1027, in from_pretrained
raise ValueError(
### Expected behavior
DownloadAndLoadFlorence2Model
The checkpoint you are trying to load has model type `florence2` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
I met the same issue on Linux with transformers v4.45.1.
Hmmm, `florence2` should be using remote code, and I see `trust_remote_code` is set to `True` in there.
@Rocketknight1, could you take a look at this please?
Hi @Romanio1997 @cct1018, can you tell me which checkpoint you're trying to load so I can investigate?
I checked microsoft/Florence-2-base and microsoft/Florence-2-large and could not reproduce the issue.
@Rocketknight1 try MiaoshouAI/Florence-2-base-PromptGen-v1.5
@andrewstoliarov I can reproduce the issue there - the cause is that the model is misconfigured. Specifically, its `config.json` is missing the `auto_map` key that is needed to load remote code correctly.
If you open a PR to that model repo to copy the `auto_map` key from a working florence2 model like `microsoft/Florence-2-large`, that should fix the issue!
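For reference, the `auto_map` key points each Auto class at the remote-code module shipped inside the model repo. The block below is a sketch of what a working florence2 checkpoint carries, reproduced from memory of `microsoft/Florence-2-large`'s `config.json` - verify the exact class paths against that file before copying:

```json
{
  "model_type": "florence2",
  "architectures": ["Florence2ForConditionalGeneration"],
  "auto_map": {
    "AutoConfig": "configuration_florence2.Florence2Config",
    "AutoModelForCausalLM": "modeling_florence2.Florence2ForConditionalGeneration"
  }
}
```

Without this key, `AutoConfig.from_pretrained` has no way to locate the custom classes in the repo, even when `trust_remote_code=True` is passed.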
> Hi @Romanio1997 @cct1018, can you tell me which checkpoint you're trying to load so I can investigate?
I used this workflow:
https://openart.ai/workflows/myaiforce/svOEmbK8C4zyJsCYmqnx
@Romanio1997 I'm afraid we can't really debug third-party tools like that if we don't know which checkpoints they're using! My suspicion is that it's loading a florence2 checkpoint on the Hub somewhere that is misconfigured, and this is not a bug in transformers.
@Romanio1997 Here the question is about the checkpoint, not the workflow. In fact, you can use any other checkpoint from the list, as it won't significantly affect the result in this workflow.
> @Romanio1997 I'm afraid we can't really debug third-party tools like that if we don't know which checkpoints they're using! My suspicion is that it's loading a `florence2` checkpoint on the Hub somewhere that is misconfigured, and this is not a bug in `transformers`.
The `florence2` model was downloaded automatically from Hugging Face.
Possible causes of the error:
- Unsupported architecture: the checkpoint refers to a model architecture that is not supported by the version of the Transformers library you have installed. In this case it is the Florence2 architecture, which may not yet be integrated into the library.
- Outdated Transformers: the error may occur because your version of the Transformers library is out of date and does not support newer architectures such as Florence2.
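To see why a missing `auto_map` produces exactly this error message, here is a simplified sketch of the dispatch decision that `AutoConfig.from_pretrained` makes. This is illustrative pseudologic, not the real transformers source - the registry set and function name below are stand-ins:

```python
# Stand-in for the library's built-in model_type registry (CONFIG_MAPPING).
KNOWN_MODEL_TYPES = {"bert", "llama", "t5"}

def resolve_loader(config: dict, trust_remote_code: bool) -> str:
    """Decide how a checkpoint's config would be loaded (simplified sketch)."""
    model_type = config.get("model_type")
    if model_type in KNOWN_MODEL_TYPES:
        return "library"       # architecture ships with transformers itself
    if trust_remote_code and "auto_map" in config:
        return "remote_code"   # load the classes from the model repo
    # Neither a known architecture nor loadable remote code -> the error
    # seen throughout this thread, even with trust_remote_code=True:
    raise ValueError(
        f"The checkpoint you are trying to load has model type `{model_type}` "
        "but Transformers does not recognize this architecture."
    )

# A florence2 config without auto_map fails; one that declares it loads fine:
broken = {"model_type": "florence2"}
fixed = {"model_type": "florence2",
         "auto_map": {"AutoConfig": "configuration_florence2.Florence2Config"}}
```

This matches the two failure modes reported here: either transformers is too old to know the architecture natively, or (as with the PromptGen checkpoint) the repo's config never declares where its remote code lives.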
> @Romanio1997 Here the question is about the checkpoint, not the workflow. In fact, you can use any other checkpoint from the list, as it won't significantly affect the result in this workflow.
I know. I just wanted to give you all the data related to the error.
> I checked `microsoft/Florence-2-base` and `microsoft/Florence-2-large` and could not reproduce the issue.
I changed the checkpoint from `MiaoshouAI/Florence-2-base-PromptGen-v1.5` to `microsoft/Florence-2-large`, and I didn't get that error again. But it still reports an error:
`Error loading model or tokenizer: Unrecognized configuration class <class 'transformers_modules.large.configuration_florence2.Florence2Config'> to build an AutoTokenizer.`
I have a fresh, updated ComfyUI installation, transformers is updated, and the Miaoshouai_Tagger custom node says:
Miaoshouai_Tagger
The checkpoint you are trying to load has model type `florence2` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
> I have a fresh, updated ComfyUI installation, transformers is updated, and the Miaoshouai_Tagger custom node says:
>
> Miaoshouai_Tagger The checkpoint you are trying to load has model type `florence2` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

Did you find a solution?
In my case it was due to an incomplete download of the model file (done while running the node). It leaves some files in `models\LLM\Florence-2-base-PromptGen-v1.5.huggingface\download\`, such as `model..incomplete` and its `.lock`. The fix: uninstall the node AND clear both `models\LLM\Florence` directories, then reinstall the node and run it, watching the model file being downloaded. If everything went fine, the model has been moved (~3 GB and ~1 GB).
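If you want to check for that situation before deleting everything, a small helper like the one below can list leftover partial-download artifacts. This is a hypothetical snippet, not part of ComfyUI or the node - `find_incomplete_downloads` is a name invented here, and the extensions are taken from the comment above:

```python
from pathlib import Path

def find_incomplete_downloads(root: str) -> list[Path]:
    """Return leftover partial-download artifacts (*.incomplete, *.lock)
    anywhere under the given model directory."""
    root_path = Path(root)
    return sorted(p for p in root_path.rglob("*")
                  if p.suffix in {".incomplete", ".lock"})

# Example (adjust the path to your ComfyUI install):
# leftovers = find_incomplete_downloads(r"models\LLM")
# if leftovers: the download never finished - clear and re-download.
```

Any hits mean the download never completed and the cache directory should be cleared before retrying.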
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.