Error loading model: 'JC_Models' object has no attribute 'model'
Please assist; it isn't working at all, even with this basic workflow. I'm using the HF version.
This error means the model didn't finish loading correctly. Please update both ComfyUI and ComfyUI-JoyCaption to the latest versions, restart ComfyUI, and reinstall the node's dependencies from its folder:

pip install -r requirements.txt

If it still fails, delete and redownload the model under models/LLM, and share your console log so we can check further.
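Before redownloading, it can help to sanity-check whether the manual download is complete. A minimal sketch, assuming a typical Hugging Face transformers checkpoint layout (the file names checked here are assumptions, and the demo runs against a temp dir; point MODEL_DIR at your real models/LLM/&lt;model&gt; folder instead):

```shell
# Demo against a temp dir; substitute your actual model folder.
MODEL_DIR="$(mktemp -d)"
touch "$MODEL_DIR/config.json"   # simulate a partial download

# Report any expected checkpoint files that are absent.
for f in config.json tokenizer_config.json; do
  [ -f "$MODEL_DIR/$f" ] || echo "missing: $f"
done
# prints: missing: tokenizer_config.json
```

If anything is reported missing, a full redownload of the model folder is the safest fix.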
I am seeing the same issue since a recent ComfyUI update. Been working fine for months.
The files are all present in the /models/LLM/xxx folder.
got prompt
Loading FP8 model with automatic configuration...
Prompt executed in 0.78 seconds

This is all I see in the console, regardless of which model I try to use.
So is this problem caused by the update to PyTorch 2.9.1+cu130?
I have just updated ComfyUI, and now the node is working again. I assume the issue was actually in Comfy and not the JC node.
Thank you for the good news.
Unfortunately, I updated ComfyUI and it still didn't work properly. After a series of attempts, I found where the problem lay: I think the "Add Custom Model" feature disrupts loading for users who previously downloaded the models manually.

What I did was delete the "llama-joycaption-beta-one" directory under /comfyui/models/llm and rename the manually downloaded "LLAMa-Joycaption-beta-one-HF-LLAVA" model directory to "llama-joycaption-beta-one". Now I can use it normally again.
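The workaround above can be sketched as two shell commands. This is only an illustration (it runs against a throwaway temp dir; replace $ROOT with your actual ComfyUI install path, and back up the folders before deleting anything):

```shell
# Demo under a temp dir; use your real /comfyui/models/llm as $ROOT.
ROOT="$(mktemp -d)/comfyui/models/llm"
mkdir -p "$ROOT/llama-joycaption-beta-one" \
         "$ROOT/LLAMa-Joycaption-beta-one-HF-LLAVA"

# 1. Remove the directory created by "Add Custom Model".
rm -rf "$ROOT/llama-joycaption-beta-one"

# 2. Rename the manually downloaded model folder to the expected name.
mv "$ROOT/LLAMa-Joycaption-beta-one-HF-LLAVA" \
   "$ROOT/llama-joycaption-beta-one"
```

After this, only "llama-joycaption-beta-one" remains, containing the manually downloaded weights, which appears to be the name the loader expects.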