Error in ComfyUI when I try to load a LoRA
Error occurred when executing DiffusersPipelineLoader:
Incorrect path_or_model_id: 'I:\HunyuanDiT\ComfyUI\custom_nodes\comfyui-hydit....\models\loras\last.ckpt'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
File "I:\HunyuanDiT\ComfyUI\execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
File "I:\HunyuanDiT\ComfyUI\execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "I:\HunyuanDiT\ComfyUI\execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "I:\HunyuanDiT\ComfyUI\custom_nodes\comfyui-hydit\nodes.py", line 109, in create_pipeline
    gen = End2End(args_hunyuan[0], Path(os.path.join(HUNYUAN_PATH, pipeline_folder_name)), LOAR_PATH=LORA_PATH)
File "I:\HunyuanDiT\ComfyUI\custom_nodes\comfyui-hydit\hydit\inference_comfyui.py", line 241, in __init__
    self.model.load_adapter(LOAR_PATH)
File "I:\ANACONDA3\envs\HUNYUANDIT\Lib\site-packages\transformers\integrations\peft.py", line 176, in load_adapter
    adapter_config_file = find_adapter_config_file(
File "I:\ANACONDA3\envs\HUNYUANDIT\Lib\site-packages\transformers\utils\peft_utils.py", line 88, in find_adapter_config_file
    adapter_cached_filename = cached_file(
File "I:\ANACONDA3\envs\HUNYUANDIT\Lib\site-packages\transformers\utils\hub.py", line 463, in cached_file
    raise EnvironmentError(
I trained the LoRA in Kohya_ss, but I can't run it in ComfyUI.
Should I create an adapter_config.json file for my LoRA? How do I do that?
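For background on what the loader is looking for: the traceback ends in transformers' `load_adapter()`, which resolves a PEFT-format adapter, i.e. a *folder* containing an `adapter_config.json` plus the weights file, not a single `.ckpt`. A hedged sketch of what that config looks like; the field names follow `peft.LoraConfig`, but the values and `target_modules` here are hypothetical:

```python
import json
from pathlib import Path

# Illustrative only: a minimal PEFT-style adapter_config.json.
# Field names follow peft.LoraConfig; the values and target_modules
# below are hypothetical, not taken from HunyuanDiT.
adapter_config = {
    "peft_type": "LORA",
    "r": 64,
    "lora_alpha": 32,
    "target_modules": ["to_q", "to_k", "to_v"],  # hypothetical module names
    "lora_dropout": 0.0,
    "bias": "none",
}

adapter_dir = Path("my_lora")  # load_adapter() expects a folder, not a file
adapter_dir.mkdir(exist_ok=True)
(adapter_dir / "adapter_config.json").write_text(json.dumps(adapter_config, indent=2))

# The folder would also need an adapter_model.safetensors with PEFT-format
# weight keys; a Kohya .ckpt cannot simply be renamed into place.
print(sorted(p.name for p in adapter_dir.iterdir()))
```

Writing this JSON by hand is unlikely to fix a Kohya checkpoint on its own, because the weight key names inside the `.ckpt` also differ from PEFT's format.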
Did you solve it?
no
The newly released version can load LoRAs from Kohya, as of today.
New error; can you tell me what it could be?
Error occurred when executing CLIPTextEncode:
'tuple' object has no attribute 'pop'
File "I:\HunyuanDiT\ComfyUI\execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
File "I:\HunyuanDiT\ComfyUI\execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "I:\HunyuanDiT\ComfyUI\execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "I:\HunyuanDiT\ComfyUI\nodes.py", line 59, in encode
    cond = output.pop("cond")
I did this, and the error disappeared:

git reset --hard 90389b3b8a69c08c3ed0bcc9d87a92246578a8e3

But in the console there are errors suggesting the LoRA is not working:
lora key not loaded: lora_te1_encoder_layer_8_attention_self_value.lora_down.weight
lora key not loaded: lora_te1_encoder_layer_8_attention_self_value.lora_up.weight
lora key not loaded: lora_te1_encoder_layer_8_intermediate_dense.alpha
lora key not loaded: lora_te1_encoder_layer_8_intermediate_dense.lora_down.weight
lora key not loaded: lora_te1_encoder_layer_8_intermediate_dense.lora_up.weight
lora key not loaded: lora_te1_encoder_layer_8_output_dense.alpha
lora key not loaded: lora_te1_encoder_layer_8_output_dense.lora_down.weight
lora key not loaded: lora_te1_encoder_layer_8_output_dense.lora_up.weight
lora key not loaded: lora_te1_encoder_layer_9_attention_output_dense.alpha
lora key not loaded: lora_te1_encoder_layer_9_attention_output_dense.lora_down.weight
lora key not loaded: lora_te1_encoder_layer_9_attention_output_dense.lora_up.weight
lora key not loaded: lora_te1_encoder_layer_9_attention_self_key.alpha
lora key not loaded: lora_te1_encoder_layer_9_attention_self_key.lora_down.weight
lora key not loaded: lora_te1_encoder_layer_9_attention_self_key.lora_up.weight
lora key not loaded: lora_te1_encoder_layer_9_attention_self_query.alpha
lora key not loaded: lora_te1_encoder_layer_9_attention_self_query.lora_down.weight
lora key not loaded: lora_te1_encoder_layer_9_attention_self_query.lora_up.weight
lora key not loaded: lora_te1_encoder_layer_9_attention_self_value.alpha
lora key not loaded: lora_te1_encoder_layer_9_attention_self_value.lora_down.weight
lora key not loaded: lora_te1_encoder_layer_9_attention_self_value.lora_up.weight
lora key not loaded: lora_te1_encoder_layer_9_intermediate_dense.alpha
lora key not loaded: lora_te1_encoder_layer_9_intermediate_dense.lora_down.weight
lora key not loaded: lora_te1_encoder_layer_9_intermediate_dense.lora_up.weight
lora key not loaded: lora_te1_encoder_layer_9_output_dense.alpha
lora key not loaded: lora_te1_encoder_layer_9_output_dense.lora_down.weight
lora key not loaded: lora_te1_encoder_layer_9_output_dense.lora_up.weight
etc...
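Note that every skipped key carries the `lora_te1_` prefix, which in the common Kohya naming convention targets text encoder 1; `lora_unet_` keys target the diffusion backbone. A small sketch that groups keys by component to see what a loader would skip; the `lora_te1_` names are taken from the log above, while the `lora_unet_` key is a hypothetical example:

```python
# Group LoRA state-dict keys by the model component they target.
# Common Kohya convention: lora_te1_/lora_te2_ = text encoders,
# lora_unet_ = the diffusion backbone.
keys = [
    "lora_te1_encoder_layer_8_attention_self_value.lora_down.weight",
    "lora_te1_encoder_layer_8_attention_self_value.lora_up.weight",
    "lora_te1_encoder_layer_9_output_dense.alpha",
    "lora_unet_blocks_0_attn1_to_q.lora_down.weight",  # hypothetical backbone key
]

by_component = {}
for k in keys:
    # Take the component prefix before the module path.
    prefix = k.split("_encoder")[0] if k.startswith("lora_te") else k.split("_blocks")[0]
    by_component.setdefault(prefix, []).append(k)

for component, ks in sorted(by_component.items()):
    print(component, len(ks))  # lora_te1 3 / lora_unet 1

# A loader that only patches the diffusion model applies the lora_unet_*
# group and reports every lora_te1_* key as "not loaded".
```

To run the same check on a real file, the safetensors library's `safe_open(path, framework="pt")` exposes `.keys()` for grouping the actual state dict.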
Have you replaced lora.py following the README?
> Have you replaced lora.py following the README?
Thanks, I replaced it.
After running this command:

python custom_nodes/comfyui-hydit/convert_hunyuan_to_comfyui_lora.py \
    --lora_path ${HunyuanDiT}/log_EXP/001-lora_porcelain_ema_rank64/checkpoints/0000100.pt/adapter_model.safetensors \
    --save_lora_path ${ComfyUI}/models/loras/adapter_model_convert.safetensors
I get this error in the console:
lora key not loaded: lora_te1_encoder_layer_8_attention_self_value.lora_down.weight
lora key not loaded: lora_te1_encoder_layer_8_attention_self_value.lora_up.weight
lora key not loaded: lora_te1_encoder_layer_8_intermediate_dense.alpha
lora key not loaded: lora_te1_encoder_layer_8_intermediate_dense.lora_down.weight
lora key not loaded: lora_te1_encoder_layer_8_intermediate_dense.lora_up.weight
lora key not loaded: lora_te1_encoder_layer_8_output_dense.alpha
lora key not loaded: lora_te1_encoder_layer_8_output_dense.lora_down.weight
lora key not loaded: lora_te1_encoder_layer_8_output_dense.lora_up.weight
lora key not loaded: lora_te1_encoder_layer_9_attention_output_dense.alpha
lora key not loaded: lora_te1_encoder_layer_9_attention_output_dense.lora_down.weight
lora key not loaded: lora_te1_encoder_layer_9_attention_output_dense.lora_up.weight
lora key not loaded: lora_te1_encoder_layer_9_attention_self_key.alpha
lora key not loaded: lora_te1_encoder_layer_9_attention_self_key.lora_down.weight
Without running convert_hunyuan_to_comfyui_lora.py, the LoRA loads with no errors in the console, but it produces strange results.
The LoRA you trained is in Kohya's format, so you don't need this convert script.