ComfyUI_Comfyroll_CustomNodes
Fixed CR Load LoRA loaded_lora deletion
Deleting self.loaded_lora removes the attribute from the instance entirely.
This is not an issue in most cases, as the following lines reassign it with the newly loaded LoRA.
But if self.loaded_lora was not None, its cached path differed from lora_path, and lora_path is an invalid path, then the deletion happens but no new value is ever assigned to self.loaded_lora, because comfy.utils.load_torch_file() raises an exception first.
At the next execution, since self.loaded_lora no longer exists, the "self.loaded_lora is not None" check raises an AttributeError.
In practice, loading a valid LoRA and then trying to load an invalid one locks the node in a broken state until the node is reinitialized (deleted and recreated) or the UI is restarted. A minimal reproduction of the lockout is sketched below.
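For illustration, here is a small self-contained reproduction of the failure mode. The FakeLoader class and its stubbed loader are hypothetical stand-ins for the node; only self.loaded_lora, lora_path and the role of comfy.utils.load_torch_file() come from the actual code.

```python
# Hypothetical stand-in for the node: the class, its load() method and the
# FileNotFoundError stub replace the real node code and comfy.utils.load_torch_file().
class FakeLoader:
    def __init__(self):
        self.loaded_lora = None                      # cache of (lora_path, lora)

    def load(self, lora_path):
        if self.loaded_lora is not None:             # AttributeError on the run *after* a failed load
            if self.loaded_lora[0] != lora_path:
                del self.loaded_lora                 # the attribute vanishes from the instance
        # Stand-in for comfy.utils.load_torch_file(): raises on an invalid path,
        # so the reassignment below never happens.
        if lora_path == "bad/path":
            raise FileNotFoundError(lora_path)
        self.loaded_lora = (lora_path, object())


loader = FakeLoader()
loader.load("good/path.safetensors")     # works, cache populated
try:
    loader.load("bad/path")              # deletion happens, then the load fails
except FileNotFoundError:
    pass
loader.load("good/path.safetensors")     # AttributeError: no attribute 'loaded_lora'
```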
The solution is to replace the deletion with an assignment to None.
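A sketch of the corrected branch, assuming the usual (lora_path, lora) tuple cache used by ComfyUI's stock LoraLoader; only self.loaded_lora, lora_path and comfy.utils.load_torch_file() are taken from this report, and the surrounding structure may differ from the node's exact code:

```python
lora = None
if self.loaded_lora is not None:
    if self.loaded_lora[0] == lora_path:
        lora = self.loaded_lora[1]       # reuse the cached LoRA
    else:
        self.loaded_lora = None          # was: del self.loaded_lora

if lora is None:
    # If this raises on an invalid path, self.loaded_lora is still a valid
    # attribute (None), so the next execution recovers instead of crashing.
    lora = comfy.utils.load_torch_file(lora_path, safe_load=True)
    self.loaded_lora = (lora_path, lora)
```

Assigning None drops the instance's reference to the stale cache just like the deletion did, so the old LoRA can still be garbage collected; the only behavioural change is that the attribute keeps existing.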