
The `widgets_values` array no longer includes INPUT_TYPES entries that are defaultInput/forceInput, breaking all indexing done by any node that updates `widgets_values` during "Run"?

Open · phazei opened this issue 8 months ago · 4 comments

Expected Behavior

I have a node I made almost six months ago. It has been working mostly fine, aside from a combo-box problem caused by a ComfyUI update, which I fixed a couple of months ago.

I updated ComfyUI the other day and I'm seeing a new issue. I believe it's either a bug in ComfyUI, or a significant change in how ComfyUI handles node data during execution and workflow saving.

Image

When I use my node, it works fine, but when I load a workflow from an image generated by my node, the values in the boxes get switched around. A value ends up in a box that should never receive input unless someone is specifically typing a name to save their prompt.

This is a screenshot taken right after generating a picture. I circled in blue where the data is placed once it's generated by the LLM, which is where it's supposed to be: Image

And here is a screenshot after loading the workflow: I circled in blue where the data ends up: Image

In my node I have this code:

def process(self, use_input_text=False, text="", prompt_text="", save_as_key="", load_saved="None", prompt_lists="default", unique_id=None, extra_pnginfo=None):
    # PromptServer comes from ComfyUI's server module: from server import PromptServer
    # Update the prompt text based on the use_input_text toggle
    output_text = prompt_text
    if use_input_text and text is not None:
        output_text = text
        # Send an update to the frontend so it can refresh the prompt widget
        PromptServer.instance.send_sync("prompt-stash-update-prompt", {
            "node_id": unique_id,
            "prompt": text
        })

        # Handle both list and dict formats of extra_pnginfo
        workflow = None
        if isinstance(extra_pnginfo, list) and len(extra_pnginfo) > 0:
            workflow = extra_pnginfo[0].get("workflow")
        elif isinstance(extra_pnginfo, dict):
            workflow = extra_pnginfo.get("workflow")

        if workflow:
            node = next(
                (x for x in workflow["nodes"] if str(x["id"]) == str(unique_id)),
                None
            )
            if node and "widgets_values" in node:
                # Indices assume widgets_values follows the INPUT_TYPES order
                use_input_text_index = 0  # First widget in optional inputs
                prompt_text_index = 2     # Third widget in optional inputs

                # Update the values stored in the workflow metadata
                node["widgets_values"][use_input_text_index] = False  # Force use_input_text to False in metadata
                node["widgets_values"][prompt_text_index] = output_text  # Update the prompt text

    return (output_text,)

I had to add that if workflow section because the output text is generated during execution; unless I write it back into the metadata, it would never be added to the saved workflow. I also need to set use_input_text to False so that when the node is loaded, it uses the saved prompt instead of trying to load a new one. That way the metadata saved in the image properly regenerates the same image. And as you can see in the screenshots, the "Use Input" toggle changes to "Use Prompt" when loading the workflow, so that part is working properly.
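
To make it concrete, here's a minimal sketch with made-up values of what that if workflow block is meant to do to the matching node entry inside extra_pnginfo before the image is saved. It assumes widgets_values follows the INPUT_TYPES order, so use_input_text is index 0 and prompt_text is index 2:

# Minimal sketch, hypothetical values only; "node" stands in for the entry
# found in extra_pnginfo["workflow"]["nodes"] whose id matches unique_id.
node = {"id": 5, "widgets_values": [True, "", "", "", "None", "default"]}
output_text = "prompt generated by the LLM during this run"

node["widgets_values"][0] = False        # reload with "Use Prompt" instead of "Use Input"
node["widgets_values"][2] = output_text  # bake the generated prompt into the saved metadata

print(node["widgets_values"])
# [False, '', 'prompt generated by the LLM during this run', '', 'None', 'default']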

Actual Behavior

.

Steps to Reproduce

.

Debug Logs

.

Other

No response

phazei commented Apr 24, 2025

The last screenshots were taken when I first loaded the saved workflow, fresh start.

If I've already generated an image, then this similar but slightly different behavior happens:

Here's a screenshot of just the node from the screen right after generating the image: Image

I'd like to note that the node actually clears out the value inside the "Save name" box the moment the text in the prompt box changes.

Now, this is a screenshot of the node loaded from the image metadata:

Image

As you can see in the first image, it's a new prompt. But after I load the workflow from the generated image, the beginning of the new prompt shows up in the "Save Name" box, while the main prompt window holds the value from the previous image, the one from my first post here. That old value was totally erased before I generated the new image, yet it came back, and not in the box where I'd have expected it.

Edit: Actually, what's being saved in the prompt box is whatever I typed into it before I hit generate, before the LLM loaded its value in during execution. I tried typing "MONKEY", and that was replaced with the LLM text, yet my process code doesn't overwrite it and instead saves the value to the "Save Prompt" box. So clearly the index must have changed, but my code hasn't.

So the way ComfyUI does indexing either changed or is bugged.

My items come from here: https://github.com/phazei/ConfyUI-node-prompt-stash-saver/blob/main/prompt_stash_saver_node.py#L19

"optional": {
                "use_input_text": ("BOOLEAN", {"default": False, "label_on": "Use Input", "label_off": "Use Prompt"}),
                "text": ("STRING", {"default": "", "defaultInput": True, "tooltip": "Optional input text", "lazy": True}),
                "prompt_text": ("STRING", {"multiline": True, "default": "", "placeholder": "Enter prompt text"}),
                "save_as_key": ("STRING", {"default": "", "placeholder": "Enter key to save as"}),
                "load_saved": ("COMBO", {"default": "None"}), # Will be populated with actual prompts
                "prompt_lists": ("COMBO", {"default": "default"}), # Will be populated with actual lists
            },

And use_input_text is still the first widget, since that part is working, but the index of prompt_text, which used to be the third (index 2 in my code), has somehow changed?
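
For reference, this is the index mapping my code has been assuming, taken straight from the order of those optional inputs (plain Python, just for illustration):

# Enumerate the declared optional-input order to show the indices my node relies on
for index, name in enumerate(["use_input_text", "text", "prompt_text",
                              "save_as_key", "load_saved", "prompt_lists"]):
    print(index, name)
# 0 use_input_text
# 1 text            <- the defaultInput socket
# 2 prompt_text     <- the index my process() code writes the generated prompt to
# 3 save_as_key
# 4 load_saved
# 5 prompt_lists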

phazei commented Apr 24, 2025

The problem is most likely related to what's in: https://github.com/phazei/ConfyUI-node-prompt-stash-saver/tree/main/js

This sounds like a frontend issue that should be reported on: https://github.com/Comfy-Org/ComfyUI_frontend

comfyanonymous commented Apr 24, 2025

I reported it there.

So, I wrote in the text field "THIS VALUE SHOULD BE REPLACED ON RUN/EXECUTE". That's the box whose text is replaced with the input text on execution/run.

Then I grabbed the JSON saved in the image, and I see this:

            "widgets_values": [
                false,
                "\nTHIS VALUE SHOULD BE REPLACED ON RUN/EXECUTE",
                "\nhappy 30-year-old couple, sunny park with green grass and trees, bright sunlit atmosphere, (frisbee toss:1.3), (laughter:1.5), (sporty attire:1.2)",
                "None",
                "default",
                null,
                null
            ]

I don't know if that's generated on the front end or back end.

Yet in the Python code I have:

prompt_text_index = 2
node["widgets_values"][prompt_text_index] 

Yet it's ending up at the wrong index.

But since these are my inputs:

"optional": {
                "use_input_text": ("BOOLEAN", {"default": False, "label_on": "Use Input", "label_off": "Use Prompt"}),
                "text": ("STRING", {"default": "", "defaultInput": True, "tooltip": "Optional input text", "lazy": True}),
                "prompt_text": ("STRING", {"multiline": True, "default": "", "placeholder": "Enter prompt text"}),
                "save_as_key": ("STRING", {"default": "", "placeholder": "Enter key to save as"}),
                "load_saved": ("COMBO", {"default": "None"}), # Will be populated with actual prompts
                "prompt_lists": ("COMBO", {"default": "default"}), # Will be populated with actual lists
            },

So how is the second index being set when I'm setting my index to the third one?
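
If I line the saved array up against my widget names with the "text" entry simply missing, the mismatch is exactly what I'm seeing. This is just my guess at the new ordering (long prompt truncated for brevity):

# My guess at how the saved widgets_values now maps onto widget names,
# assuming the defaultInput "text" widget was not serialized at all
saved = [False,                                             # use_input_text
         "\nTHIS VALUE SHOULD BE REPLACED ON RUN/EXECUTE",  # prompt_text (what I typed)
         "\nhappy 30-year-old couple, ...",                 # save_as_key <- where my index-2 write landed
         "None",                                            # load_saved
         "default",                                         # prompt_lists
         None, None]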

phazei commented Apr 25, 2025

So I added a log:

if node and "widgets_values" in node:
    print("Debug - widgets_values:", node["widgets_values"])

output:

Debug - widgets_values: [True, '\nTHIS VALUE SHOULD BE REPLACED ON RUN/EXECUTE', '', 'None', 'default', None, None]

So it looks like you're no longer including everything from INPUT_TYPES in that list? Are you now excluding anything that is a defaultInput? Is that intentional or a bug? Since the value is still provided to the Python function, was this a coordinated frontend and backend change? Either way, it's going to break any saved workflow for any node that needs to save state on execute/run and relies on the index order.
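
Here's a quick sketch of the shift that would explain both the JSON saved in the image and the debug output above, assuming (my hypothesis, not anything from the ComfyUI codebase) that defaultInput widgets are now dropped from widgets_values:

optional_order = ["use_input_text", "text", "prompt_text", "save_as_key", "load_saved", "prompt_lists"]

old_layout = optional_order                              # what my hard-coded indices were written against
new_layout = [n for n in optional_order if n != "text"]  # hypothesis: defaultInput "text" no longer serialized

print(old_layout.index("prompt_text"))  # 2 -> my node's prompt_text_index
print(new_layout.index("prompt_text"))  # 1 -> where the prompt actually ends up
print(new_layout[2])                    # save_as_key -> where my write at index 2 lands now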

phazei commented Apr 25, 2025