
Immediate evaluation of variables does not reset between prompts

Open tscott65 opened this issue 1 year ago • 7 comments

I have the following prompt (and corresponding wildcard files for f_sentient and f_animal):

${animal=!{__f_animal__}} ${sentient=!{__f_sentient__}} [${animal}|${sentient}|${animal}] AND ${sentient} in an ancient forest. Dramatic lighting. Dappled lighting. Deep shadows.

Every time I run with "batch count" > 1, each resulting image uses exactly the same wildcard values. The same is true if I set "batch size" > 1 and "batch count" = 1. The values repeat no matter what.

However, if I hit "Generate" again, the values are different.

Dynamic Prompts version 45b21373 | Sat Jun 3 10:48:47 2023

version: v1.3.2  •  python: 3.10.7  •  torch: 2.0.1+cu118  •  xformers: N/A  •  gradio: 3.32.0  •  checkpoint: 9140772852

tscott65 avatar Jun 08 '23 23:06 tscott65

@tscott65 Instead of using batch count and batch size, under dynamic prompts, enable "Combinatorial generation" and leverage "Max generations" and "Combinatorial batches".

You may still have issues, though. A few of us are having problems with immediate evaluation of variables like you are: https://github.com/adieyal/sd-dynamic-prompts/issues/421

layluke avatar Jun 13 '23 17:06 layluke

I encountered the same problem after updating today. Also, the "batch count" automatically becomes 64 (16 after rebooting), regardless of the number I set. Reinstalling doesn't help.

amurorei01 avatar Aug 13 '23 04:08 amurorei01

@akx what are your thoughts on this? It isn't a bug per se in that the sampler is designed to fix the value of the variable once it has been evaluated but the expectation of randomised values within a batch is reasonable. This problem would be resolved if we created a new sampler for every batch but then other session-based state like cyclical samplers wouldn't work. Perhaps we should reset variables each time we sample?
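A hypothetical sketch of that last option (not the dynamicprompts API — the class, names, and wildcard values here are made up for illustration): keep one sampler alive for the whole session, but clear only the variable bindings before each sample, so session state like a cycler's position survives while variables re-roll per prompt.

```python
import random
from itertools import cycle


class Session:
    """Toy stand-in for a session-scoped sampler (hypothetical, not the real API)."""

    def __init__(self):
        self.variables = {}               # per-prompt variable bindings
        self.cycler = cycle(["x", "y"])   # session-wide state we want to preserve

    def sample(self):
        # Reset only the variables each time we sample...
        self.variables.clear()
        self.variables["animal"] = random.choice(["cat", "fox"])
        # ...while the cycler keeps its position across samples.
        return f'{self.variables["animal"]} {next(self.cycler)}'


random.seed(1)
session = Session()
prompts = [session.sample() for _ in range(6)]
```

With this split, the cycled token still alternates x, y, x, y across prompts, while the `animal` variable is free to re-roll on every sample.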

adieyal avatar Aug 13 '23 07:08 adieyal

Same bug for me.

AIrtistry avatar Aug 13 '23 14:08 AIrtistry

@adieyal Hm, don't we reset (or at least overwrite) them in the context for each generation? 🤔

akx avatar Aug 14 '23 15:08 akx

I'm not 100% sure. The variable assignment takes place in

https://github.com/adieyal/dynamicprompts/blob/main/src/dynamicprompts/samplers/base.py#L71-L75

    def _get_sequence(
        self,
        command: SequenceCommand,
        context: SamplingContext,
    ) -> StringGen:
        tokens, context = context.process_variable_assignments(command.tokens)
        sub_generators = [context.generator_from_command(c) for c in tokens]


        while True:
            yield rotate_and_join(sub_generators, separator=command.separator)

the variables are processed outside of the loop on line 71 but the generators are sampled inside of the loop and yielded to the caller. From my reading, I think that variables are only ever processed once.
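To illustrate the pattern outside the library, here is a minimal standalone sketch (plain `random`, no dynamicprompts code) of why evaluating a random value before the `while True` loop freezes it for every yield, whereas evaluating it inside the loop re-rolls it per prompt:

```python
import random


def frozen_gen(choices):
    # Mirrors the current _get_sequence: the random value is fixed
    # once, before the loop, so every yielded prompt reuses it.
    value = random.choice(choices)
    while True:
        yield f"{value} {value} {value}"


def fresh_gen(choices):
    # Re-evaluating inside the loop draws a new value for each prompt.
    while True:
        value = random.choice(choices)
        yield f"{value} {value} {value}"


random.seed(0)
frozen = frozen_gen(["red", "blue"])
frozen_seen = {next(frozen) for _ in range(40)}   # always 1 distinct prompt

fresh = fresh_gen(["red", "blue"])
fresh_seen = {next(fresh) for _ in range(40)}     # both colours appear
```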

Here is a unit test that fails

def test_random_variables_across_prompts(wildcard_manager: WildcardManager):
    cmd = parse("${ball=!{red|blue}} ${ball} ${ball} ${ball}")
    scon = SamplingContext(
        default_sampling_method=SamplingMethod.RANDOM,
        wildcard_manager=wildcard_manager
    )
    gen = scon.sample_prompts(cmd)
    seen = set(islice(gen, 40))
    assert len(seen) == 2

This change lets the test pass

    def _get_sequence(
        self,
        command: SequenceCommand,
        context: SamplingContext,
    ) -> StringGen:

        while True:
            tokens, context = context.process_variable_assignments(command.tokens)
            sub_generators = [context.generator_from_command(c) for c in tokens]
            yield rotate_and_join(sub_generators, separator=command.separator)

but breaks a lot of other stuff
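A standalone sketch (not the library's actual cyclical sampler) of what it breaks: rebuilding the sub-generators on every iteration discards session state, such as a cycler's position, so anything that is supposed to advance across prompts restarts from the beginning instead.

```python
from itertools import cycle


def cyclical_outside(options):
    # Session state (the cycle position) is built once, outside the
    # loop, so it advances across yields — like the current behaviour.
    it = cycle(options)
    while True:
        yield next(it)


def cyclical_inside(options):
    # Rebuilding the iterator every iteration (as the proposed change
    # does for sub_generators) loses the position: every yield
    # restarts at the first option.
    while True:
        it = cycle(options)
        yield next(it)


outside = cyclical_outside(["a", "b", "c"])
outside_values = [next(outside) for _ in range(4)]  # ['a', 'b', 'c', 'a']

inside = cyclical_inside(["a", "b", "c"])
inside_values = [next(inside) for _ in range(4)]    # ['a', 'a', 'a', 'a']
```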

adieyal avatar Aug 14 '23 21:08 adieyal

Not sure if I understand this topic correctly, but I'm wondering whether #668 is related?

anothertal3 avatar Nov 10 '23 10:11 anothertal3