
Difference between n_samples and n_iter?

Open Neosettler opened this issue 3 years ago • 9 comments

Greetings, would anyone care to explain the difference between n_samples and n_iter? They both seem to control how many output images are produced; it's a bit confusing.

Thank you,

Neosettler avatar Sep 07 '22 15:09 Neosettler

  --n_samples N_SAMPLES
                        how many samples to produce for each given prompt. A.k.a batch size

This sets how many images are generated at once, in a single batch.

--n_iter N_ITER sample this often

the number of times to run the sampling loop; each run produces another batch of images.

RiccardoRiglietti avatar Sep 07 '22 15:09 RiccardoRiglietti
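To make the relationship concrete, here is a minimal sketch (plain Python, with the actual diffusion sampling replaced by a stub; names other than n_samples and n_iter are made up for illustration) of the nested loop txt2img.py runs: the outer loop executes n_iter times, and each pass produces one batch of n_samples images.

```python
def generate(prompt, n_samples=3, n_iter=2):
    """Sketch of the sampling loop; the model call is stubbed out."""
    images = []
    for i in range(n_iter):  # outer loop: one pass per iteration
        # each pass produces one batch of n_samples images
        batch = [f"image(prompt={prompt!r}, iter={i}, idx={j})"
                 for j in range(n_samples)]
        images.extend(batch)
    return images

# Total output is n_iter * n_samples images:
out = generate("a cat", n_samples=3, n_iter=2)
print(len(out))  # 6
```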

Thank you Riccardo. So n_iter generates an image for each iteration, is that normal?

Neosettler avatar Sep 07 '22 17:09 Neosettler

Each of the n_iter iterations generates n_samples images.

BeaverInGreenland avatar Sep 08 '22 10:09 BeaverInGreenland

Thank you for your input Beaver,

What happens in the background for n_iter? Does it re-run the exact same parameters, same seed, hoping for a variation? What differs between iterations?

The same question applies to n_samples, if that's not too much to ask.

Note: maybe someone could point out where to look in the code base?

Thank you,

Neosettler avatar Sep 08 '22 15:09 Neosettler

--n_samples, a.k.a. batch size, is literally a batch size: increasing it fits more and more data onto your GPU for a single run. Increasing n_iter, on the other hand, just increases the number of batches (each of size --n_samples) that are processed sequentially. The seed is different for each generated image by default; you can use a fixed seed by passing the --fixed_code parameter to the txt2img.py script. With --fixed_code, the seed is fixed across iterations (whose count is controlled by n_iter), but images within a batch will still have different seeds, so if you want to generate n images using a single seed, set --n_samples to 1 and n_iter to n.

KostyaAtarik avatar Sep 09 '22 08:09 KostyaAtarik
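The --fixed_code behavior can be sketched like this (plain Python with the torch noise tensor replaced by a list of floats; sample_noise and run are made-up names for illustration): with fixed_code, the starting noise is drawn once before the loop and reused every iteration; without it, fresh noise is drawn per iteration.

```python
import random

def sample_noise(rng, n_samples):
    # one noise value per image in the batch; different rows => different images
    return [rng.random() for _ in range(n_samples)]

def run(n_samples, n_iter, fixed_code=False, seed=42):
    rng = random.Random(seed)
    # with fixed_code, draw the starting noise once, before the loop
    start_code = sample_noise(rng, n_samples) if fixed_code else None
    batches = []
    for _ in range(n_iter):
        noise = start_code if fixed_code else sample_noise(rng, n_samples)
        batches.append(noise)
    return batches

fixed = run(n_samples=2, n_iter=3, fixed_code=True)
fresh = run(n_samples=2, n_iter=3, fixed_code=False)
print(fixed[0] == fixed[1] == fixed[2])  # True: same start noise every iteration
print(fresh[0] == fresh[1])              # False: new noise each iteration
```

Note that even in the fixed case the two images inside one batch start from different noise values, which matches the observation that images within a batch still differ.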

Thank you for your explanation Kostya.

Using --fixed_code, if the seed is the same for each image, what other parameters are being randomized, if any?

Neosettler avatar Sep 11 '22 00:09 Neosettler

--n_samples, a.k.a. batch size, is literally a batch size: increasing it fits more and more data onto your GPU for a single run. Increasing n_iter, on the other hand, just increases the number of batches (each of size --n_samples) that are processed sequentially.

Does this mean that a higher --n_samples can potentially produce better results at the expense of GPU memory?

ironharvy avatar Feb 10 '23 20:02 ironharvy

No, it can't. I think you're confusing the training phase (where changing the batch size actually can drastically affect the resulting model's quality) with the inference phase (where changing the batch size only affects speed and memory consumption; the generation result is determined by the fixed weights of the trained model, plus some randomness when a stochastic sampler is used).

KostyaAtarik avatar Feb 13 '23 12:02 KostyaAtarik

--n_samples, a.k.a. batch size, is literally a batch size: increasing it fits more and more data onto your GPU for a single run. Increasing n_iter, on the other hand, just increases the number of batches (each of size --n_samples) that are processed sequentially. The seed is different for each generated image by default; you can use a fixed seed by passing the --fixed_code parameter to the txt2img.py script. With --fixed_code, the seed is fixed across iterations (whose count is controlled by n_iter), but images within a batch will still have different seeds, so if you want to generate n images using a single seed, set --n_samples to 1 and n_iter to n.

How can we know the seed for each image in a batch? I notice it is not a simple +1, +2.

LemonCoding avatar Mar 04 '23 17:03 LemonCoding
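That matches how the script works: there is a single --seed that seeds the RNG once, and the noise for a whole batch is drawn as one tensor with a batch dimension, so individual images do not have their own seeds at all. A toy sketch with Python's random module (not the actual torch RNG) showing why the second image in a batch cannot be reproduced by seeding with seed + 1:

```python
import random

seed = 123
rng = random.Random(seed)
# one sequential draw covering the whole batch of 2 images (4 values each)
batch_noise = [[rng.gauss(0, 1) for _ in range(4)] for _ in range(2)]

# Image 1's noise is NOT what seed + 1 would produce on its own:
alt_rng = random.Random(seed + 1)
alt = [alt_rng.gauss(0, 1) for _ in range(4)]
print(batch_noise[1] == alt)  # False

# It IS the continuation of the same RNG stream from the single seed:
rng2 = random.Random(seed)
_image0 = [rng2.gauss(0, 1) for _ in range(4)]
image1 = [rng2.gauss(0, 1) for _ in range(4)]
print(batch_noise[1] == image1)  # True
```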