stable-diffusion-webui
How to generate 5000 images?
In the UI there is only a "Batch count" slider with a range of 1-16.
In the lstein repo's command line interface I could pass an `-n 5000` argument and 5000 images would be generated one by one.
Go into ui-config.json, search for "txt2img/Batch count/maximum": and set it to your desired value.
A restart of the UI is needed, of course.
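For example, the edited entry might look like this (ui-config.json is a flat JSON object with many entries; only the relevant key is shown here, and 5000 is just the figure from the question):

```json
{
  "txt2img/Batch count/maximum": 5000
}
```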
This will hammer your GPU hard, so be careful with temps and make sure you have sufficient cooling and airflow.
I wonder if there is a way to implement some sort of pause after every x image generations so the GPU can have some downtime between runs. It's clear people are setting their PC to generate images and going to sleep or doing other things, so it might be useful for the long-term health of the GPU.
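A minimal sketch of that kind of pause, assuming a `generate_one()` callable standing in for whatever actually renders one image (the function name and the rest interval are made up for illustration, not anything in the webui codebase):

```python
import time

def generate_with_rest(total, generate_one, rest_every=100, rest_s=60,
                       sleep_fn=time.sleep):
    """Call generate_one() `total` times, sleeping rest_s seconds after
    every rest_every generations so the GPU gets some downtime."""
    for i in range(1, total + 1):
        generate_one()
        if i % rest_every == 0 and i < total:
            sleep_fn(rest_s)
```

The `sleep_fn` parameter just makes the loop testable without actually waiting.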
You can actually get 20-30% more performance by overclocking the GPU properly; for thousands of images (especially high-res) that's quite a difference. Regarding temperature: 3080 and 3090 cards have uncooled backplates with 50% of their memory chips mounted there. The firmware ramps the fans up to 100%, but that doesn't help the backplate, so the front stays cool (50-60 °C) while the back runs hot (100-110 °C). It's an NVIDIA design flaw, maybe intentional, given that every single third-party manufacturer also leaves the backplate uncooled.
I solved that by putting copper plates and a fan on the backplate, which keeps temperatures below 98 °C. I also underclock the GPU and overclock the GPU memory to get that 20-30% performance increase while lowering power consumption.
> This will hammer your GPU hard, so be careful with temps and make sure you have sufficient cooling and airflow.
> I wonder if there is a way to implement some sort of pause after every x image generations so the GPU can have some downtime between runs.
Seconded!
Food for thought: would it be possible for the generation loop to check the GPU(s) temperature and throttle back requests until the temperature has dropped off some? So, a flag or non-default value for the GPU temperature to fall back to, and a range of time to wait for cooldown. This would prevent the GPU from running in thermally triggered self-throttling mode.
Maybe also have the code determine how long it takes for the GPU to cool down and how quickly it heats back up (PID loop calibration?) to give a better estimate of time till completion?
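A rough sketch of the throttling idea, written around a `read_temp_c` callable so the logic is independent of how the temperature is actually read (all thresholds here are hypothetical placeholders, not tested values):

```python
import time

def wait_for_cooldown(read_temp_c, pause_threshold_c=85,
                      resume_threshold_c=70, poll_s=5, sleep_fn=time.sleep):
    """Block until the GPU has cooled off.

    If the reported temperature is at or above pause_threshold_c, sleep in
    poll_s-second steps until it drops below resume_threshold_c. The two
    thresholds give hysteresis so the loop doesn't flap around one value.
    Returns the number of seconds waited.
    """
    if read_temp_c() < pause_threshold_c:
        return 0
    waited = 0
    while read_temp_c() >= resume_threshold_c:
        sleep_fn(poll_s)
        waited += poll_s
    return waited
```

On an NVIDIA card, `read_temp_c` could be backed by `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader` or the pynvml bindings; the generation loop would call `wait_for_cooldown(...)` between batches.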
One thing to consider when generating that many images in a batch is that you can't see the images until the batch finishes or you interrupt it.
#1683
> This will hammer your GPU hard, so be careful with temps and make sure you have sufficient cooling and airflow.
> I wonder if there is a way to implement some sort of pause after every x image generations so the GPU can have some downtime between runs.
I wouldn't recommend this. The thermal cycle is what causes damage, not running hot per se. What you might want to do instead is power-limit the GPU; running at 50% power will probably only cause a 10-20% slowdown. Scheduled pauses just give you the worst of both worlds.
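For the power-limit approach, nvidia-smi can cap the board power directly (requires root/administrator; the 200 W figure is only an example, and the valid range depends on the card):

```shell
# Show the current and min/max enforceable power limits for GPU 0
nvidia-smi -i 0 -q -d POWER

# Cap GPU 0 to 200 W (must be within the limits reported above)
sudo nvidia-smi -i 0 -pl 200
```

The setting resets on reboot unless persistence mode is enabled.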
So along those lines, does it make sense (more sense?) to just underclock the GPU so that it never hits those thermal limits?
> I wouldn't recommend this. The thermal cycle is what causes damage, not running hot per se. What you might want to do instead is power-limit the GPU; running at 50% power will probably only cause a 10-20% slowdown. Scheduled pauses just give you the worst of both worlds.
Ideally you wouldn't need that many images with the same prompt anyway; I think the point of this software is to come up with new ideas and constantly innovate.
Admittedly, I never had to put that much pressure on any of my computers in the past, so my ideas are just that: ideas.
> One thing to consider when generating that many images in a batch is that you can't see the images until the batch finishes or you interrupt it.
> #1683
I usually do a batch of 500-1000 and view the images in the file system as they are being generated. You get to control where they save, so just navigate to the folder and voilà! You can also watch the images as they are being generated if you turn on the option in settings.
In modules/shared.py there is a --max-batch-count command line argument defined, but its value is never used anywhere in the code (at least as far as I can tell).
My interpretation of the intention of this option is that you should be able to add, say, --max-batch-count=64 to your command line options, and the UI would then raise the slider limit, allowing you to create batches up to that number.
It seems like all that needs to happen is for the value to be incorporated into the appropriate places in modules/ui.py where the sliders are created (search that file for label='Batch count' and you'll find two locations).
If all of the above is correct, you'd just need to add a maximum parameter to the slider creation that uses the value from the command line argument, something like this:

```python
gr.Slider(minimum=1, maximum=cmd_opts.max_batch_count, step=1, label='Batch count', value=1)
#                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ the added parameter
```

...however this has no effect :sweat_smile:
...though interestingly, if you change the label parameter to something that's not in the internationalisation lookup table (like label='Betch count'), it does work :man_shrugging:, so I don't know what's going on there.
Anyway, perhaps someone who knows what is going on here with the UI would be able to quickly fix this one up given the above information.
As stated in the top comment: just set your preferred maximum in ui-config.json and generate as much as you need.