
[Feature Request]: Controlling GPU/CPU Usage.

Open SeaN0X opened this issue 2 years ago • 7 comments

Is there an existing issue for this?

  • [X] I have searched the existing issues and checked the recent builds/commits

What would your feature do ?

Some computers take a considerable amount of time to train models or generate images, and the long-running processing makes it very hard to use the computer for other tasks. I think a slider controlling the usage percentage of the GPU/CPU, on the main page or perhaps on the settings page, would be very helpful to some users.

I understand that this would make the UI generate images or train models more slowly, but sometimes we need to use the computer for something else and cannot pause the job, so we would simply set the GPU/CPU usage to a lower percentage. It could also be useful for testing how your GPU/CPU performs at different usage levels, or for avoiding running at 100% all the time.

Proposed workflow

1. The user sets a percentage with the slider, or types it directly into a text box beside it (which sets the slider value automatically).
2. Processing on the GPU or CPU adjusts itself to the slider's value.

Additional information

If restarting the UI were required whenever the user changes the slider value, it wouldn't work as a real-time control, but it would still help people who need some GPU/CPU headroom while the UI is training a model or generating an image.

Example of the slider on the main menu: [screenshot]

Example of the slider on settings: [screenshot]

SeaN0X avatar Feb 08 '23 00:02 SeaN0X

If only your CPU is being used, you can set the number of cores to use; see https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/3387 for setting your CPU cores (there was a PR that added this as a flag). But I recommend onnx for txt2img generation instead; it is 2x faster.
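As a rough, standalone illustration of the core-limiting idea (this is not the webui's actual flag, just a sketch): on Linux, a process can restrict itself to a subset of its CPU cores using only the standard library, leaving the remaining cores free for other applications.

```python
import os

def limit_cpu_cores(n):
    """Restrict this process to at most n of its currently allowed CPU
    cores (Linux only; returns None on platforms without affinity)."""
    if not hasattr(os, "sched_setaffinity"):
        return None
    allowed = sorted(os.sched_getaffinity(0))  # cores we may currently use
    subset = set(allowed[:n])                  # keep only the first n
    os.sched_setaffinity(0, subset)            # pid 0 = the current process
    return subset
```

Calling `limit_cpu_cores(4)` early in startup would cap the process at four cores; everything spawned afterwards inherits the restriction.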

ClashSAN avatar Feb 08 '23 13:02 ClashSAN

And just to add: there is no way to set limits on the GPU. It's not up to the webui; it would have to be implemented at much lower levels, such as PyTorch.

vladmandic avatar Feb 12 '23 22:02 vladmandic

As a sub-idea that would be easy to implement: add a settings parameter "Sleep in seconds" between image generations and/or training. This doesn't control CPU/GPU usage, but it gives the GPU time to cool down. The price, of course, is generation time.

janvarev avatar Feb 14 '23 21:02 janvarev

As a sub-idea that would be easy to implement: add a settings parameter "Sleep in seconds" between image generations and/or training. This doesn't control CPU/GPU usage, but it gives the GPU time to cool down. The price, of course, is generation time.

If the GPU needs a cooldown to begin with, I'd reduce the clocks and/or vcore. Overclocking is good, but not to the point where it causes problems. I'd rather run my GPU at its stable limit 24h/day than have it burst just to need to slow down.

vladmandic avatar Feb 14 '23 21:02 vladmandic

shared.zip I've implemented this partially. Sorry there's no PR; I hope someone else releases it.

How to use:

  1. Unpack the attached ZIP to modules/shared.py, replacing it.
  2. By default, nothing changes!
  3. If you want to sleep between jobs in a batch, use a command-line argument like --sleep-time-between-jobs=20.0 (for 20 seconds). You can place it in webui-user.bat.

This was implemented by two changes in shared.py:

In the beginning of the file:

parser.add_argument("--sleep-time-between-jobs", type=float, help="Set sleep time between two jobs in batch", default=0.0)

In the middle:

def nextjob(self):
    if opts.live_previews_enable and opts.show_progress_every_n_steps == -1:
        self.do_set_current_image()

    import time
    sltime = cmd_opts.sleep_time_between_jobs
    if sltime > 0.0:
        print(f"Sleep for {sltime} seconds between jobs (to cool down)...")
        time.sleep(sltime)
        print("Awakening...")

    self.job_no += 1
    self.sampling_step = 0
    self.current_image_sampling_step = 0
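For reference, a standalone sanity check of how the new flag parses: argparse converts the dashes in the option name to underscores on the resulting attribute, which is why the code above reads cmd_opts.sleep_time_between_jobs.

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--sleep-time-between-jobs", type=float,
                    help="Set sleep time between two jobs in batch", default=0.0)

# Simulate passing the flag on the command line:
opts = parser.parse_args(["--sleep-time-between-jobs=20.0"])
print(opts.sleep_time_between_jobs)  # prints 20.0; dashes became underscores
```

With no flag given, the default of 0.0 keeps the sleep disabled, matching the "by default, nothing changes" behavior.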

janvarev avatar Feb 17 '23 11:02 janvarev

As a sub-idea that would be easy to implement: add a settings parameter "Sleep in seconds" between image generations and/or training. This doesn't control CPU/GPU usage, but it gives the GPU time to cool down. The price, of course, is generation time.

If the GPU needs a cooldown to begin with, I'd reduce the clocks and/or vcore. Overclocking is good, but not to the point where it causes problems. I'd rather run my GPU at its stable limit 24h/day than have it burst just to need to slow down.

Unfortunately, I don't have enough options to manage my notebook GPU; it still warms up as it wants. I want to keep the fans on low, so I just plan a small cooldown after each job in a batch (the GPU only warms up if it processes 10+ jobs consecutively).

janvarev avatar Feb 17 '23 11:02 janvarev

Oh, the other way: if you use DDIM and want to take a break between steps.

  1. Add to the start of shared.py: parser.add_argument("--sleep-time-ddim", type=float, help="Set sleep time in ddim", default=0.0)

  2. Add to repositories/stable-diffusion-stability-ai/ldm/models/diffusion/ddim.py near line 174:

            if index % log_every_t == 0 or index == total_steps - 1:
                intermediates['x_inter'].append(img)
                intermediates['pred_x0'].append(pred_x0)

            # add here
            import time
            from modules.shared import cmd_opts
            if cmd_opts.sleep_time_ddim > 0.0:
                time.sleep(cmd_opts.sleep_time_ddim)
  3. Now you can control the cooldown between steps with the parameter --sleep-time-ddim=2.0 (in seconds).
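Stripped of the webui internals, the per-step cooldown amounts to a throttled loop. A minimal sketch (function and parameter names here are illustrative, not from the webui):

```python
import time

def run_steps(steps, step_fn, sleep_between=0.0):
    """Run step_fn over steps, sleeping sleep_between seconds between
    consecutive steps to give the GPU/CPU time to cool down."""
    results = []
    for i, step in enumerate(steps):
        results.append(step_fn(step))
        # Sleep only between steps, not after the last one.
        if sleep_between > 0.0 and i < len(steps) - 1:
            time.sleep(sleep_between)
    return results
```

With sleep_between=0.0 the loop behaves exactly as before, mirroring the patch's opt-in default.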

janvarev avatar Feb 17 '23 12:02 janvarev

As a sub-idea that would be easy to implement: add a settings parameter "Sleep in seconds" between image generations and/or training. This doesn't control CPU/GPU usage, but it gives the GPU time to cool down. The price, of course, is generation time.

TBH, enabling a small delay between steps could help give other applications room to catch up, so it's not a bad idea. If you have a set-delay option and an enable-delay setting, it would be easy to add the boolean toggle as a quick option.

alfonslm avatar Mar 08 '24 10:03 alfonslm