
Support for TPU

Open Ljzd-PRO opened this issue 2 years ago • 16 comments

Can it support Google TPUs (like the ones on Google Colab)?

Ljzd-PRO avatar Oct 18 '22 04:10 Ljzd-PRO

What do you mean? TPUs favor tensorflow, everything here is pytorch.

ClashSAN avatar Oct 18 '22 04:10 ClashSAN

I'm curious as well. If you get a TPU on Colab, will it be slower than an RTX card of the same level?

giteeeeee avatar Oct 18 '22 09:10 giteeeeee

> I'm curious as well. If you get a TPU on Colab, will it be slower than an RTX card of the same level?

It's said that TPUs are faster for inference, but not for training.

Ljzd-PRO avatar Oct 18 '22 11:10 Ljzd-PRO

Just tested a local RTX 2060 6GB vs. a Colab T4 12GB.

The 2060 appears to be ~25% faster when doing text2image and image2image.

Training I can't test with a 2060 lol

giteeeeee avatar Oct 19 '22 09:10 giteeeeee

TPUs seem to be good at generating images in parallel. It would be very nice to have such compatibility.
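The parallelism mentioned above usually means running one sub-batch of prompts per TPU core (a TPU VM typically exposes 8 cores), pmap-style. A minimal sketch of just the batch-splitting logic, in plain Python; the function name and padding strategy are hypothetical, not anything from the webui codebase:

```python
def shard_prompts(prompts, num_devices=8):
    """Split a prompt list into one chunk per accelerator core.

    Pads the list (by repeating the last prompt) so it divides evenly,
    since each core must receive a sub-batch of the same size.
    """
    padded = list(prompts)
    while len(padded) % num_devices:
        padded.append(padded[-1])  # repeat last prompt to fill the batch
    per_device = len(padded) // num_devices
    return [padded[i * per_device:(i + 1) * per_device]
            for i in range(num_devices)]
```

Each chunk would then be dispatched to one core, so all cores generate their images simultaneously.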

Extraltodeus avatar Oct 21 '22 15:10 Extraltodeus

Is there any development on this?

iamianM avatar Nov 06 '22 19:11 iamianM

Diffusers officially supports TPUs, so I'm guessing it's not a complete overhaul to add it. However, since it's Flax-based, I'm not sure exactly how it would be done.

swcrazyfan avatar Nov 08 '22 02:11 swcrazyfan

There is the project https://github.com/magicknight/stable-diffusion-tpu; however, it seems a bit abandoned.

RarogCmex avatar Dec 06 '22 17:12 RarogCmex

I searched for some information; it seems launch.py and webui.py would need to be modified:

https://blog.richliu.com/2023/03/04/5109/stable-diffusion-webui-cpu-only-on-arm64-platform/
https://huggingface.co/docs/diffusers/using-diffusers/stable_diffusion_jax_how_to

upright2003 avatar Apr 16 '23 12:04 upright2003

Omg, webui on Android? (edit: nvm) Wonder if TPU inference would work on the Tensor chip on the Pixel 6..

ClashSAN avatar May 02 '23 06:05 ClashSAN

> TPUs favor tensorflow, everything here is pytorch.

That's a misunderstanding. The "T" in TPU stands for "Tensor", not "TensorFlow". Both PyTorch and TensorFlow can use TPUs under the hood. Look at https://colab.research.google.com/github/pytorch/xla/blob/master/contrib/colab/getting-started.ipynb

Besides the original SD, there is also the Diffusers edition, which can run on TPU: https://huggingface.co/blog/stable_diffusion_jax
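For reference, `torch_xla.core.xla_model.xla_device()` is the PyTorch/XLA entry point shown in the linked notebook. A hedged sketch of how device selection could prefer a TPU when the library is present; the fallback chain here is an illustrative assumption, not how the webui actually picks devices:

```python
def pick_device() -> str:
    """Prefer a TPU (via torch_xla), then CUDA, then CPU.

    xm.xla_device() is the canonical PyTorch/XLA call; everything
    else in this function is just an illustrative fallback chain.
    """
    try:
        import torch_xla.core.xla_model as xm
        return str(xm.xla_device())  # e.g. "xla:0" on a TPU VM
    except ImportError:
        pass
    try:
        import torch
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        pass
    return "cpu"
```

On a machine without torch_xla this simply falls through to CUDA or CPU, so the sketch runs anywhere.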

vsemecky avatar May 26 '23 16:05 vsemecky

I can get it to run on TPU VM but it's very slow.

aeroxy avatar Jun 01 '23 17:06 aeroxy

> Can it support Google TPU (like Google Colab)?

I looked into the source code, and it looks like it would take a massive effort to support TPUs. First we would need custom versions of torch, torch_xla, and torchvision, and then we would need to modify Stable Diffusion itself wherever it calls torch APIs. TPUs currently do not support all the APIs used in Stable Diffusion, meaning we would need to debug each individual API.
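A first step in that audit could be mechanically listing every distinct `torch.*` call the codebase makes, then checking each against XLA's supported-op coverage. A rough sketch; this helper is hypothetical and a simple regex will miss aliased imports, but it gives a ballpark of the audit surface:

```python
import re

def torch_api_surface(source: str) -> set[str]:
    """Collect distinct torch.* attribute chains referenced in source code.

    A crude estimate of how many torch calls would need checking for
    XLA support; not part of the webui codebase.
    """
    return set(re.findall(r"\btorch(?:\.\w+)+", source))
```

Running this over the repository's .py files would yield the list of calls to verify one by one against torch_xla.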

aeroxy avatar Jun 13 '23 14:06 aeroxy

How would we even handle memory? The Coral TPUs don't have any to begin with. However, it would be really cool if there were support.

NXTler avatar Oct 02 '23 13:10 NXTler

> I can get it to run on TPU VM but it's very slow.

Can you share the code? How did you get this running?

Was it slow because of the low performance of the TPU, or because the TPU wasn't used and the script ran on the CPU?

> I looked into the source code, and it looks like it would take a massive effort to support TPUs. First we need custom versions of torch, torch_xla, and torchvision, and then we need to modify Stable Diffusion itself when calling torch APIs. TPUs currently do not support all the APIs used in Stable Diffusion, meaning we need to debug each single API.

TPUs are currently the only way to give usable access to tools like automatic1111 to users who are unable to upgrade to a GPU. This applies, for example, to all laptops that do not have a dedicated GPU. TPU support would significantly increase the userbase.

Isn't it possible to use something like this with automatic1111: https://huggingface.co/blog/stable_diffusion_jax ?

B0rner avatar Oct 24 '23 09:10 B0rner

It's definitely possible. Here's an example of someone getting SDXL running on Google's TPU v5e: https://huggingface.co/blog/sdxl_jax

ehamawy avatar Apr 29 '24 05:04 ehamawy