stable-diffusion-webui
Support for TPU
Can it support Google TPU(like Google Colab)
What do you mean? TPUs favor tensorflow, everything here is pytorch.
I'm curious as well. If you get a TPU on Colab, will it be slower than an RTX card of the same tier?
It's said that TPUs are faster for inference, but not for training.
Just tested on a local RTX 2060 6 GB vs. a Colab T4 12 GB.
The 2060 appears to be ~25% faster for both text2image and image2image.
Training I can't test with a 2060 lol
TPUs seem to be good at generating images in parallel. It would be very nice to have such compatibility.
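The "parallel" strength comes from a TPU exposing multiple cores (8 on a v2/v3) that can each run the same model step on a different batch element. In JAX this pattern is `jax.pmap`. A toy sketch (the `denoise_step` function is a hypothetical stand-in for one real U-Net denoising step, not actual pipeline code; on a machine without a TPU, JAX just reports one device):

```python
import jax
import jax.numpy as jnp

# Hypothetical stand-in for one denoising step of a diffusion model;
# a real pipeline would run the U-Net here.
def denoise_step(latents):
    return latents * 0.5

# pmap replicates the function across all local devices (8 cores on a
# TPU v2/v3; a single device on a plain CPU), so each core processes
# its own slice of the batch in parallel.
parallel_denoise = jax.pmap(denoise_step)

n_devices = jax.local_device_count()
# One latent "image" per device: leading axis = device axis.
latents = jnp.ones((n_devices, 64, 64))
out = parallel_denoise(latents)
print(out.shape)  # (n_devices, 64, 64)
```

On a TPU this is how 8 images can come out per step for roughly the cost of one, which is why TPUs look good for batched generation even when single-image latency is unremarkable.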
Is there any development on this?
Diffusers officially supports TPU, so I'm guessing adding it wouldn't be a complete overhaul. However, since it uses Flax, I'm not sure exactly how it would be done.
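For reference, this is roughly what the Flax path in Diffusers looks like on a TPU, following the pattern from the Diffusers documentation: load a Flax pipeline, replicate the weights to every core, shard the prompts, and let the jitted call run all cores in parallel. A sketch only (it needs a TPU runtime and downloads the model weights; the checkpoint name and step count are illustrative):

```python
import jax
import jax.numpy as jnp
from flax.jax_utils import replicate
from flax.training.common_utils import shard
from diffusers import FlaxStableDiffusionPipeline

# Load the Flax/JAX variant of the pipeline in bfloat16, the native
# TPU dtype. Checkpoint name is illustrative.
pipeline, params = FlaxStableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    revision="bf16",
    dtype=jnp.bfloat16,
)

prompt = "a photo of an astronaut riding a horse"
num_devices = jax.device_count()  # 8 on a TPU v2/v3

# One copy of the prompt per core, tokenized and sharded so each core
# generates its own image; weights and RNG keys are replicated/split
# the same way.
prompt_ids = shard(pipeline.prepare_inputs([prompt] * num_devices))
params = replicate(params)
rng = jax.random.split(jax.random.PRNGKey(0), num_devices)

# jit=True pmaps the call internally: 8 images per invocation on a TPU.
images = pipeline(prompt_ids, params, rng,
                  num_inference_steps=50, jit=True).images
```

The catch for this repo is exactly what the comment says: the webui's pipelines, samplers, and extensions are all written against the PyTorch code path, so supporting this would mean maintaining a parallel Flax backend rather than flipping a device flag.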