
Support for TPU?

Open Portareumbra opened this issue 2 years ago • 4 comments

There are certain online notebooks that provide TPUs, which would make this even faster. The problem is that a TPU needs explicit code to run.

Portareumbra avatar Nov 05 '22 21:11 Portareumbra

we need more feedback on this matter to adapt the notebook to TPUs

TheLastBen avatar Nov 06 '22 04:11 TheLastBen

TPUs are now available on Kaggle, for free. TPUs are hardware accelerators specialized in deep learning tasks. They are supported in TensorFlow 2.1, both through the Keras high-level API and, at a lower level, in models using a custom training loop.

You can use up to 30 hours per week of TPUs and up to 9h at a time in a single session.
Once you have flipped the "Accelerator" switch in your notebook to "TPU v3-8", this is how to enable TPU training in TensorFlow Keras:

import tensorflow as tf

# detect and initialize the TPU
tpu = tf.distribute.cluster_resolver.TPUClusterResolver.connect()

# instantiate a distribution strategy
tpu_strategy = tf.distribute.experimental.TPUStrategy(tpu)

# instantiating the model in the strategy scope creates the model on the TPU
with tpu_strategy.scope():
    model = tf.keras.Sequential( … )  # define your model normally
    model.compile( … )

# train the model normally
model.fit(training_dataset, epochs=EPOCHS, steps_per_epoch=…)
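As a hedged sketch (this exact fallback is not in the snippet above), the resolver call can be wrapped in a try/except so the same notebook still runs when no TPU is attached, falling back to the default CPU/GPU strategy:

```python
import tensorflow as tf

# Try to attach to a TPU; if the runtime has no TPU, TPUClusterResolver
# raises ValueError and we fall back to the default strategy, so the
# notebook runs everywhere.
try:
    tpu = tf.distribute.cluster_resolver.TPUClusterResolver.connect()
    strategy = tf.distribute.experimental.TPUStrategy(tpu)
    print("Running on TPU")
except ValueError:
    strategy = tf.distribute.get_strategy()
    print("No TPU found; using default strategy")

print("Replicas:", strategy.num_replicas_in_sync)
```

The rest of the code stays the same: build and compile the model inside `strategy.scope()`, whichever strategy was selected.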

https://www.kaggle.com/docs/tpu

If we can transition the notebook to Kaggle, or at least add support for TPUs, we can make the DreamBooth process even faster.

Portareumbra avatar Nov 06 '22 06:11 Portareumbra

The problem with Kaggle is that it doesn't offer a form interface like Google Colab does.
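For context, a minimal sketch of what that Colab form interface looks like (the variable names here are made up, not from the actual notebook): a `#@param` comment after an assignment renders a GUI field in Colab, whereas on Kaggle the same line is just a plain assignment.

```python
# In Colab, the `#@param` annotation turns each assignment into a form
# widget (text box, slider, checkbox); elsewhere it is an ordinary comment.
Session_Name = "my-session"  #@param {type:"string"}
Training_Steps = 1500  #@param {type:"slider", min:500, max:10000, step:100}
Resume_Training = False  #@param {type:"boolean"}

print(Session_Name, Training_Steps, Resume_Training)
```

Because the annotation is only a comment, the code itself is portable; what is lost on Kaggle is the point-and-click UI, not the functionality.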

TheLastBen avatar Nov 06 '22 06:11 TheLastBen

Is there a way to repurpose your code for use with TPUs, my friend?

Portareumbra avatar Nov 06 '22 19:11 Portareumbra

[screenshot] I have access to a TPU in Colab too, can that be an option?

riderx avatar Dec 21 '22 15:12 riderx