Support for TPU?
There are certain online notebook services that provide TPUs, which would make this even faster. The problem is that running on a TPU requires explicit code.
We need more feedback on this matter to adapt the notebook to TPUs.
TPUs are now available on Kaggle, for free. TPUs are hardware accelerators specialized in deep learning tasks. They are supported in TensorFlow 2.1 both through the Keras high-level API and, at a lower level, in models using a custom training loop.
You can use up to 30 hours per week of TPUs and up to 9h at a time in a single session.
Once you have flipped the "Accelerator" switch in your notebook to "TPU v3-8", this is how to enable TPU training in TensorFlow Keras:
import tensorflow as tf

# detect and init the TPU
tpu = tf.distribute.cluster_resolver.TPUClusterResolver.connect()

# instantiate a distribution strategy
tpu_strategy = tf.distribute.experimental.TPUStrategy(tpu)

# instantiating the model in the strategy scope creates the model on the TPU
with tpu_strategy.scope():
    model = tf.keras.Sequential( … )  # define your model normally
    model.compile( … )

# train the model normally
model.fit(training_dataset, epochs=EPOCHS, steps_per_epoch=…)
https://www.kaggle.com/docs/tpu
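One usage note on the snippet above (my addition, not from the Kaggle docs): a TPU v3-8 runs 8 replicas, so the global batch size is usually the per-replica batch size times tpu_strategy.num_replicas_in_sync, and TPUs need statically shaped batches. A minimal sketch with placeholder data:

import tensorflow as tf

# detect the TPU and build the strategy, as in the snippet above
tpu = tf.distribute.cluster_resolver.TPUClusterResolver.connect()
tpu_strategy = tf.distribute.experimental.TPUStrategy(tpu)

# assumption: a per-replica batch of 16; scale by the replica count
# (8 cores on a TPU v3-8) to get the global batch size
PER_REPLICA_BATCH = 16
GLOBAL_BATCH = PER_REPLICA_BATCH * tpu_strategy.num_replicas_in_sync

# dummy tensors just to keep the sketch self-contained
features = tf.random.normal((1024, 32))
labels = tf.random.uniform((1024,), maxval=10, dtype=tf.int32)

# TPUs require static shapes, so drop the ragged final batch
training_dataset = (
    tf.data.Dataset.from_tensor_slices((features, labels))
    .shuffle(1024)
    .batch(GLOBAL_BATCH, drop_remainder=True)
    .prefetch(tf.data.experimental.AUTOTUNE)
)

The resulting training_dataset can be passed straight to model.fit as in the snippet above.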
If we can transition the notebook to Kaggle, or at least add support for TPUs, we can make the DreamBooth process even faster.
The problem with Kaggle is that it doesn't offer a form interface the way Google Colab does.
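One possible workaround, not something this repo ships: Colab forms are just #@param comments that the Colab frontend renders, and Kaggle notebooks do support ipywidgets, so the form fields could be approximated with widgets. A rough sketch (the field names and defaults are hypothetical):

# sketch of a Colab-form substitute on Kaggle using ipywidgets;
# the parameter names and defaults below are made up for illustration
import ipywidgets as widgets
from IPython.display import display

model_name = widgets.Text(
    value="runwayml/stable-diffusion-v1-5",  # hypothetical default
    description="Model:",
)
training_steps = widgets.IntSlider(
    value=1500, min=100, max=5000, step=100, description="Steps:",
)
display(model_name, training_steps)

# a later cell would read the chosen values:
# run_training(model_name.value, training_steps.value)  # hypothetical helper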
Is there a way to repurpose your code for use with TPUs, my friend?
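One possible route, assuming the DreamBooth training loop here is standard PyTorch: PyTorch/XLA exposes a TPU as an xla device, so the core changes would be moving the model and batches to that device and stepping the optimizer through xm.optimizer_step. A minimal sketch with a placeholder model and data, not the repo's actual pipeline:

# minimal PyTorch/XLA sketch; the model and data are stand-ins, not the
# actual DreamBooth training code from this repo
import torch
import torch.nn as nn
import torch_xla.core.xla_model as xm

device = xm.xla_device()  # resolves to the TPU when one is attached

model = nn.Linear(32, 10).to(device)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(10):  # placeholder training loop
    x = torch.randn(16, 32, device=device)
    y = torch.randint(0, 10, (16,), device=device)
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    # barrier=True forces XLA to compile and execute the step graph
    # when running on a single device
    xm.optimizer_step(optimizer, barrier=True)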
