lightning-flash
Revisit TPU training
🚀 Feature
Just curious: have you used a TPU on Kaggle recently? https://www.kaggle.com/code/jirkaborovec/demo-flash-image-classification-on-tpu
Motivation
Kaggle offers 20 hours of TPU training, which is mostly used with TF or Keras since any other library is rather difficult to set up, so there is huge potential for Flash to take this space... Also, with Flash it should be trivial to start your work on CPU for exploration and then alternate between the GPU and TPU offerings to finish with the best-trained model :rabbit:
Pitch
Alternatives
Reason: https://pytorch-lightning.readthedocs.io/en/latest/accelerators/tpu_faq.html#unsupported-datatype-transfer-to-tpus
Unsupported data type transfer to TPUs?
File "/usr/local/lib/python3.8/dist-packages/torch_xla/utils/utils.py", line 205, in _for_each_instance_rewrite
v = _for_each_instance_rewrite(result.__dict__[k], select_fn, fn, rwmap)
File "/usr/local/lib/python3.8/dist-packages/torch_xla/utils/utils.py", line 206, in _for_each_instance_rewrite
result.__dict__[k] = v
TypeError: 'mappingproxy' object does not support item assignment
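The traceback shows why this fails: the torch_xla helper walks an object's `__dict__` and writes rewritten values back with `result.__dict__[k] = v`. That works for instances, whose `__dict__` is a plain dict, but when the traversal reaches a class object, its `__dict__` is a read-only `mappingproxy` and the assignment raises. A minimal reproduction of that behavior, independent of torch_xla (the `Cfg` class here is just an illustrative stand-in):

```python
import types

class Cfg:
    lr = 0.1

# An instance's __dict__ is a plain dict, so item assignment works:
obj = Cfg()
obj.__dict__["lr"] = 0.2

# A class's __dict__ is a read-only mappingproxy:
assert isinstance(Cfg.__dict__, types.MappingProxyType)
try:
    Cfg.__dict__["lr"] = 0.3  # same write pattern as torch_xla's rewrite
except TypeError as e:
    print(e)  # 'mappingproxy' object does not support item assignment
```

The usual workaround on the class side is `setattr(Cfg, "lr", 0.3)`, which goes through the type machinery instead of mutating the proxy; the upstream fix would be for the rewrite helper to skip (or special-case) objects whose `__dict__` is not writable.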
Additional context
I have the same issue on TPU! Env:
- Python 3.7.10
- PyTorch 1.9.1