Kaushik B
Hi @jhoareau! This is likely a libtpu version mismatch. Could you try the steps below?
```bash
sudo rm -rf /usr/local/lib/python3.8/dist-packages/libtpu*
sudo pip3 install torch_xla[tpuvm]
```
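After reinstalling, a quick way to confirm the TPU is visible again (a minimal sketch, assuming `torch_xla` imports cleanly on the TPU VM):
```python
import torch
import torch_xla.core.xla_model as xm

# Grab the default XLA (TPU) device and run a trivial op on it.
device = xm.xla_device()
print(device)  # e.g. xla:0
print(torch.ones(2, 2, device=device).sum())
```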
@Borda Seems like I can't resolve the conflicts, as `https://github.com/shenmishajing/pytorch-lightning.git` doesn't exist. Do you know of a way around this?
@jadielam Hey Jadiel! Loved your work on Videoflow, would love to contribute if you need any help regarding this issue or anything else!
@Borda Why not add both the loggers?
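For reference, a minimal sketch of what attaching both loggers could look like (assuming a standard PyTorch Lightning setup; the logger choices and directories here are just examples):
```python
from pytorch_lightning import Trainer
from pytorch_lightning.loggers import CSVLogger, TensorBoardLogger

# Trainer accepts a list of loggers, so both record the same run.
trainer = Trainer(
    logger=[
        TensorBoardLogger(save_dir="logs/", name="my_experiment"),
        CSVLogger(save_dir="logs/", name="my_experiment"),
    ]
)
```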
Thanks @carmocca! Hi @pjspol! Is this issue also occurring with a single TPU core?
Waiting for #14821 to get in
@dynamix @martindurant Any updates on the PR? The inconsistency is causing issues for us while working on remote filesystems. [Ref](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/loggers/tensorboard.py#L272)
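For context, the logger resolves a filesystem object from the save path via fsspec, roughly along these lines (a simplified sketch, not the exact code at the linked line; the bucket path is hypothetical):
```python
import fsspec

log_dir = "s3://my-bucket/lightning_logs/version_0"  # hypothetical remote path

# Resolve a filesystem from the URL scheme and create the log directory.
# Inconsistent behaviour here across fsspec backends is what trips up the logger.
fs, path = fsspec.core.url_to_fs(log_dir)
fs.makedirs(path, exist_ok=True)
print(fs.isdir(path))
```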
Also, there's the not-so-pretty hack of using `setattr` to define Works.
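For illustration, the kind of `setattr` pattern meant here (a rough sketch, assuming the `lightning.app` import path for `LightningFlow`/`LightningWork`; `MyWork` is a hypothetical work class):
```python
from lightning.app import LightningFlow, LightningWork


class MyWork(LightningWork):
    def run(self):
        print("running work")


class RootFlow(LightningFlow):
    def run(self):
        # Dynamically attach Works by name instead of declaring them up front.
        for i in range(3):
            name = f"work_{i}"
            if not hasattr(self, name):
                setattr(self, name, MyWork())
            getattr(self, name).run()
```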
@yurijmikhalevich Should I assign this issue to you?
The error is thrown when the path to the record files is incorrect. I would recommend using the full path to the train and test record files in the...
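As an illustration, one way to make sure full paths are used (the file names here are just examples):
```python
import os

# Build absolute paths so the record files are found regardless of the working directory.
train_record = os.path.abspath("data/train.record")
test_record = os.path.abspath("data/test.record")
print(train_record, test_record)
```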