train stops
I just used the following code to train:
ai.train(
    file,
    line_by_line=False,
    from_cache=False,
    num_steps=3000,
    generate_every=1000,
    save_every=1000,
    save_gdrive=False,
    learning_rate=1e-3,
    fp16=False,
    batch_size=1,
)
but the progress stays at zero after the following log messages:
07/13/2021 08:11:56 — INFO — pytorch_lightning.utilities.distributed — GPU available: False, used: False
07/13/2021 08:11:56 — INFO — pytorch_lightning.utilities.distributed — TPU available: False, using: 0 TPU cores
Is this in the Colab Notebook? It looks like it isn't using the GPU, which would not play well with training one of the larger models.
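A quick way to confirm what the log is reporting is to ask PyTorch directly whether it can see a CUDA-capable GPU (this is a minimal check, not part of the aitextgen API):

```python
import torch

# True only when a CUDA-capable GPU and its drivers are visible to PyTorch;
# on a machine with no NVIDIA GPU (or an AMD GPU without ROCm) this is False,
# which matches the "GPU available: False" line in the training log.
print("GPU available:", torch.cuda.is_available())
```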
I don't have a good GPU, so I thought of using the CPU. This is a laptop, not Colab. Colab seems to become inactive after some use, which has been bothering me.
OK, so it hasn't stopped; it's just extremely slow, since training uses only one CPU core. Is there a way to use multiple cores? About 86% of the CPU is idle most of the time, and I want to make use of it.
Also, I have an AMD GPU and I want to make use of it if possible. Thank you.
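Since aitextgen trains with PyTorch under the hood, one knob worth checking is PyTorch's intra-op thread pool, which controls how many cores a single CPU training step may use. The sketch below shows how to inspect and raise it with `torch.set_num_threads` before calling `ai.train(...)`; this is a general PyTorch setting, not an aitextgen parameter, and whether it actually speeds up training depends on how well the model's ops parallelize:

```python
import os
import torch

# Logical cores the machine exposes
n_cores = os.cpu_count()
print("cores available:", n_cores)
print("torch threads before:", torch.get_num_threads())

# Ask PyTorch to use all available cores for CPU ops.
# Call this before starting training.
torch.set_num_threads(n_cores)
print("torch threads after:", torch.get_num_threads())
```

Note that PyTorch often defaults to the physical core count already, so if the CPU still sits mostly idle, the bottleneck is more likely the model's sequential ops or data loading than the thread setting.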
Use Kaggle:
https://www.kaggle.com/code/lostgoldplayer/aitextgen-training-2