LazyCat
If I want to get more epochs/checkpoints saved, would it be better to increase the batch size or the number of epochs? I'm guessing if I have the VRAM I could increase...
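For what it's worth, the relationship between batch size, epochs, and saved checkpoints can be sketched with a little arithmetic. This assumes a trainer that takes one optimizer step per batch and saves a checkpoint every fixed number of steps (the function names here are illustrative, not from any particular library): a larger batch size means fewer steps per epoch, so fewer checkpoints, while more epochs means more.

```python
import math

def steps_per_epoch(num_samples, batch_size):
    # One optimizer step per batch; a final partial batch still counts.
    return math.ceil(num_samples / batch_size)

def total_checkpoints(num_samples, batch_size, epochs, save_every_steps):
    # Checkpoints saved every `save_every_steps` optimizer steps.
    total_steps = steps_per_epoch(num_samples, batch_size) * epochs
    return total_steps // save_every_steps

# 10,000 training samples, checkpoint every 100 steps:
print(total_checkpoints(10_000, batch_size=4, epochs=2, save_every_steps=100))
print(total_checkpoints(10_000, batch_size=8, epochs=2, save_every_steps=100))  # doubling batch size halves steps
print(total_checkpoints(10_000, batch_size=4, epochs=4, save_every_steps=100))  # doubling epochs doubles steps
```

So if the goal is more checkpoints, raising epochs (or lowering the save interval) increases them, while raising the batch size actually decreases them; extra VRAM spent on batch size buys throughput, not checkpoints.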
> For Windows users with an NVIDIA GPU: you need Python (3.10) installed as well as Git. > > Scripts: https://gist.github.com/DanPli/98c05281a749254e5b572238825d3617 and https://gist.github.com/DanPli/0bfcaed38ed678adf3cd3f6aa0420f46 > > I created the setup...
I'm running into the same problem as well. I followed the instructions above and got it to run, but it uses the GPU for only a second and then the CPU starts...
I reinstalled ctransformers 0.2.9, but it only worked once I removed the quotes. I tried fixing it in both conda and venv and ran into the same issues. I'm running CUDA...
I got the GPU to work with GPTQ; I'd suggest trying that if you haven't yet. It was using 12 GB of VRAM and 95% of the card the whole time....
Is it possible to fine-tune the model on a 3090, or do we have to do a LoRA due to the size?
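A rough back-of-envelope suggests why full fine-tuning won't fit on a 3090's 24 GB while LoRA can. The per-parameter byte costs below are the usual rule of thumb for AdamW in fp16 (activations are ignored, so real usage is higher), and the 7B parameter count and 0.1% trainable fraction are illustrative assumptions, not figures from this thread:

```python
# Rough VRAM estimate: full fine-tune with AdamW in fp16 vs. LoRA.
# Per-parameter costs (bytes), rule-of-thumb only:
#   fp16 weights: 2, fp16 gradients: 2, fp32 AdamW m+v states: 8

def full_finetune_gb(num_params):
    # Every parameter needs weights + gradients + optimizer states.
    return num_params * (2 + 2 + 8) / 1e9

def lora_gb(num_params, trainable_fraction=0.001):
    base = num_params * 2 / 1e9  # frozen fp16 base: no grads, no optimizer states
    adapters = num_params * trainable_fraction * (2 + 2 + 8) / 1e9  # tiny trainable set
    return base + adapters

params_7b = 7e9  # e.g. a 7B model; swap in the actual parameter count
print(f"full fine-tune: ~{full_finetune_gb(params_7b):.0f} GB")
print(f"LoRA:           ~{lora_gb(params_7b):.0f} GB")
```

Under these assumptions a 7B full fine-tune wants well over 24 GB, while LoRA stays under it; for larger models people typically also quantize the frozen base (QLoRA) to shrink that first term further.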