Clay Mullis
Going to give this a test run. Thanks for investigating all of this!
@lucidrains I believe there's a PR with a fix for this one.
@mallorbc it would be helpful if you could post some examples. Short of that, all I can say is that you'll need to decrease your learning rate by very small amounts...
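As a rough illustration of what "decrease your learning rate by very small amounts" could look like in practice (the starting value, decay factor, and `step_down` helper here are all hypothetical, not the project's actual settings):

```python
# Hypothetical sketch: shrinking a learning rate by small multiplicative steps.
# 3e-4 and the 0.9 decay factor are illustrative numbers, not recommendations.
def step_down(lr, decay=0.9, times=1):
    """Return the learning rate after `times` small multiplicative decreases."""
    for _ in range(times):
        lr *= decay
    return lr

lr = 3e-4                       # assumed starting learning rate
lr = step_down(lr, times=3)     # three gentle 10% reductions
```

With a real training loop you would re-run a short experiment after each small reduction and keep the largest rate that still trains stably, rather than cutting it drastically in one go.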
Nice, I think that was my bad. I always use save-progress.
Yeah, I believe that's how it turned out. Ultimately just not testing with the CLI (I'm using the notebook), as I don't have a great GPU locally and not testing...
From the readme:
> #### High GPU memory usage
> If you have at least 16 GiB of VRAM available, you should be able to run these settings with some...
Yeah, it's no worries! Welcome to Python and the machine learning community! Assuming you are a) on Linux, b) using an Nvidia GPU, and c) have CUDA installed properly, here's how...
In general though: there's currently a GPU shortage, meaning that cloud providers kind of control the machine learning market. This stuff needs lots of VRAM and unfortunately, Colab _does_ make...
> I would use Linux but I'm a gamer, all games support win10 but very few support Linux, or I would use it.

> Oh wow, I can relate to...
I don't give that out on here, unfortunately. Glad I could help. If you were to join the EleutherAI channel, you might find me there. That's about all I'm willing...