litgpt
Add MPS configs
It would be nice to also have some configs for MPS machines (like Ollama does). I can run some small representative models on my MacBook.
Hi @rasbt, is pretraining on the MPS accelerator not supported?
I am getting the error below on a MacBook M1.
```
Verifying settings ... /Users/mvohra/miniforge3/envs/litgpt/lib/python3.12/site-packages/torch/amp/autocast_mode.py:265: UserWarning: User provided device_type of 'cuda', but CUDA is not available. Disabling
torch._dynamo.exc.BackendCompilerFailed: backend='inductor' raised:
AssertionError: Device mps not supported
```
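Before launching a run, it can help to check which accelerator backends the local PyTorch build actually exposes, rather than hitting a compile-time error like the one above. A small sketch (`available_accelerators` is a hypothetical helper name, and the `getattr` guard only exists for torch builds older than 1.12, where the MPS backend was introduced):

```python
def available_accelerators():
    """List the accelerator backends this environment can use ("cpu" is always present)."""
    backends = ["cpu"]
    try:
        import torch

        if torch.cuda.is_available():
            backends.append("cuda")
        # torch.backends.mps exists in torch >= 1.12
        mps = getattr(torch.backends, "mps", None)
        if mps is not None and mps.is_available():
            backends.append("mps")
    except ImportError:
        pass  # torch not installed; only the CPU fallback remains
    return backends

print(available_accelerators())
```

On an M1 Mac with a recent torch, this should report `mps` as available even though the inductor backend cannot target it.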
Thanks for the note. I am not sure we ever supported MPS devices for pretraining. We can take a look at some point, but I don't have a timeline for this, since some other things are a bit higher on the priority list.
Thanks for that, @rasbt. Is there a way to pretrain a tiny model on CPU, just for sanity tests?
Yes, it should work on CPU devices.
I think there might be a dependency issue with enabling MPS support: PyTorch 2.x does not have MPS support, but PyTorch 1.x does. I tried installing litgpt with torch<2.0.0 and got the following error:
```
ERROR: Cannot install litgpt and torch<2.0.0 because these package versions have conflicting dependencies.
The conflict is caused by:
    The user requested torch<2.0.0
    litgpt 0.4.11 depends on torch>=2.2.0
    The user requested torch<2.0.0
    litgpt 0.4.10 depends on torch>=2.2.0
    The user requested torch<2.0.0
    litgpt 0.4.9 depends on torch>=2.2.0
    The user requested torch<2.0.0
    litgpt 0.4.8 depends on torch>=2.2.0
    The user requested torch<2.0.0
    litgpt 0.4.7 depends on torch>=2.2.0
    The user requested torch<2.0.0
    litgpt 0.4.6 depends on torch>=2.2.0
    The user requested torch<2.0.0
    litgpt 0.4.5 depends on torch>=2.2.0
    The user requested torch<2.0.0
    litgpt 0.4.4 depends on torch>=2.2.0
    The user requested torch<2.0.0
    litgpt 0.4.3 depends on torch>=2.2.0
    The user requested torch<2.0.0
    litgpt 0.4.2 depends on torch>=2.2.0
    The user requested torch<2.0.0
    litgpt 0.4.1 depends on torch>=2.2.0
```
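The resolver output above boils down to two incompatible version specifiers: every recent litgpt release pins `torch>=2.2.0`, while the install asked for `torch<2.0.0`. The same conflict can be reproduced with the `packaging` library (a sketch; it assumes `packaging` is installed, and the version strings are just illustrative torch releases):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# The two pins from the pip log above.
user_pin = SpecifierSet("<2.0.0")      # what the user requested
litgpt_pin = SpecifierSet(">=2.2.0")   # what litgpt requires

# pip must satisfy the intersection of both specifiers.
combined = user_pin & litgpt_pin       # equivalent to "<2.0.0,>=2.2.0"

# No release can land in an empty intersection, hence the resolver error.
for v in ["1.13.1", "2.0.0", "2.2.0", "2.4.0"]:
    print(v, Version(v) in combined)
```

Every candidate prints `False`, which is exactly why pip reports the conflict instead of picking an older litgpt.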
@rasbt Is torch 2.2.x a hard dependency? I'm working through your recent coding workshop and would love to follow along on my M1 Mac.
P.S. Thanks for all the amazing work you do! I'm a long-time follower of your newsletter.
Hi there,
I think it might be an issue with the latest release version (see https://github.com/rasbt/LLM-workshop-2024/discussions/4 for more details). In the meantime, I think that it should work with
```
pip install litgpt==0.4.9 -U
```
I can confirm that the issue is not with torch and MPS: