litgpt
TypeError: BFloat16 is not supported on MPS
Getting this when running the Falcon 7B model on an M1 Pro. Is there a specific version that supports this on M1?
Command that was run:
python generate/base.py --prompt "Hello, my name is" --checkpoint_dir checkpoints/tiiuae/falcon-7b
Document that was referred:
https://github.com/Lightning-AI/lit-parrot/blob/main/howto/download_falcon.md
Hi. Can you add the error you got?
Probably: BFloat16 is not supported on MPS
Yes, BFloat16 is not supported on MPS.
Adding more details about the error below:
```
python3 generate/base.py --prompt "Hello, my name is" --checkpoint_dir checkpoints/tiiuae/falcon-7b
Loading model 'checkpoints/tiiuae/falcon-7b/lit_model.pth' with {'block_size': 2048, 'vocab_size': 50254, 'padding_multiple': 512, 'padded_vocab_size': 65024, 'n_layer': 32, 'n_head': 71, 'n_embd': 4544, 'rotary_percentage': 1.0, 'parallel_residual': True, 'bias': False, 'n_query_groups': 1, 'shared_attention_norm': True}
Traceback (most recent call last):
  File "/Users/amal/Development/lit-parrot/generate/base.py", line 204, in <module>
    CLI(main)
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/jsonargparse/cli.py", line 85, in CLI
    return _run_component(component, cfg_init)
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/jsonargparse/cli.py", line 147, in _run_component
    return component(**cfg)
  File "/Users/amal/Development/lit-parrot/generate/base.py", line 147, in main
    model = Parrot(config)
  File "/Users/amal/Development/lit-parrot/lit_parrot/model.py", line 26, in __init__
    self.lm_head = nn.Linear(config.n_embd, config.padded_vocab_size, bias=False)
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/torch/nn/modules/linear.py", line 96, in __init__
    self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/torch/utils/_device.py", line 76, in __torch_function__
    return func(*args, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/lightning/fabric/utilities/init.py", line 55, in __torch_function__
    return func(*args, **kwargs)
TypeError: BFloat16 is not supported on MPS
```
You can pass `--precision 16-true` or `--precision 32-true` instead.
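To illustrate why overriding the precision avoids the error, here is a minimal sketch (not lit-parrot's actual code; `pick_dtype` is a hypothetical helper) of selecting a dtype that skips bfloat16 on Apple's MPS backend, which is the effect of passing `--precision 32-true`:

```python
import torch

def pick_dtype(device_type: str) -> torch.dtype:
    # Older PyTorch builds raise "TypeError: BFloat16 is not supported on MPS",
    # so fall back to full precision on Apple Silicon.
    if device_type == "mps":
        return torch.float32
    # On CUDA, prefer bfloat16 only when the GPU actually supports it.
    if device_type == "cuda" and torch.cuda.is_bf16_supported():
        return torch.bfloat16
    return torch.float32

print(pick_dtype("mps"))  # torch.float32
```

Half precision (`--precision 16-true`) halves memory use compared to full precision, which matters for a 7B model, but float16 has a narrower exponent range than bfloat16 and can overflow on some models.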
Thanks, that worked