torchchat
torchchat cannot be used with weights downloaded from https://llama.meta.com/llama-downloads/
```
% python3 torchchat.py generate --dtype float16 --checkpoint-path ~/git/meta-llama/llama/llama-2-7b/consolidated.00.pth --tokenizer-path ~/git/meta-llama/llama/tokenizer.model
/Users/nshulga/Library/Python/3.9/lib/python/site-packages/urllib3/__init__.py:35: NotOpenSSLWarning: urllib3 v2 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with 'LibreSSL 2.8.3'. See: https://github.com/urllib3/urllib3/issues/3020
  warnings.warn(
Using device=cpu Apple M1 Pro
Loading model...
known configs: ['13B', '70B', 'CodeLlama-7b-Python-hf', '34B', 'stories42M', '30B', 'stories110M', '7B', 'stories15M', 'Mistral-7B', 'Meta-Llama-3-8B']
Time to load model: 14.61 seconds
Traceback (most recent call last):
  File "/Users/nshulga/git/pytorch/torchchat/torchchat.py", line 146, in <module>
    generate_main(args)
  File "/Users/nshulga/git/pytorch/torchchat/generate.py", line 782, in main
    _main(
  File "/Users/nshulga/git/pytorch/torchchat/generate.py", line 548, in _main
    tokenizer_args.validate_model(model)
  File "/Users/nshulga/git/pytorch/torchchat/build/builder.py", line 198, in validate_model
    raise RuntimeError("no tokenizer was found")
RuntimeError: no tokenizer was found
```
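For context on why validation can fail even when `--tokenizer-path` points at a real file: Llama 2 ships a SentencePiece `tokenizer.model` (a binary protobuf), while Llama 3 uses a tiktoken-style file (plain text, one base64 token plus rank per line). Below is a minimal sketch of a format-sniffing check of that kind; the function name and heuristic are illustrative assumptions, not torchchat's actual `validate_model` implementation.

```python
from pathlib import Path


def sniff_tokenizer_kind(tokenizer_path: str) -> str:
    """Guess the tokenizer file format (illustrative heuristic, not
    torchchat internals).

    - SentencePiece models (Llama 2) are binary protobuf payloads.
    - tiktoken files (Llama 3) are UTF-8 text: "<base64-token> <rank>"
      per line.
    """
    path = Path(tokenizer_path).expanduser()
    if not path.is_file():
        # Mirrors the error message seen in the traceback above.
        raise RuntimeError("no tokenizer was found")

    head = path.read_bytes()[:4096]
    try:
        text = head.decode("utf-8")
    except UnicodeDecodeError:
        # Binary data that is not valid UTF-8: treat as SentencePiece.
        return "sentencepiece"

    lines = text.splitlines()
    if lines:
        first = lines[0].split()
        # A tiktoken line is exactly two fields, the second a numeric rank.
        if len(first) == 2 and first[1].isdigit():
            return "tiktoken"
    return "sentencepiece"
```

A check like this would let the loader report a format mismatch ("Llama 2 SentencePiece tokenizer given, but the detected config expects tiktoken") instead of the generic "no tokenizer was found".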