llm
Model installation issue locally
Hi @simonw, thanks so much for all the hard work; this is a fantastic resource.
There's a possibility I'm doing something wrong, but let me try to explain what I see.
Problem
- I was trying to run the Mistral model locally on my computer.
- I installed llm via Poetry.

I get the following errors:
```
100%|██████████████████| 1.98G/1.98G [01:33<00:00, 21.2MiB/s]
Error: Unable to instantiate model: Model format not supported (no matching implementation found)

llm -m mistral-7b-instruct-v0 'difference between a pelican and a walrus'
Error: Unable to instantiate model: Model format not supported (no matching implementation found)
```
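In case it helps with debugging: I understand this "Model format not supported" error can occur when the downloaded file is in a format the installed backend can't load (for example a newer GGUF file with an older gpt4all build, or a legacy GGML-family file with a newer one). Here is a minimal sketch (not part of llm itself; the path argument is whatever model file ended up in your local cache) for inspecting the file's magic bytes:

```python
def sniff_model_format(path: str) -> str:
    """Best-effort guess at a model file's format from its first four bytes.

    Current llama.cpp / gpt4all model files begin with the magic bytes
    b"GGUF"; older GGML-family files use other four-byte magics.
    """
    with open(path, "rb") as f:
        magic = f.read(4)
    if magic == b"GGUF":
        return "gguf"
    return f"not GGUF (magic={magic!r}) -- possibly a legacy GGML-family file"
```

Running this against the downloaded file and comparing the result with what the installed gpt4all version supports might narrow things down.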
Could it be due to the fact that I'm using an M3 chip? I ran the following to capture my system details:
```
system_profiler SPSoftwareDataType SPHardwareDataType

Software:

    System Software Overview:

      System Version: macOS 14.3 (23D56)
      Kernel Version: Darwin 23.3.0
      Boot Volume: Macintosh HD
      Boot Mode: Normal
      Computer Name: Peadar’s MacBook Pro
      User Name: User (user)
      Secure Virtual Memory: Enabled
      System Integrity Protection: Enabled
      Time since boot: 31 minutes, 58 seconds

Hardware:

    Hardware Overview:

      Model Name: MacBook Pro
      Model Identifier: Mac15,3
      Model Number: Z1C8000FDB/A
      Chip: Apple M3
      Total Number of Cores: 8 (4 performance and 4 efficiency)
      Memory: 16 GB
      System Firmware Version: 10151.81.1
      OS Loader Version: 10151.81.1
      Serial Number (system): G6M22DC462
      Hardware UUID: F2514639-4A87-5F30-AA1B-375B3ECB9DEF
      Provisioning UDID: 00008122-001638590881401C
      Activation Lock Status: Disabled
```
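One more data point that might be relevant to the M3 question (a sketch of my own, not something llm prints): on Apple Silicon, an x86_64 Python interpreter runs under Rosetta 2 and reports a different architecture than the chip, which can cause native-extension packages to install the wrong wheel. This can be checked from inside the Poetry environment:

```python
import platform
import sys

def interpreter_arch() -> str:
    """Report the architecture the running Python was built for.

    On an M-series Mac, a native interpreter reports 'arm64', while an
    x86_64 build running under Rosetta 2 reports 'x86_64'.
    """
    return platform.machine()

print(interpreter_arch(), sys.version)
```

If this prints `x86_64` on the M3 machine, the environment is emulated and that could explain a backend mismatch.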
Hope this helps.