
TinyMistral? small llm for phones and computers with no gpu?

Open agonzalezm opened this issue 1 year ago • 0 comments

Hi, is there any plan to release a good-performance small model (1B/2B/3B) like TinyLlama, phi-2, etc.?

Many people want to run open-source LLMs locally for specific tasks but have no GPU. Small models that can run inference fast enough on low-resource hardware (smartphones, CPU-only machines, etc.) are in high demand and can also be fine-tuned for specific tasks.

Thanks

agonzalezm · Jan 06 '24 11:01