mistral-inference
TinyMistral? Small LLM for phones and computers with no GPU?
Hi, are there any plans to release a small model with good performance (1B/2B/3B parameters), like TinyLlama, Phi-2, etc.?
Many people want to run open-source LLMs locally for specific tasks but have no GPU. Small models that can run inference fast enough on low-resource hardware (smartphones, CPU-only computers, etc.) are in high demand, and they can also be fine-tuned for specific tasks.
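For reference, this is the kind of CPU-only workflow I mean. A rough sketch using Hugging Face `transformers` with TinyLlama (the model name and generation settings here are just illustrative, not part of this repo):

```python
# Rough sketch: CPU-only inference with an existing ~1B open model.
# Illustrative only -- model choice and generation settings are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float32,  # plain fp32 on CPU; no GPU required
)
model.eval()

prompt = "Summarize: open-source LLMs can run locally on CPUs."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

A Mistral model in this size range could slot into the same setup (or a quantized variant for phones).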
Thanks