mlc-llm
Phi-3 128k
🚀 Feature
Introduce Phi-3 mini 128k instruct
https://huggingface.co/microsoft/Phi-3-mini-128k-instruct
The mini model can run on phones (there are 4k and 128k context versions).
Are you able to run Phi-3 OK?
What do you mean?
The 128k version could allow us to analyze a PDF from the phone. The 4k version would probably be less CPU-intensive and would work out of the box.
Sorry, I want to know: are you running the Phi-3 mini model on a phone? I run it with onnxruntime-genai, but it's too slow.
Well, I use Phi-2 all the time with MLC on Android on my phone with a 2nd-gen Snapdragon, and it works really well and smoothly. That is why I would like to know how I can get Phi-3 onto MLC.
Have you tried following the instructions for building new models?
I tried following the instructions and got this error:
ValueError: Unknown model type: phi3. Available ones: ['llama', 'mistral', 'gemma', 'gpt2', 'mixtral', 'gpt_neox', 'gpt_bigcode', 'phi-msft', 'phi', 'qwen', 'qwen2', 'stablelm', 'baichuan', 'internlm', 'rwkv5', 'orion', 'llava', 'rwkv6', 'chatglm', 'eagle']
I would try adding the string "phi3" to the list.
@kripper thanks for your advice. MODELS is not really just a list of model names; it is a dict with the model name as key and the corresponding loader and implementation as value.
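For illustration, the registry pattern looks roughly like the self-contained toy below. This is a sketch, not mlc-llm's actual definitions; the `Model` dataclass fields and the phi3 placeholder functions are assumptions.

```python
# Self-contained toy version of the registry pattern described above.
# Names here (Model, MODELS, the phi3 placeholders) are illustrative
# assumptions, not mlc-llm's actual definitions.
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class Model:
    name: str
    model: Callable                 # builds the model implementation
    config: Callable                # parses the HuggingFace config.json
    source: Dict[str, Callable] = field(default_factory=dict)  # weight loaders per checkpoint format


MODELS: Dict[str, Model] = {}


def load_phi3_config(path: str) -> dict:
    """Placeholder: would read and validate config.json."""
    return {}


def build_phi3(config: dict):
    """Placeholder: would construct the Phi-3 module graph."""
    return None


MODELS["phi3"] = Model(
    name="phi3",
    model=build_phi3,
    config=load_phi3_config,
    source={"huggingface-safetensor": lambda path: {}},
)

# The "Unknown model type" error above is just this kind of missing-key check:
model_type = "phi3"
if model_type not in MODELS:
    raise ValueError(f"Unknown model type: {model_type}. Available ones: {list(MODELS)}")
```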
I've tried specifying --model-type phi-msft, but without luck.
I would first research whether the model loader is compatible with previous versions. If so, map the ID to the existing loader. Otherwise, create a new loader and have fun hacking, i.e.:
- research what changed by comparing (diffing) the official loaders for the previous and new versions;
- replicate those changes in mlc-llm's loader.
Always prefer to share the same code base for the old and new models and use conditions (e.g. `if (version == 1) {...} else if (version == 3) {...}`), as in the sketch below.
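A rough, purely illustrative sketch of that shared-code-base idea (the weight-name mapping and version check are made up for the example, not the real Phi-2/Phi-3 layout):

```python
# Illustrative only: one shared code path for the old and new model versions,
# with a condition around the part that differs. The weight-name mapping below
# is a hypothetical example, not the actual Phi-2/Phi-3 checkpoint layout.

def convert_weight_name(name: str, version: int) -> str:
    """Map a HuggingFace weight name to an internal name, per model version."""
    if version == 3:
        # hypothetical: the new version fuses q/k/v into a single projection
        return name.replace("qkv_proj", "wqkv")
    # older versions keep the existing, unchanged path
    return name.replace("q_proj", "wq")


# usage
print(convert_weight_name("model.layers.0.self_attn.qkv_proj.weight", version=3))
print(convert_weight_name("model.layers.0.self_attn.q_proj.weight", version=2))
```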
Once this is figured out, please update the Android app APK with Phi-3. ❤️
Folks, we now support Phi-3 mini 4k/128k, and you can find the pre-converted models at https://huggingface.co/mlc-ai. Phi-3 is available in the Android and iOS apps now.
Closing this issue as completed.
I tried it, but it was impossible to ask any questions; the Android app crashes so badly that even my whole phone became unstable and crashed.
It's totally unusable on a Samsung Flip 5.
Has anyone else tried it?
On top of that, Phi-2 is gone, so I have nothing left now.