Phi3 3.8B mini 128k model not supported
I got an error when trying to load the Phi3 3.8B mini 128k model using ipex-llm's llama.cpp support.
The model that I tried is this: https://ollama.com/library/phi3:3.8b-mini-128k-instruct-q4_0
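For context, a reproduction along these lines triggers the error. This is only a sketch: the `[cpp]` pip extra, the `init-llama-cpp` setup step, and the flag values are assumptions based on the usual ipex-llm llama.cpp workflow, and the local GGUF path is illustrative.

```bash
# Install ipex-llm with its llama.cpp integration (the [cpp] extra is assumed here).
pip install --pre --upgrade ipex-llm[cpp]

# Set up the ipex-llm build of llama.cpp in the current directory
# (init-llama-cpp is the setup helper that comes with the llama.cpp integration).
init-llama-cpp

# Try to run the Phi-3 3.8B mini 128k q4_0 GGUF -- this is the step that errors out.
./main -m ./phi3-3.8b-mini-128k-instruct-q4_0.gguf \
       -p "Once upon a time" -n 32 -ngl 33
```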
Hi @lumurillo, we have reproduced this issue and will let you know when we make progress.
@lumurillo We have fixed this issue; please update ipex-llm to the latest nightly build and try again.
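For anyone landing on this thread, updating to the nightly build typically looks like the following; the `[cpp]` extra and the re-run of `init-llama-cpp` are assumptions carried over from the setup sketch above.

```bash
# Upgrade to the latest nightly (pre-release) build of ipex-llm.
pip install --pre --upgrade ipex-llm[cpp]

# Re-run the setup helper so the updated llama.cpp binaries are picked up.
init-llama-cpp
```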