SONG Ge

166 comments by SONG Ge

Hi @dayskk, could you please run the ENV-Check script in https://github.com/intel-analytics/ipex-llm/tree/main/python/llm/scripts and reply to us with the results?

Hi @SG-Python1,

1. Could you provide the output of `ollama serve` when doing model inference?
2. You may run the ENV-Check script in https://github.com/intel-analytics/ipex-llm/tree/main/python/llm/scripts and reply to us with the results.
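For readers who cannot run the linked script directly, the kind of information it gathers can be approximated with a few standard-library calls. This is only an illustrative sketch; the actual ENV-Check script in `python/llm/scripts` reports much more (installed package versions, GPU and driver details, etc.), and the function name below is my own.

```python
import platform
import sys

def collect_basic_env_info():
    """Gather a few basics similar in spirit to an environment-check script.

    Illustrative sketch only: the real ENV-Check script in
    python/llm/scripts collects far more detail.
    """
    return {
        "python_version": sys.version.split()[0],  # e.g. "3.11.4"
        "platform": platform.platform(),           # OS name and version
        "machine": platform.machine(),             # e.g. "x86_64"
    }

if __name__ == "__main__":
    for key, value in collect_basic_env_info().items():
        print(f"{key}: {value}")
```

Attaching this kind of summary alongside the full script output makes it easier to spot OS or Python-version mismatches at a glance.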

Hi @yu1chaofan, you may set `export BIGDL_IMPORT_IPEX=0` before launching the webui.

> Am I missing some package? Or am I doing something wrong?

`set BIGDL_IMPORT_IPEX=0` only works in cmd; in PowerShell, run `$env:BIGDL_IMPORT_IPEX="0"` instead.
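The same variable can also be set from Python itself in a launcher script, provided it happens before any BigDL/IPEX module is imported. A minimal sketch, assuming the flag is read at import time; only the variable name `BIGDL_IMPORT_IPEX` comes from the comments above, and the helper function is hypothetical:

```python
import os

# Assumption: this must run before any bigdl/ipex module is imported,
# since environment flags like this are typically read at import time.
os.environ["BIGDL_IMPORT_IPEX"] = "0"

def is_ipex_import_disabled() -> bool:
    """Hypothetical helper: check whether the IPEX auto-import is disabled."""
    return os.environ.get("BIGDL_IMPORT_IPEX") == "0"
```

Setting the variable in the launcher avoids having to remember the different cmd/PowerShell/bash syntaxes on each platform.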

Hi @yu1chaofan, we have fixed this issue and released a new version of the webui; please update to the latest version and try again.

Hi @wluo1007, we have reproduced your issue and will let you know once we make progress.

Hi @wluo1007, you may set `export BIGDL_IMPORT_IPEX=0` before launching the webui.

@niceTeen84, try: `python server.py --load-in-4bit --listen --listen-host 0.0.0.0 --listen-port 8080`

Hi @sundeepChandhoke, yes, Ollama expects a GPU and currently does not support running on a device without one.

Thank you for using IPEX-LLM. We will discuss support for Ubuntu 24.04 and update the corresponding documentation.