Qwen2 deployment by Ollama fails
Qwen2 deployment by Ollama fails with the error dialog "Ollama_llama_server: the entry point could not be found".
Test environment: Ultra 5 125H CPU, Windows 11 Pro 23H2, graphics driver 32.0.101.5972
The `pip list` output of my environment is attached below.
Installation Steps:
- install Miniforge3-Windows-x86_64.exe,VSCodeUserSetup-x64-1.89.1.exe,w_BaseKit_p_2024.1.0.595.exe
- Open a Miniforge Prompt as administrator
- conda create -n llm-cpp python=3.11
- conda activate llm-cpp
- pip install --pre --upgrade ipex-llm[cpp]
- call "C:\Program Files (x86)\Intel\oneAPI\setvars.bat"
- cd D:\ipex-handson
- mkdir ollama
- cd ollama
- init-ollama.bat
- set OLLAMA_NUM_GPU=999
- set no_proxy=localhost,127.0.0.1
- set ZES_ENABLE_SYSMAN=1
- set SYCL_CACHE_PERSISTENT=1
- ollama serve
- Open a second Miniforge Prompt as administrator
- conda activate llm-cpp
- call "C:\Program Files (x86)\Intel\oneAPI\setvars.bat"
- cd D:\ipex-handson\ollama
- ollama.exe pull qwen2:7b
- curl http://localhost:11434/api/generate -d "{\"model\": \"qwen2:7b\", \"prompt\": \"Why is the sky blue?\", \"stream\": false, \"options\": {\"num_predict\": 100}}"
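The final curl step can also be exercised from Python. Below is a minimal sketch using only the standard library; it assumes the Ollama server started in the steps above is listening on localhost:11434 with the qwen2:7b model pulled. The `build_payload` helper name is my own, not part of Ollama.

```python
# Sketch of a client for Ollama's /api/generate endpoint, equivalent to
# the curl command above. Assumes a local server on port 11434.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model="qwen2:7b", prompt="Why is the sky blue?",
                  num_predict=100):
    """Build the JSON body for /api/generate (mirrors the curl -d payload)."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single complete response, not a stream
        "options": {"num_predict": num_predict},  # cap generated tokens
    }


def generate(payload):
    """POST the payload to the local Ollama server and decode the reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    # Only works once "ollama serve" is running and the model is pulled.
    print(generate(build_payload())["response"])
```

Sending the request this way sidesteps the Windows cmd quoting issues that make the raw curl command easy to mistype.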
[Attachment: llm-cpp pip list.png — screenshot of the `pip list` output in the llm-cpp environment]
From: SONG Ge
Sent: 2024-10-16 17:09
To: intel-analytics/ipex-llm
Cc: vincent-wsz; Mention
Subject: Re: [intel-analytics/ipex-llm] Qwen2 Deployment by Ollama fail (Issue #12210)
Hi @vincent-wsz (https://github.com/vincent-wsz), could you please run `pip list` and show me the results?
Hi @vincent-wsz, please follow our official document to install Ollama; you may then run Ollama without calling "C:\Program Files (x86)\Intel\oneAPI\setvars.bat".