Zhipeng Wang

13 comments by Zhipeng Wang

It appears to be a network issue. Does it occur only on Phi 4 14.7b or on all NPU models?

Closed due to no response for over a month.

> [@timenick](https://github.com/timenick) Is this fix included in the published version? I checked again and still see the same problem using **v0.16.0**.
>
> To be sure, I deleted all the...

Hi @pkbullock, could you please try the latest version 0.14.2 of AITK to see whether the issue is resolved? Thanks!

Hi @DanielGoehler, could you try using the pre-release version of AITK and test the model again? Thank you. ![Image](https://github.com/user-attachments/assets/aee6e979-38af-40a1-8df4-5bb8d71f3b2f)

> [@timenick](https://github.com/timenick) With prerelease version 0.15.2025062307, I am still encountering the same error message as before:
>
> ```
> 2025-06-24 06:44:47.809 [error] Failed loading model Phi-4-mini-reasoning-3.8b-qnn. Could not find...
> ```

@leestott The Windows 25H2 issue is fixed. Could you please try version 0.7.117 again to see whether the EP downloads successfully? Thank you.

Could you check if the following path exists: C:\Users\\.aitk\bin\libonnxruntime_cuda_windows\0.0.3? BTW, please also check whether you have an NVIDIA GPU on your device.
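
If it helps, here is a minimal Python sketch for both checks. It assumes the folder in question lives under your user profile and that `nvidia-smi` is on PATH whenever an NVIDIA driver is installed, so treat the GPU check as indicative rather than definitive:

```python
import shutil
from pathlib import Path

# Path.home() resolves to C:\Users\<your user name> on Windows.
ep_dir = Path.home() / ".aitk" / "bin" / "libonnxruntime_cuda_windows" / "0.0.3"
print("CUDA EP folder exists:", ep_dir.is_dir())

# nvidia-smi ships with the NVIDIA driver, so finding it on PATH is a
# reasonable (though not definitive) sign that an NVIDIA GPU is present.
print("nvidia-smi found:", shutil.which("nvidia-smi") is not None)
```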

Hi @pts1989, AITK in VS Code and Foundry Local use different default model cache folders. Both allow you to change the default cache folder if you want to share the models....
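
As a rough illustration only, the sketch below lists which models already exist in both caches. The two paths are placeholders, not the actual defaults of either tool, so point them at whatever folders your AITK and Foundry Local installs report:

```python
from pathlib import Path

# Placeholder cache locations -- replace with the folders your AITK and
# Foundry Local installations actually use.
aitk_cache = Path.home() / ".aitk" / "models"
foundry_cache = Path.home() / ".foundry" / "cache" / "models"

def model_names(cache: Path) -> set[str]:
    """Top-level entries in a cache folder, or an empty set if it is missing."""
    return {p.name for p in cache.iterdir()} if cache.is_dir() else set()

shared = model_names(aitk_cache) & model_names(foundry_cache)
print("Models present in both caches:", sorted(shared) or "none")
```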

> > That’s strange — normally, after installation and opening the model dialog, the agent server should start automatically. Could you please check whether Inference.Service.Agent.exe exists in your extension installation...
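
For anyone hitting the same question, a minimal sketch of both checks follows. The extensions folder below is an assumption, so point `ext_dir` at your actual VS Code extensions directory; `tasklist` is a standard Windows tool, so the process check is Windows-only:

```python
import subprocess
from pathlib import Path

# Assumed default VS Code extensions folder; adjust if yours differs.
ext_dir = Path.home() / ".vscode" / "extensions"
hits = list(ext_dir.glob("**/Inference.Service.Agent.exe"))
print("Agent executable found at:", [str(p) for p in hits] or "not found")

# Filter the Windows task list for the agent process.
out = subprocess.run(
    ["tasklist", "/FI", "IMAGENAME eq Inference.Service.Agent.exe"],
    capture_output=True, text=True,
).stdout
print("Agent process running:", "Inference.Service.Agent.exe" in out)
```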