
md\07.Labs\Csharp\src\LabsPhi301 fails

Open IntranetFactory opened this issue 1 year ago • 14 comments

I'm trying md\07.Labs\Csharp\src\LabsPhi301 on a new Copilot+ laptop. I adjusted modelPath to point to the correct folder.

When I run the lab I get:
Unable to load DLL 'onnxruntime-genai' or one of its dependencies: The specified module could not be found. (0x8007007E)

After the first failure I updated all NuGet packages, but I still get the same result.

Should that sample work on a Copilot+ laptop?

IntranetFactory avatar Jul 06 '24 23:07 IntranetFactory

Hi @IntranetFactory, please ensure you have the latest drivers installed for your NPU.

Resolution

  1. Update your drivers, see https://github.com/intel/intel-npu-acceleration-library/tree/main
  2. We have validated that there is no problem in the example; please also refer to https://github.com/intel/intel-npu-acceleration-library/blob/main/examples/phi-3.py for another validation check

leestott avatar Jul 09 '24 09:07 leestott

I'm using a Copilot+ PC with a Snapdragon X CPU - will the intel-npu-acceleration-library work in that case?

IntranetFactory avatar Jul 09 '24 09:07 IntranetFactory

@IntranetFactory Thanks for confirming that you are running a Qualcomm device. See https://learn.microsoft.com/en-us/windows/ai/npu-devices/ for the latest drivers and ONNX Runtime support info.

leestott avatar Jul 09 '24 09:07 leestott

Qualcomm Snapdragon X: Currently, developers should target the Qualcomm QNN Execution Provider (EP), which uses the Qualcomm AI Engine Direct SDK (QNN). Pre-built packages with QNN support are available to download. This is the same stack currently used by the Windows Copilot Runtime and experiences on Copilot+ PC Qualcomm devices.
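In ONNX Runtime's Python API, "targeting the QNN EP" roughly means requesting the QNNExecutionProvider when creating an inference session. A minimal sketch, assuming the QNN-enabled onnxruntime package; the `backend_path` value and model filename are illustrative, and the session is only created when the package and a model file are actually present:

```python
# Sketch: requesting the Qualcomm QNN Execution Provider from ONNX Runtime.
# "backend_path" selects the Qualcomm HTP (NPU) backend library; the model
# path is illustrative.
import importlib.util
from pathlib import Path

qnn_options = {"backend_path": "QnnHtp.dll"}
providers = [("QNNExecutionProvider", qnn_options)]
model_path = "model.onnx"  # illustrative path

if importlib.util.find_spec("onnxruntime") and Path(model_path).exists():
    import onnxruntime as ort

    # ONNX Runtime falls back to other providers if QNN cannot initialize.
    session = ort.InferenceSession(model_path, providers=providers)
else:
    print("onnxruntime or model file not available; skipping session creation")
```

The C# equivalent is appending the provider via `SessionOptions`, but the idea is the same: the EP is selected at session creation, not by recompiling the model.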

leestott avatar Jul 09 '24 09:07 leestott

I'm sorry, this is the first time I've used QNN - what does "target the Qualcomm QNN EP" mean? Do I just need to install that provider, or do I also need to modify the cookbook (e.g. change the CPU target, install NuGet packages)?

IntranetFactory avatar Jul 09 '24 09:07 IntranetFactory

I installed the Microsoft.ML.OnnxRuntime.QNN NuGet package. When I select "Any CPU" I get "System.DllNotFoundException: 'Unable to load DLL 'onnxruntime-genai' or one of its dependencies: The specified module could not be found. (0x8007007E)'" in var model = new Model(modelPath);

When I select the ARM64 architecture I get build errors:

[screenshot of the build errors]
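One thing worth checking when ARM64 builds fail is whether the project explicitly declares an ARM64 target. A sketch of the relevant .csproj properties (standard MSBuild property names; whether this resolves these particular build errors is not confirmed):

```xml
<!-- Illustrative .csproj settings for targeting Windows on ARM64.
     Not a confirmed fix for the build errors in this thread. -->
<PropertyGroup>
  <Platforms>ARM64</Platforms>
  <RuntimeIdentifier>win-arm64</RuntimeIdentifier>
</PropertyGroup>
```

With an explicit runtime identifier, `dotnet restore` can pick the matching native assets from the NuGet package, if the package ships them for that RID.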

IntranetFactory avatar Jul 09 '24 09:07 IntranetFactory

@IntranetFactory

You are experiencing a DllNotFoundException when trying to use the Microsoft.ML.OnnxRuntime.QNN NuGet package with the "Any CPU" configuration. This error typically occurs when the DLL 'onnxruntime-genai' or one of its dependencies cannot be found by the system.

To resolve this issue, check that:

  1. The NuGet package is properly installed and all required dependencies are included.
  2. The project is configured to copy the native dependencies to the output directory.
  3. The native dependencies are compatible with the architecture you're targeting.

If you're still facing issues, consider filing an issue on the ONNX Runtime repo asking them to ensure compatibility of the native dependencies with Snapdragon.
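The "copy native dependencies to the output directory" check can be sketched as a quick diagnostic. A hypothetical helper (the function name and matching rules are mine) that scans a build output folder for the native libraries the exception complains about:

```python
# Hypothetical diagnostic helper: scan a build output folder for the native
# onnxruntime-genai / onnxruntime libraries that DllNotFoundException needs.
from pathlib import Path


def find_native_libs(output_dir, names=("onnxruntime-genai", "onnxruntime")):
    """Return paths under output_dir that look like the required native libs."""
    libs = []
    for path in Path(output_dir).rglob("*"):
        # Match shared-library extensions whose filename mentions a target name.
        if path.suffix.lower() in {".dll", ".so", ".dylib"} and any(
            name in path.name.lower() for name in names
        ):
            libs.append(path)
    return libs
```

If this returns nothing for your build output folder, the native package for your architecture was likely never restored or copied.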

Additionally, check the documentation for the Microsoft.ML.OnnxRuntime.QNN package for any specific installation or configuration instructions, and see whether anyone else has encountered a build error when selecting the ARM64 architecture after installing the same package.

I would suggest reaching out to the maintainers of the Microsoft.ML.OnnxRuntime.QNN package or seeking support from the community, as they might have encountered and resolved similar issues.

leestott avatar Jul 10 '24 08:07 leestott

@IntranetFactory Thank you for your question. The current ONNX Runtime Generative AI example is based on the x86 framework and will support the ARM64 architecture in the future. You can follow the roadmap in the GitHub repo: https://github.com/microsoft/onnxruntime-genai. If you are using a Copilot+ PC with ARM64, it is recommended that you use Phi Silica: https://learn.microsoft.com/en-us/windows/ai/apis/phi-silica
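Since the x86-vs-ARM64 distinction matters here, a quick stdlib check of which architecture a process is actually running as (useful on a Snapdragon X device, where an x64 binary runs under emulation):

```python
# Report the architecture of the current Python process; on Windows on Arm
# this distinguishes a native ARM64 process from an x64-emulated one.
import platform

machine = platform.machine()  # e.g. "ARM64" on Windows on Arm, "AMD64" on x64
print(f"Process architecture: {machine}")
```

The same question applies to the .NET process: an "Any CPU" build may run as x64 under emulation, where ARM64-only native packages will not load.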

kinfey avatar Jul 10 '24 09:07 kinfey

I would love to try Phi Silica - but it seems that it's also not available. https://learn.microsoft.com/en-us/windows/apps/windows-app-sdk/experimental-channel says: "Phi Silica and OCR APIs are not included in this release. These will be coming in a future 1.6 release." Or is there any other way to get it?

IntranetFactory avatar Jul 10 '24 09:07 IntranetFactory

Are just the C# samples not working on ARM64? Should Python work, or does Phi-3 currently not work on ARM64 at all?

IntranetFactory avatar Jul 10 '24 09:07 IntranetFactory

The GenAI nugets don't support Arm64 currently; there is an issue tracking this here: https://github.com/microsoft/onnxruntime-genai/issues/637.

Adding @natke for awareness

nmetulev avatar Jul 10 '24 20:07 nmetulev

Please use this: https://www.nuget.org/packages/Microsoft.ML.OnnxRuntime.QNN

kinfey avatar Jul 19 '24 06:07 kinfey

@kinfey I tried that already; it causes build errors: https://github.com/microsoft/Phi-3CookBook/issues/84#issuecomment-2217213880

IntranetFactory avatar Jul 19 '24 08:07 IntranetFactory

Re: the earlier comment that Phi Silica is also not available yet ("Phi Silica and OCR APIs are not included in this release. These will be coming in a future 1.6 release.", https://learn.microsoft.com/en-us/windows/apps/windows-app-sdk/experimental-channel)

Does anyone know more on this?

meliolabsadmin avatar Sep 11 '24 23:09 meliolabsadmin