
Compatibility between ONNX Runtime and Blazor WebAssembly

Open orponce opened this issue 2 years ago • 5 comments

Describe the issue

I am trying to use ONNX Runtime in Blazor WebAssembly in order to do inference within a browser. I am not sure if there might be a compatibility issue between versions, since I am not even able to load an inference session. I have been looking for answers or sample code, but documentation for ONNX Runtime with Blazor WebAssembly in a browser seems to be nonexistent. From the error I get, it seems that onnxruntime is not being properly found when running in the browser (https://localhost:port).

I am also not sure if this issue should be raised here or if there might be another more specific place to do it.

To reproduce

I have the following Index.razor file, where I create an inference session to load the ONNX model (in this example it is called "example_model.onnx", but it could be any model). The model is located in the wwwroot directory with Build Action set to Content ("Copy always" and "Copy if newer" both produce the same error):

@page "/"
@using Microsoft.ML.OnnxRuntime;
@using Microsoft.ML.OnnxRuntime.Tensors;

<button @onclick="MakePrediction">Make Prediction</button> 

@code {   
    private void MakePrediction()
    {
        InferenceSession session = new InferenceSession("wwwroot/example_model.onnx");
    }
}
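As an editorial aside (a sketch, not verified against this issue): Blazor WebAssembly has no local file system, so even once the native library loads, a relative path such as "wwwroot/example_model.onnx" cannot be opened with file I/O in the browser. The usual pattern is to download the model over HTTP and use the InferenceSession(byte[]) constructor; the injected HttpClient and the relative URL below are assumptions:

```razor
@inject HttpClient Http

@code {
    private async Task MakePredictionAsync()
    {
        // Fetch the model from wwwroot over HTTP instead of via a file path;
        // static web assets in wwwroot are served at the app's base URL.
        byte[] model = await Http.GetByteArrayAsync("example_model.onnx");
        using var session = new InferenceSession(model);
    }
}
```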

Part of the configuration file for the project is the following:

<Project Sdk="Microsoft.NET.Sdk.BlazorWebAssembly">
<PropertyGroup>
    <TargetFramework>net6.0</TargetFramework>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
</PropertyGroup>
<ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore.Components.WebAssembly" Version="6.0.11" />
    <PackageReference Include="Microsoft.AspNetCore.Components.WebAssembly.DevServer" Version="6.0.11" PrivateAssets="all" />
    <PackageReference Include="Microsoft.ML.OnnxRuntime" Version="1.14.1" />
</ItemGroup>

However, when I try to run it in the browser (actually, when I press the button "Make Prediction"), I get the following error (in the browser DevTools console):

crit: Microsoft.AspNetCore.Components.WebAssembly.Rendering.WebAssemblyRenderer[100]
      Unhandled exception rendering component: The type initializer for 'Microsoft.ML.OnnxRuntime.NativeMethods' threw an exception.
System.TypeInitializationException: The type initializer for 'Microsoft.ML.OnnxRuntime.NativeMethods' threw an exception.
---> System.DllNotFoundException: onnxruntime
   at Microsoft.ML.OnnxRuntime.NativeMethods..cctor()
   --- End of inner exception stack trace ---
   at Microsoft.ML.OnnxRuntime.SessionOptions..ctor()
   at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(String modelPath)
   at BlazorWasmOnnxAttempt2.Pages.Counter.MakePrediction() in C:\Users\devel\csharp\BlazorWasmOnnx\Pages\Counter.razor:line 11
   at Microsoft.AspNetCore.Components.EventCallbackWorkItem.InvokeAsync[Object](MulticastDelegate delegate, Object arg)
   at Microsoft.AspNetCore.Components.EventCallbackWorkItem.InvokeAsync(Object arg)
   at Microsoft.AspNetCore.Components.ComponentBase.Microsoft.AspNetCore.Components.IHandleEvent.HandleEventAsync(EventCallbackWorkItem callback, Object arg)
   at Microsoft.AspNetCore.Components.EventCallback.InvokeAsync(Object arg)
   at Microsoft.AspNetCore.Components.RenderTree.Renderer.DispatchEventAsync(UInt64 eventHandlerId, EventFieldInfo fieldInfo, EventArgs eventArgs)

I have tried with Edge, Firefox and Chrome with a similar result.

Urgency

No response

Platform

Windows

OS Version

10

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.14.1

ONNX Runtime API

C#

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response

orponce avatar Mar 08 '23 16:03 orponce

The type initializer for 'Microsoft.ML.OnnxRuntime.NativeMethods' threw an exception. ---> System.DllNotFoundException: onnxruntime

The onnxruntime.dll from the Microsoft.ML.OnnxRuntime nuget package doesn't appear to be in a location where it can be loaded.

The Microsoft.ML.OnnxRuntime.Managed nuget package contains the ONNX Runtime C# bindings and calls the native code in onnxruntime.dll. The namespace in Microsoft.ML.OnnxRuntime.Managed is also called Microsoft.ML.OnnxRuntime which makes things slightly confusing.

The error is coming from the C# bindings class Microsoft.ML.OnnxRuntime.NativeMethods when it tries to setup the infrastructure to call the native code. This involves onnxruntime.dll needing to be loaded by the .net infrastructure.
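For illustration, the failure mode can be reproduced with a minimal P/Invoke sketch (simplified, not the actual ONNX Runtime source): the managed bindings declare entry points against the native library name "onnxruntime", and the load fails when the runtime cannot locate the corresponding native binary:

```csharp
using System;
using System.Runtime.InteropServices;

// Simplified sketch of what the managed bindings do: declare a P/Invoke
// against the native library name "onnxruntime". If onnxruntime.dll (or the
// platform equivalent) cannot be found when this class is first used, the
// call throws DllNotFoundException, surfaced to the caller as the
// TypeInitializationException seen in the log above.
internal static class NativeMethodsSketch
{
    [DllImport("onnxruntime")]
    internal static extern IntPtr OrtGetApiBase();
}
```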

I'm not familiar enough with how a Blazor project builds to know a) why the dll isn't handled automatically when Microsoft.ML.OnnxRuntime is referenced (it could be that native dlls are ignored), or b) how best to fix it, but I'd suggest looking for info on how to make sure required dlls are included so they can be loaded at runtime.

skottmckay avatar Mar 13 '23 23:03 skottmckay

Is this an actual supported use case, using the .NET ONNX Runtime in WASM projects? I don't think it is. If you want to use it on the web, I guess you would need to add npm to your Blazor project and then use ONNX Runtime Web directly in JS. But at that point you could just create a React app or something like that, as you lose all the advantages of Blazor (coding in C#, for example) anyway: you cannot use any of the .NET parts of ONNX Runtime and need to do all of the inferencing in JS.
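A hedged sketch of the interop approach described above (the JS function name onnxInterop.run and its wrapper around onnxruntime-web are hypothetical, for illustration only):

```csharp
using System.Threading.Tasks;
using Microsoft.JSInterop;

// Hypothetical wrapper: a small JS module (not shown) would load
// onnxruntime-web and expose window.onnxInterop.run(input), which this
// class calls through Blazor's JS interop. All names here are assumptions.
public class OnnxWebInterop
{
    private readonly IJSRuntime _js;

    public OnnxWebInterop(IJSRuntime js) => _js = js;

    // Sends the input tensor data to JS and returns the model output.
    public async Task<float[]> RunAsync(float[] input) =>
        await _js.InvokeAsync<float[]>("onnxInterop.run", (object)input);
}
```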

It would be nice if this could be supported somehow, as then we could write web (WASM) projects fully in C# and still do ML tasks.

SirBitesalot avatar May 03 '23 18:05 SirBitesalot

It's possible to build onnxruntime as a static WASM library and link it as a NativeFileReference. Performance is not great, however.

Object detection for BUS

Blazor WebAssembly: Preprocess: 4,0277, Inference: 6,093, Postprocess: 0,8253
.NET Console: Preprocess: 0,1110885, Inference: 0,1405309, Postprocess: 0,0620846

The web version, tested with the yolov8-onnxruntime-web repo, is considerably faster as well.
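The static-linking approach mentioned above can be sketched as a project-file fragment (the path and file name are placeholders; this assumes onnxruntime was built for WASM with the Emscripten toolchain and that the wasm-tools workload is installed):

```xml
<!-- Hypothetical fragment: link a statically built onnxruntime WASM
     archive into the Blazor WebAssembly app. The path and file name
     are placeholders, not part of the original report. -->
<ItemGroup>
  <NativeFileReference Include="native\libonnxruntime.a" />
</ItemGroup>
```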

b-straub avatar Oct 13 '23 20:10 b-straub

After enabling SIMD I get for WASM: Preprocess: 2,886, Inference: 1,278, Postprocess: 0,6944. Maybe the upcoming thread support will improve this further.
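For reference, a sketch of how SIMD can be toggled for a Blazor WebAssembly build (property availability depends on the .NET SDK version, and later SDKs enable SIMD by default):

```xml
<!-- Sketch: enable WASM SIMD for the Blazor WebAssembly build.
     Whether this property is needed depends on the SDK version. -->
<PropertyGroup>
  <WasmEnableSIMD>true</WasmEnableSIMD>
</PropertyGroup>
```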

b-straub avatar Oct 14 '23 15:10 b-straub

@skottmckay could you potentially take a look at my above issue in the ML.NET repo and help point me in the right direction?

I understand in general that this isn't a supported scenario, but my hope is that it's just not supported yet and something can get implemented. I'm willing to help out where I can.

Eddie-Hartman avatar Aug 28 '24 01:08 Eddie-Hartman

Applying stale label due to no activity in 30 days