Scott McKay
Awesome. Glad it works. I'll leave this issue open until we figure out a long-term fix with the MAUI folks.
> I'm running into this with a new Maui application. I use multiple projects for the View Models and Views (forms). Nothing fancy, just trying to port a working XAMARIN...
That seems like a different issue. The linker problem in the original issue was that CoreML was not included and the ORT library required it. The linker not being able...
Will be fixed in the 1.13 release so closing this issue.
You could take a look at the example Android app we have to see what differences there are between your setup and that working one. https://github.com/microsoft/onnxruntime-inference-examples/tree/main/mobile/examples/image_classification/android
It's more about the build setup than the type of app. In either case the linker needs to be able to find the onnxruntime library. The error you're getting from...
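For context, this is roughly how the onnxruntime dependency is usually pulled into a Gradle-based Android app so the native library can be found; this is a minimal sketch, and the version number here is just an example, not a recommendation:

```kotlin
// app/build.gradle.kts — assumes a Gradle Kotlin DSL Android project
dependencies {
    // The onnxruntime-android AAR bundles libonnxruntime.so for each supported ABI,
    // so the app can locate and load the native library without extra linker setup.
    // 1.13.1 is an example version; use the release that matches your project.
    implementation("com.microsoft.onnxruntime:onnxruntime-android:1.13.1")
}
```

If the library still can't be found at runtime, it's worth checking that the app's ABI filters aren't excluding the architectures shipped in the AAR.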
Assuming the issue was due to the wrong binary being referenced. Please re-open if the problem persists.
/azp run Windows CPU CI Pipeline,Windows GPU CI Pipeline,Windows GPU TensorRT CI Pipeline,ONNX Runtime Web CI Pipeline,Linux CPU CI Pipeline,Linux CPU Minimal Build E2E CI Pipeline,Linux GPU CI Pipeline,Linux GPU...
/azp run MacOS CI Pipeline,orttraining-amd-gpu-ci-pipeline,orttraining-linux-ci-pipeline,orttraining-linux-gpu-ci-pipeline,orttraining-ortmodule-distributed,onnxruntime-python-checks-ci-pipeline,onnxruntime-binary-size-checks-ci-pipeline
Is there doco somewhere on how/when this workflow runs? I'm not sure whether it runs automatically or whether it would be part of the release process.