Add mono support
From @alexanderkyte on Apr 27, 2018, 10:58 AM PDT
In order for our models to be useful on mobile platforms, we're going to need to get this working on mono. It'll probably simply be some infrastructure work.
I can address it, as I'm a mono runtime engineer.
Currently on backlog / low-priority
From @eerhardt on Apr 27, 2018, 12:41 PM PDT
Do you know what doesn't work on mono? We are currently producing netstandard2.0
libraries, which should run on any modern .NET platform.
From @alexanderkyte on Apr 30, 2018, 9:57 AM PDT
It's a bit of build hard-coding right now. I also need to check that Mono is green when doing this with AOT and on the mobile/restricted platforms.
From @eerhardt on Apr 30, 2018, 11:18 AM PDT
Today, we only support x64 architectures (since we are using some SIMD instructions). If you want this to run on mobile, I think it may be considerable work to support ARM/ARM64.
@alexanderkyte is there anything that precludes me from running this on Mono on a Linux x64 machine? Given that I run in JIT mode, and that Mono happens to support System.Numerics.Vectors now, I think it should work OK, but I'm not entirely sure.
@kumpera
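One quick way to see whether the runtime you are on (CoreCLR or Mono) actually hardware-accelerates System.Numerics vectors is a small probe like the sketch below. This is purely illustrative and not part of ML.NET itself.

```csharp
// Minimal probe: report whether System.Numerics vectors are
// hardware-accelerated on the current runtime, and how wide they are.
using System;
using System.Numerics;

class VectorCheck
{
    static void Main()
    {
        Console.WriteLine($"Vector.IsHardwareAccelerated: {Vector.IsHardwareAccelerated}");
        Console.WriteLine($"Vector<float>.Count: {Vector<float>.Count}");
    }
}
```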
Right now, Mono seems perfectly capable of running binaries produced from this repo. The build infrastructure is very closely wedded to .NET Core, though. We fail to AOT the codegen functions, but that's to be expected. Everything else in inference comes through fine. You have to make sure Newtonsoft.Json is on the MONO_PATH, though. This will probably be better to work with in a statically linked, AOT context.
I believe that ARM support is strategic from now on. Mobile and embedded are just the most obvious examples, but the growing demand for ARM servers (due to very low power consumption motherboards) is another example of what will happen soon (IMHO). On my side, I would be perfectly happy if there were ARM libraries for .NET Core 2.1.
> Today, we only support x64 architectures
That seems rather problematic for the new ARM64 devices, which will only run x86, ARMv7, and ARMv8 but won't be able to run x64 apps. UWP support would be good as well.
My main use case is running on ARM (Raspberry Pi 3 or similar) on Linux in .NET Core 2.1. I also have a UWP IoT Core ARM use case. So just another vote for ARM support.
I was not able to run a .NET Core-compiled model under Mono. Is that expected to work?
DRI RESPONSE: moving to backlog (skip triage) since it requires support for multiple platforms (x86, ARM).
I totally agree with Rafaelle, hope to see that soon.
Any new information about supporting ARM32 and ARM64?
Are you looking to use ML.NET in a mobile application with Xamarin? Or on a Linux/Windows ARM32/ARM64 device?
With the latter, you are able to run ML.NET on ARM devices with .NET Core 3.1 for a few algorithms.
However, the ones that don't work are the ones that use C/C++ code, since we are not compiling those assemblies for ARM. For example, the following won't work on ARM today (a managed-only sketch of what does work follows this list):
- LightGBM
- FastTree
- Anything in Mkl.Components
- TensorFlow
- Onnx
- LatentDirichletAllocation
- MatrixFactorization
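To make the managed-only path concrete, here is a minimal sketch of a pipeline that avoids the native components listed above and so should run on ARM with .NET Core 3.1. The data class, column names, and sample values are made up for this example.

```csharp
using System;
using Microsoft.ML;
using Microsoft.ML.Data;

// Illustrative input/output types; names and values are made up for this sketch.
public class HousingData
{
    public float Size { get; set; }
    public float Rooms { get; set; }
    public float Price { get; set; }
}

public class HousingPrediction
{
    [ColumnName("Score")]
    public float Price { get; set; }
}

class Program
{
    static void Main()
    {
        var mlContext = new MLContext();

        var data = mlContext.Data.LoadFromEnumerable(new[]
        {
            new HousingData { Size = 60,  Rooms = 2, Price = 200_000 },
            new HousingData { Size = 90,  Rooms = 3, Price = 310_000 },
            new HousingData { Size = 120, Rooms = 4, Price = 405_000 },
        });

        // SDCA stays off the native components called out above
        // (LightGBM, FastTree, MKL, TensorFlow, ONNX); on newer .NET
        // targets its math runs through the managed CpuMath path.
        var pipeline = mlContext.Transforms
            .Concatenate("Features", nameof(HousingData.Size), nameof(HousingData.Rooms))
            .Append(mlContext.Regression.Trainers.Sdca(labelColumnName: nameof(HousingData.Price)));

        var model = pipeline.Fit(data);

        var engine = mlContext.Model.CreatePredictionEngine<HousingData, HousingPrediction>(model);
        var prediction = engine.Predict(new HousingData { Size = 100, Rooms = 3 });
        Console.WriteLine($"Predicted price: {prediction.Price}");
    }
}
```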
> However, the ones that don't work are the ones that use C/C++ code since we are not compiling those assemblies for ARM. For example, the following won't work on ARM today: [...]
Could you please add this information (and keep it updated) in the readme or in a separate document? One of the difficulties is understanding what can be used and what cannot.
Hi @eerhardt, thanks for the reply. I mean ML.NET on Linux (e.g. Raspberry Pi, NXP i.MX6/7/8) and Windows 10 IoT Core (e.g. Dragonboard 410c, NXP i.MX6/7/8) ARM32 and ARM64 devices. Any new documents about those are appreciated.
Hi, are there any plans to expand the set of algorithms supported on ARM devices? I'm trying to use a trained model (Sdca) to predict on a phone using Xamarin, but I'm running into the missing CpuMathNative error. Thanks!
cc @ericstj
Is there any news?
Thanks all for the discussion. Closing this issue since ARM64/M1 support has been available since June 2021.
https://devblogs.microsoft.com/dotnet/ml-net-june-updates-model-builder/#ml-net-on-arm
In addition, moving to the latest versions of .NET (.NET 5 or greater), there shouldn't be issues with CpuMathNative.
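For reference, the consumption scenario discussed above (scoring a trained model on an ARM or mobile device) only needs the standard model-load path. A minimal sketch follows, assuming a regression model saved as model.zip and illustrative input/output types; adjust both to match the schema your model was actually trained with.

```csharp
using System;
using Microsoft.ML;
using Microsoft.ML.Data;

// Illustrative types; adjust to match the schema the model was trained with.
public class ModelInput
{
    public float Size { get; set; }
    public float Rooms { get; set; }
}

public class ModelOutput
{
    [ColumnName("Score")]
    public float Score { get; set; }
}

class Scorer
{
    static void Main()
    {
        var mlContext = new MLContext();

        // Load a model trained elsewhere (the path is illustrative).
        ITransformer model = mlContext.Model.Load("model.zip", out DataViewSchema inputSchema);

        // PredictionEngine is convenient for single, on-device predictions;
        // it is not thread-safe, so pool it in server scenarios.
        var engine = mlContext.Model.CreatePredictionEngine<ModelInput, ModelOutput>(model);

        var result = engine.Predict(new ModelInput { Size = 100, Rooms = 3 });
        Console.WriteLine($"Score: {result.Score}");
    }
}
```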