
Explicitly set the providers with InferenceSession

fr0gger opened this issue on Feb 20, 2024 · 3 comments

Minor issue when running the code: since ORT 1.9, the providers parameter must be set explicitly when instantiating InferenceSession.

ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)

I resolved the ValueError by explicitly adding the providers parameter in the _init_onnx_session(self) method.
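The fix described above amounts to always passing an explicit providers list. A minimal sketch of that idea (the choose_providers helper is hypothetical and not Magika's actual code; only the providers= argument to InferenceSession comes from this issue):

```python
def choose_providers(available):
    """Prefer CPU-only inference; Magika does not use Azure remote
    inference, so AzureExecutionProvider is never needed."""
    if "CPUExecutionProvider" in available:
        return ["CPUExecutionProvider"]
    # Fall back to whatever providers this ORT build offers.
    return list(available)

# Inside _init_onnx_session, the session would then be created roughly as:
#   import onnxruntime as ort
#   providers = choose_providers(ort.get_available_providers())
#   self._onnx_session = ort.InferenceSession(model_path, providers=providers)
```

ort.get_available_providers() returns the providers enabled in the installed build, so the same code works whether or not AzureExecutionProvider is present.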

fr0gger avatar Feb 20 '24 01:02 fr0gger

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.

google-cla[bot] avatar Feb 20 '24 01:02 google-cla[bot]

Thanks! I'm not an expert on ORT providers. What's AzureExecutionProvider? Any reason not to go with just CPUExecutionProvider? Last question: we have not encountered this problem before; is this triggered by some new version of onnxruntime or so?

reyammer avatar Feb 20 '24 09:02 reyammer

Thanks @reyammer for your quick reply. The error came with the current version when running Magika on Windows. By default the providers list in the code was empty, which triggered the ValueError. The message lists ['AzureExecutionProvider', 'CPUExecutionProvider'] because those are the providers enabled in that ORT build.

Also, the documentation specifies that the providers parameter is required since version 1.10.

https://onnxruntime.ai/docs/api/python/api_summary.html

[screenshot of the onnxruntime API docs noting the providers requirement]

fr0gger avatar Feb 20 '24 23:02 fr0gger

Hi, I work on onnxruntime. Providing an execution provider in the providers list has been optional for the last few versions. Moreover, you never have to specify "AzureExecutionProvider" unless you're using remote inferencing capabilities.

PS C:\Users\prs> python
Python 3.10.11 (tags/v3.10.11:7d4cc5a, Apr  5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import onnxruntime as ort
>>> s = ort.InferenceSession(r"C:\Users\prs\work_projects\onnxruntime\onnxruntime\test\testdata\mul_1.onnx")
>>>

pranavsharma avatar Feb 21 '24 02:02 pranavsharma

Thank you both for chipping in!

From the snippet of the docs above, I read "Running on CPU is the only time the API allows no explicit setting of the provider parameter."... which is exactly our case? (I don't have tests for GPUs, never really tried it out).

But I still don't get the source of this bug. It seems very Windows-specific? And if yes, how so? The version of onnxruntime you get on Windows should be the same one we are testing for Linux-based distros?

reyammer avatar Feb 21 '24 08:02 reyammer

From the snippet of the docs above, I read "Running on CPU is the only time the API allows no explicit setting of the provider parameter."... which is exactly our case? (I don't have tests for GPUs, never really tried it out).

Yes, this is correct. If you don't require a GPU, you don't have to specify any provider.

But I still don't get the source of this bug. It seems very Windows-specific? And if yes, how so? The version of onnxruntime you get on Windows should be the same one we are testing for Linux-based distros?

The code is exactly the same on both Windows and Linux unless you have different versions of ORT on them.

pranavsharma avatar Feb 21 '24 08:02 pranavsharma

Thanks @pranavsharma! Then let's wait for @fr0gger for more input.

reyammer avatar Feb 21 '24 11:02 reyammer

Thanks guys for looking into this! I'm not exactly sure about the root of the issue, but it might have something to do with the Python version I'm using, which is 3.11.8?

To give you a clearer picture, here's what happens when I don't specify the providers:

PS C:\Users\Documents> python
Python 3.11.8 (tags/v3.11.8:db85d51, Feb  6 2024, 22:03:32) [MSC v.1937 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import onnxruntime as ort
>>> s = ort.InferenceSession(r"mul_1.onnx")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\AppData\Local\Programs\Python\Python311\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 432, in __init__
    raise e
  File "C:\Users\AppData\Local\Programs\Python\Python311\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "C:\Users\AppData\Local\Programs\Python\Python311\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 451, in _create_inference_session
    raise ValueError(
ValueError: This ORT build has ['AzureExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['AzureExecutionProvider', 'CPUExecutionProvider'], ...)
>>>

Here is what happens when I specify the providers:

PS C:\Users\Documents> python
Python 3.11.8 (tags/v3.11.8:db85d51, Feb  6 2024, 22:03:32) [MSC v.1937 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import onnxruntime as ort
>>> s = ort.InferenceSession(r"model.onnx", providers=['AzureExecutionProvider', 'CPUExecutionProvider'])
>>> 

I also tried with only 'CPUExecutionProvider', which might be better since Magika does not use Azure:

>>> s = ort.InferenceSession(r"model.onnx", providers=['CPUExecutionProvider'])
>>>

Let me know if you can reproduce; I will remove the Azure provider. :)

fr0gger avatar Feb 21 '24 23:02 fr0gger

It's ok to use just the CPUExecutionProvider in the list. I assume you have an older version of ORT than I do. I'm running with ORT 1.17.0.

pranavsharma avatar Feb 21 '24 23:02 pranavsharma
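The version dependence reported in this thread (the ValueError on 1.16.0 but not on 1.17.0) can be captured in a small guard. This is only a sketch inferred from the versions observed here, not from ORT's changelog, and passing an explicit providers list is harmless on every version anyway:

```python
def needs_explicit_providers(ort_version):
    """Heuristic based on this thread: ORT 1.16.0 raised a ValueError
    when no providers list was given, while 1.17.0 did not. Compares
    only the major and minor version components."""
    major, minor = (int(part) for part in ort_version.split(".")[:2])
    return (major, minor) < (1, 17)
```

For example, needs_explicit_providers("1.16.0") is True, while needs_explicit_providers("1.17.0") is False.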

Perfect! You are right, I am using version 1.16.0.

fr0gger avatar Feb 21 '24 23:02 fr0gger

Thanks! In the meantime we have bumped the onnxruntime version for the Python package, but being explicit about the execution provider seems like a good idea regardless. Merged, thank you!

reyammer avatar Feb 22 '24 13:02 reyammer