oemer
Unable to compute the prediction using a neural network model.
Describe the bug
Hi - fresh install crashes here. (Had to force numpy=1.26.4 due to what appears to be a bug in the newly released 2.0.0). Why trying to OCR a png:
Running OSX 14.0 on an M2 Macbook Air.
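For reference, the numpy pin was applied roughly like this (commands reconstructed from memory, so treat them as approximate rather than an exact transcript):

% pip install oemer
% pip install "numpy==1.26.4"   # downgrade away from the freshly released 2.0.0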
Input Image
Full Traceback
% oemer -d songtosing.png
2024-06-23 13:45:42 Extracting staffline and symbols
/Users/peterehrlich/projects/omr/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:69: UserWarning: Specified provider 'CUDAExecutionProvider' is not in available provider names.Available providers: 'CoreMLExecutionProvider, AzureExecutionProvider, CPUExecutionProvider'
warnings.warn(
2024-06-23 13:45:42.915652 [W:onnxruntime:, coreml_execution_provider.cc:104 GetCapability] CoreMLExecutionProvider::GetCapability, number of partitions supported by CoreML: 86 number of nodes in the graph: 1577 number of nodes supported by CoreML: 102
1906 1928
Context leak detected, msgtracer returned -1
Context leak detected, msgtracer returned -1
2024-06-23 13:46:23.331865 [E:onnxruntime:, sequential_executor.cc:516 ExecuteKernel] Non-zero status code returned while running CoreML_18203898718667227551_0 node. Name:'CoreMLExecutionProvider_CoreML_18203898718667227551_0_0' Status Message: Error executing model: Unable to compute the prediction using a neural network model. It can be an invalid input data or broken/unsupported model (error code: -1).
Traceback (most recent call last):
File "/Users/peterehrlich/projects/omr/bin/oemer", line 8, in <module>
sys.exit(main())
^^^^^^
File "/Users/peterehrlich/projects/omr/lib/python3.12/site-packages/oemer/ete.py", line 276, in main
mxl_path = extract(args)
^^^^^^^^^^^^^
File "/Users/peterehrlich/projects/omr/lib/python3.12/site-packages/oemer/ete.py", line 127, in extract
staff, symbols, stems_rests, notehead, clefs_keys = generate_pred(str(img_path), use_tf=args.use_tf)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/peterehrlich/projects/omr/lib/python3.12/site-packages/oemer/ete.py", line 47, in generate_pred
staff_symbols_map, _ = inference(
^^^^^^^^^^
File "/Users/peterehrlich/projects/omr/lib/python3.12/site-packages/oemer/inference.py", line 69, in inference
out = model.predict(batch) if use_tf else sess.run(output_names, {'input': batch})[0]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/peterehrlich/projects/omr/lib/python3.12/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 220, in run
return self._sess.run(output_names, input_feed, run_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Command You Execute
oemer -d songtosing.png
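A possible triage step, sketched below. This is not oemer's actual model-loading code and the model path is a placeholder; the idea is simply to build the onnxruntime session with only the CPUExecutionProvider so the CoreML partitions that fail above are never created, and see whether inference then completes.

import numpy as np
import onnxruntime as ort

# Placeholder path: oemer resolves its bundled checkpoints internally; this is only a sketch.
MODEL_PATH = "model.onnx"

# On this machine: CoreMLExecutionProvider, AzureExecutionProvider, CPUExecutionProvider
print(ort.get_available_providers())

# CPU-only session: the CoreML partitions reported in the traceback are never created.
sess = ort.InferenceSession(MODEL_PATH, providers=["CPUExecutionProvider"])

inp = sess.get_inputs()[0]
# Replace any symbolic/dynamic dimension with 1 just to get a runnable dummy batch;
# a real check would feed the same preprocessed image windows oemer produces.
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
batch = np.zeros(shape, dtype=np.float32)

out = sess.run(None, {inp.name: batch})[0]
print(out.shape)

If a CPU-only run like this succeeds on the same input, that would point at the CoreMLExecutionProvider partitioning rather than the image itself.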