
ONNX Runtime error: input_shape_size == size was false

Open gsasikiran opened this issue 8 months ago • 9 comments

When I try to load and run the ONNX model, I get the error message below. I converted the model to ONNX with the code from https://github.com/urchade/GLiNER/blob/main/examples/convert_to_onnx.ipynb.

This is how I am loading the model:

model = GLiNER.from_pretrained("gliner_multi-v2.1", load_onnx_model=True, load_tokenizer=True)

2024-06-28 10:07:16.986464308 [E:onnxruntime:, sequential_executor.cc:516 ExecuteKernel] Non-zero status code returned while running Reshape node. Name:'/span_rep_layer/span_rep_layer/Reshape_2' Status Message: /onnxruntime_src/onnxruntime/core/providers/cpu/tensor/reshape_helper.h:45 onnxruntime::ReshapeHelper::ReshapeHelper(const onnxruntime::TensorShape&, onnxruntime::TensorShapeVector&, bool) input_shape_size == size was false. The input tensor cannot be reshaped to the requested shape. Input shape:{1,36,512}, requested shape:{1,17,12,512}

Traceback (most recent call last):
  File "/home/project_path/src/extract_locations.py", line 138, in extract_location
    entities = model.predict_entities(normalized_phrase.lower(), labels, threshold=0.3)
  File "/home/project_path/lib/python3.11/site-packages/gliner/model.py", line 176, in predict_entities
    return self.batch_predict_entities(
  File "/home/project_path/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/project_path/lib/python3.11/site-packages/gliner/model.py", line 198, in batch_predict_entities
    model_output = self.model(**model_input)[0]
  File "/home/project_path/lib/python3.11/site-packages/gliner/onnx/model.py", line 59, in __call__
    return self.forward(*args, **kwargs)
  File "/home/project_path/lib/python3.11/site-packages/gliner/onnx/model.py", line 87, in forward
    inference_output = self.run_inference(prepared_inputs)
  File "/home/project_path/lib/python3.11/site-packages/gliner/onnx/model.py", line 47, in run_inference
    onnx_outputs = self.session.run(None, inputs)
  File "/home/user/anaconda3/envs/query-intent-recognition/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 220, in run
    return self._sess.run(output_names, input_feed, run_options)
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero status code returned while running Reshape node. Name:'/span_rep_layer/span_rep_layer/Reshape_2' Status Message: /onnxruntime_src/onnxruntime/core/providers/cpu/tensor/reshape_helper.h:45 onnxruntime::ReshapeHelper::ReshapeHelper(const onnxruntime::TensorShape&, onnxruntime::TensorShapeVector&, bool) input_shape_size == size was false. The input tensor cannot be reshaped to the requested shape. Input shape:{1,36,512}, requested shape:{1,17,12,512}
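The shape numbers in the message already explain the failure: a reshape can only succeed when the total element counts match, and here they do not. A minimal NumPy sketch of the same mismatch, using the shapes taken from the error above (NumPy stands in for ONNX Runtime's Reshape kernel; this is an illustration, not GLiNER code):

```python
import numpy as np

# Element counts for the two shapes in the error message.
actual = 1 * 36 * 512           # input tensor {1, 36, 512}  -> 18432 elements
requested = 1 * 17 * 12 * 512   # requested {1, 17, 12, 512} -> 104448 elements
print(actual, requested)        # the counts differ, so no reshape can succeed

# NumPy rejects the same reshape for the same reason.
x = np.zeros((1, 36, 512))
try:
    x.reshape(1, 17, 12, 512)
except ValueError as e:
    print(e)  # cannot reshape an array of size 18432 into that shape
```

This suggests the exported ONNX graph baked in shape constants from the example input used during conversion (e.g. a fixed sequence length or span count), so inputs of a different length break the hard-coded Reshape.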

gsasikiran · Jun 28 '24 08:06