How to convert last.pt model to .onnx model
I executed:

    img = torch.ones((1, 3, 300, 17))
    torch.onnx.export(model, img, 'action.onnx', verbose=False, opset_version=12,
                      input_names=['images'], output_names=['output'])

Then I got the following error:

    /code/mmskeleton/mmskeleton/ops/st_gcn/gconv_origin.py:57: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
      assert A.size(0) == self.kernel_size
    Traceback (most recent call last):
      File "/usr/local/bin/mmskl", line 7, in <module>
        exec(compile(f.read(), __file__, 'exec'))
      File "/code/mmskeleton/tools/mmskl", line 123, in <module>
        main()
      File "/code/mmskeleton/tools/mmskl", line 117, in main
        call_obj(**cfg.processor_cfg)
      File "/code/mmskeleton/mmskeleton/utils/importer.py", line 24, in call_obj
        return import_obj(type)(**kwargs)
      File "/code/mmskeleton/mmskeleton/processor/recognition.py", line 46, in test
        output_names=['output'])
      File "/usr/local/lib/python3.6/dist-packages/torch/onnx/__init__.py", line 132, in export
        strip_doc_string, dynamic_axes)
      File "/usr/local/lib/python3.6/dist-packages/torch/onnx/utils.py", line 64, in export
        example_outputs=example_outputs, strip_doc_string=strip_doc_string, dynamic_axes=dynamic_axes)
      File "/usr/local/lib/python3.6/dist-packages/torch/onnx/utils.py", line 329, in _export
        _retain_param_name, do_constant_folding)
      File "/usr/local/lib/python3.6/dist-packages/torch/onnx/utils.py", line 225, in _model_to_graph
        _disable_torch_constant_prop=_disable_torch_constant_prop)
      File "/usr/local/lib/python3.6/dist-packages/torch/onnx/utils.py", line 127, in _optimize_graph
        graph = torch._C._jit_pass_onnx(graph, operator_export_type)
      File "/usr/local/lib/python3.6/dist-packages/torch/onnx/__init__.py", line 163, in _run_symbolic_function
        return utils._run_symbolic_function(*args, **kwargs)
      File "/usr/local/lib/python3.6/dist-packages/torch/onnx/utils.py", line 577, in _run_symbolic_function
        n.kindOf("value")))
    RuntimeError: Unsupported prim::Constant kind: `s`. Send a bug report.

Could you give any advice on it? Thank you.
It occurred because torch.einsum was not supported for export as an ONNX op.
How to solve this problem?
How do you solve this problem?
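As noted above, the failure comes from torch.einsum, which the ONNX exporter in this PyTorch version cannot translate. A possible workaround, not an official fix, is to rewrite the einsum in the graph convolution of mmskeleton/ops/st_gcn/gconv_origin.py as an equivalent reshape/matmul, which the tracer can export. The sketch below assumes the einsum has the usual ST-GCN form torch.einsum('nkctv,kvw->nctw', (x, A)); the function name einsum_free_gconv is only for illustration.

    import torch

    def einsum_free_gconv(x, A):
        # Equivalent of torch.einsum('nkctv,kvw->nctw', (x, A)) written with
        # reshape/matmul so that older torch.onnx exporters can handle it.
        # Assumed shapes (usual ST-GCN layout): x is (N, K, C, T, V), A is (K, V, W).
        n, k, c, t, v = x.size()
        w = A.size(-1)
        x = x.reshape(n, k, c * t, v)      # (N, K, C*T, V)
        out = torch.matmul(x, A)           # A broadcasts over N -> (N, K, C*T, W)
        out = out.sum(dim=1)               # sum over the kernel dimension K
        return out.reshape(n, c, t, w)     # (N, C, T, W)

    # Quick numerical check against the original einsum
    x = torch.randn(2, 3, 4, 5, 6)         # (N, K, C, T, V)
    A = torch.randn(3, 6, 6)               # (K, V, W)
    ref = torch.einsum('nkctv,kvw->nctw', (x, A))
    assert torch.allclose(ref, einsum_free_gconv(x, A), atol=1e-5)

If the numerical check passes, the same reshape/matmul expression can replace the einsum call in the forward method of gconv_origin.py before running torch.onnx.export again. Alternatively, newer PyTorch releases added ONNX support for einsum at opset 12 and above, so upgrading PyTorch may make the rewrite unnecessary.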