Hi @ScottishFold007! Were you able to get better results?
Hi @tomaarsen, I would like to collaborate on this issue.
Hello @tomaarsen. I have been independently working on converting span_marker models to the ONNX format and have started it on a new branch. I would like to share the...
Great! I will push the branch this weekend as soon as I can.
Now running `onnx_implementation.py` works. The ONNX validation passes with an error tolerance of less than 0.0001.
```
================ Diagnostic Run torch.onnx.export version 2.0.1 ================
verbose: False, log level: Level.ERROR...
```
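For reference, a minimal sketch of the kind of check this validation performs, assuming a hypothetical exported file `span_marker.onnx`, an already-loaded torch `model`, and illustrative input names (not the actual SpanMarker export signature):
```python
# Sketch of an ONNX-vs-torch numerical check (names and shapes are illustrative).
import numpy as np
import onnxruntime as ort
import torch

# Hypothetical inputs; the real SpanMarker forward takes more tensors than this.
input_ids = torch.randint(0, 1000, (1, 128), dtype=torch.long)
attention_mask = torch.ones_like(input_ids)

# Reference output from the original torch model (assumed already loaded as `model`).
with torch.no_grad():
    torch_logits = model(input_ids=input_ids, attention_mask=attention_mask).logits

# Output from the exported ONNX graph.
session = ort.InferenceSession("span_marker.onnx")
onnx_logits = session.run(
    None,
    {"input_ids": input_ids.numpy(), "attention_mask": attention_mask.numpy()},
)[0]

# Validation passes when both outputs agree within the 1e-4 tolerance mentioned above.
assert np.allclose(torch_logits.numpy(), onnx_logits, atol=1e-4)
```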
Given that `num_words`, `num_marker_pairs`, etc. are not computed by the ONNX graph, I don't think this affects the performance, but I will run some tests just in...
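Something along these lines is what I have in mind for the test, heavily hedged: the checkpoint is only a placeholder, the `SpanMarkerOnnxPipeline` constructor and call signature shown here are assumptions about the draft class, and only the spans and labels are compared since the auxiliary fields are missing from the graph:
```python
# Sketch of a consistency test between the torch model and the ONNX-backed pipeline.
from span_marker import SpanMarkerModel

texts = ["Amelia Earhart flew her single engine Lockheed Vega 5B across the Atlantic."]

# Reference predictions from the original torch model (placeholder checkpoint).
torch_model = SpanMarkerModel.from_pretrained("tomaarsen/span-marker-mbert-base-multinerd")
torch_entities = torch_model.predict(texts)

# Predictions from the draft ONNX pipeline (hypothetical constructor and call).
onnx_pipeline = SpanMarkerOnnxPipeline("span_marker.onnx")  # assumed interface
onnx_entities = onnx_pipeline(texts)

def spans_and_labels(entities):
    # Keep only the fields relevant to correctness; scores may differ by <1e-4,
    # and num_words / num_marker_pairs are not produced by the ONNX graph at all.
    return [[(e["span"], e["label"]) for e in doc] for doc in entities]

assert spans_and_labels(torch_entities) == spans_and_labels(onnx_entities)
```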
The first draft of the SpanMarkerOnnxPipeline is ready. I am facing issues with the batch size of the ONNX input. For some reason the ONNX model only processes one batch...
Hmm, I have found a similar issue here: https://discuss.pytorch.org/t/dynamic-axes-doesnt-work-for-torch-onnx-export-when-torch-cat-is-present/149501. We do indeed use `torch.cat` inside the `forward` function.
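For context, this is roughly how the batch dimension gets declared at export time; the input names, shapes, and `model` variable are illustrative rather than the actual SpanMarker export code, and per the thread linked above, the declared dynamic batch axis can still end up fixed when `torch.cat` appears in the traced `forward`:
```python
# Sketch of exporting with a dynamic batch axis (illustrative input names and shapes).
import torch

dummy_input_ids = torch.randint(0, 1000, (2, 128), dtype=torch.long)
dummy_attention_mask = torch.ones_like(dummy_input_ids)

torch.onnx.export(
    model,  # assumed already-loaded torch model
    (dummy_input_ids, dummy_attention_mask),
    "span_marker.onnx",
    input_names=["input_ids", "attention_mask"],
    output_names=["logits"],
    # Mark dimension 0 as dynamic so the exported graph should accept any batch size.
    dynamic_axes={
        "input_ids": {0: "batch_size", 1: "sequence_length"},
        "attention_mask": {0: "batch_size", 1: "sequence_length"},
        "logits": {0: "batch_size"},
    },
    opset_version=13,
)
```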
Hello @tomaarsen. I have figured out how to make it work. Tomorrow I will share with you how to generate the ONNX models. Now I am facing the challenge of...
Here I have a first solution; however, it doesn't improve the inference time compared with the original torch model. As we discussed previously regarding that "ugly" for-loop, we were...
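For what it's worth, this is a minimal sketch of the timing comparison I am running; the model path, input names, and the already-loaded `model` are assumptions, not the actual pipeline code:
```python
# Rough wall-clock comparison between the torch model and the ONNX session (sketch only).
import time

import onnxruntime as ort
import torch

input_ids = torch.randint(0, 1000, (8, 128), dtype=torch.long)
attention_mask = torch.ones_like(input_ids)
session = ort.InferenceSession("span_marker.onnx")  # assumed exported model path

def run_torch():
    with torch.no_grad():
        model(input_ids=input_ids, attention_mask=attention_mask)

def run_onnx():
    session.run(
        None,
        {"input_ids": input_ids.numpy(), "attention_mask": attention_mask.numpy()},
    )

def time_it(fn, repeats=20):
    # Warm up once, then average the wall-clock time over several runs.
    fn()
    start = time.perf_counter()
    for _ in range(repeats):
        fn()
    return (time.perf_counter() - start) / repeats

print(f"torch: {time_it(run_torch):.4f}s  onnx: {time_it(run_onnx):.4f}s")
```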