SpanMarkerNER
SpanMarker with ONNX models
Hi @tomaarsen! Is an ONNX exporter planned? Have you tried using SpanMarker with ONNX models for inference? Would be really curious if you have experimented with that already! :-)
Hello!
I have done a very quick experiment to try to export SpanMarker to ONNX, but I got some incomprehensible errors. I don't have enough experience with ONNX at the moment to quickly create such an exporter.
- Tom Aarsen
Hi @tomaarsen, I would like to collaborate on this issue.
That would be awesome! I'm open to PRs on the matter.
This would indeed be a nice feature to add. We export all our models to ONNX before deploying and this is unfortunately not currently possible with SpanMarker.
Keep up the good work!
@tomaarsen, can you upload an ONNX version of SpanMarker?
I'm afraid I haven't been able to convert SpanMarker models to ONNX yet.
Hello @tomaarsen. I am independently working on converting span_marker models to the ONNX format, and I have started on a new branch. I would like to share the results to see if we can make progress on it. How would you like to proceed?
Hello!
Awesome! I'd love to get ONNX support for SpanMarker somehow.
You can fork the repository and push your branch there. Then, you can open a draft pull request from your branch on your fork into the main branch of this repository, and we'll be able to discuss there, look at results, etc. GitHub Actions will then automatically run the tests on your branch to make sure everything is working well.
Does that sound good?
- Tom Aarsen
Great!
I will push the branch this weekend as soon as I can.
ONNX support would be amazing! One can also quantize the models for further inference speed optimization once the base models are converted to ONNX. It is essentially 5 lines of code from ONNX to quantized ONNX.
@tomaarsen @polodealvarado Is the ONNX implementation done? How can the models be loaded with ONNX for faster inference? Can you please help here?