SpanMarker with ONNX models

Ulipenitz opened this issue 1 year ago • 11 comments

Hi @tomaarsen! Is an ONNX exporter planned? Have you tried using SpanMarker with ONNX models for inference? I would be really curious if you have experimented with that already! :-)

Ulipenitz avatar Aug 15 '23 15:08 Ulipenitz

Hello!

I have done a very quick experiment to try to export SpanMarker to ONNX, but I got some incomprehensible errors. I don't have enough experience with ONNX at the moment to quickly create such an exporter.
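
For anyone who wants to reproduce that experiment, a naive export attempt looks roughly like the sketch below. The dummy input names and shapes are assumptions; SpanMarker's real forward() expects additional marker-related arguments beyond a plain input_ids/attention_mask pair, which is exactly where a straightforward torch.onnx.export call tends to break down.

```python
# Minimal sketch of a naive ONNX export attempt (not a working exporter).
# The dummy inputs and their shapes are assumptions; SpanMarker's forward()
# expects additional marker-related arguments beyond these.
import torch
from span_marker import SpanMarkerModel

model = SpanMarkerModel.from_pretrained(
    "tomaarsen/span-marker-bert-base-fewnerd-fine-super"  # any pretrained SpanMarker checkpoint
)
model.eval()

dummy_input_ids = torch.ones(1, 256, dtype=torch.long)
dummy_attention_mask = torch.ones(1, 256, dtype=torch.long)

torch.onnx.export(
    model,
    (dummy_input_ids, dummy_attention_mask),
    "span_marker.onnx",
    input_names=["input_ids", "attention_mask"],
    output_names=["logits"],
    dynamic_axes={
        "input_ids": {0: "batch", 1: "sequence"},
        "attention_mask": {0: "batch", 1: "sequence"},
    },
    opset_version=17,
)
```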

  • Tom Aarsen

tomaarsen avatar Aug 15 '23 15:08 tomaarsen

Hi @tomaarsen, I would like to collaborate on this issue.

polodealvarado avatar Aug 22 '23 17:08 polodealvarado

That would be awesome! I'm open to PRs on the matter.

tomaarsen avatar Aug 24 '23 08:08 tomaarsen

This would indeed be a nice feature to add. We export all our models to ONNX before deploying, and this is unfortunately not currently possible with SpanMarker.

Keep up the good work!

dbuades avatar Oct 31 '23 21:10 dbuades

@tomaarsen, can you upload an ONNX version of SpanMarker?

abhayalok avatar Nov 01 '23 07:11 abhayalok

I'm afraid I haven't been able to convert SpanMarker models to ONNX yet.

tomaarsen avatar Nov 01 '23 16:11 tomaarsen

Hello @tomaarsen. I am independently working on converting span_marker models to the ONNX format and have started on a new branch. I would like to share the results to see if we can make progress on it. How would you like to proceed?

polodealvarado avatar Nov 08 '23 17:11 polodealvarado

Hello!

Awesome! I'd love to get ONNX support for SpanMarker somehow. You can fork the repository and push your branch there. Then you can open a draft pull request from your fork into the main branch of this repository, and we'll be able to discuss there, look at results, etc. GitHub Actions will then automatically run the tests from your branch to make sure everything is working well. Does that sound good?

  • Tom Aarsen

tomaarsen avatar Nov 08 '23 18:11 tomaarsen

Great!

I will push the branch this weekend as soon as I can.

polodealvarado avatar Nov 10 '23 08:11 polodealvarado

ONNX support would be amazing! One can also quantize the models for further inference speed optimization once the base models are converted to ONNX. It is essentially 5 lines of code from ONNX to quantized ONNX.
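
For illustration, dynamic post-training quantization with onnxruntime is roughly the sketch below; the file names are placeholders, and it assumes an exported span_marker.onnx already exists.

```python
# Minimal sketch: dynamic (post-training) quantization of an exported ONNX model.
# File names are placeholders; an exported span_marker.onnx is assumed to exist.
from onnxruntime.quantization import QuantType, quantize_dynamic

quantize_dynamic(
    model_input="span_marker.onnx",         # existing ONNX export (assumed)
    model_output="span_marker.quant.onnx",  # where the quantized model is written
    weight_type=QuantType.QInt8,            # quantize weights to int8
)
```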

ogencoglu avatar Dec 21 '23 12:12 ogencoglu

@tomaarsen @polodealvarado Is the ONNX implementation done? How can we load the models with ONNX for faster inference? Could you please help here?
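
For reference, once an exported model exists, loading it with onnxruntime would look roughly like the sketch below. The file name and input/output names are assumptions carried over from the export sketch above; as the thread shows, an official export path is not available yet.

```python
# Minimal sketch of running an exported SpanMarker ONNX model with onnxruntime.
# The file name and input names are assumptions; there is no official export yet.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "span_marker.onnx", providers=["CPUExecutionProvider"]
)

# Dummy inputs; the names must match whatever was used at export time.
inputs = {
    "input_ids": np.ones((1, 256), dtype=np.int64),
    "attention_mask": np.ones((1, 256), dtype=np.int64),
}
logits = session.run(None, inputs)[0]
print(logits.shape)
```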

ganga7445 avatar Apr 04 '24 06:04 ganga7445