transformers.js
Support for GLiNER models?
Question
Is there a reason why models from the GLiNER family can't be supported?
I see they use a specialized library; does it take a lot of code to make them work?
Hi there 👋 I looked into this a few months ago, and indeed it's a bit more complicated than most models: https://github.com/urchade/GLiNER/issues/34#issuecomment-2037412469. They do have a conversion script (https://github.com/urchade/GLiNER/blob/main/examples/convert_to_onnx.ipynb), but there's still a lot of work you need to do yourself for pre- and post-processing. Is this something you'd be interested in looking into?
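To give a sense of the pre-processing involved: GLiNER builds a single word-level prompt that prepends the entity labels, separated by special tokens, to the words of the text, and only then tokenizes it. A rough sketch of that step (the `<<ENT>>`/`<<SEP>>` token names come from the Python library and may differ between checkpoints, so treat them as assumptions):

```js
// Hypothetical helper: build the word-level GLiNER prompt before tokenization.
// Format assumed from the Python library: <<ENT>> label ... <<SEP>> word1 word2 ...
function buildGlinerPrompt(labels, words) {
  const prompt = [];
  for (const label of labels) {
    prompt.push('<<ENT>>', label);
  }
  prompt.push('<<SEP>>', ...words);
  return prompt; // still word-level; the tokenizer then maps words to subword ids
}

// Example:
// buildGlinerPrompt(['person', 'company'], ['Bill', 'Gates', 'founded', 'Microsoft', '.'])
// -> ['<<ENT>>', 'person', '<<ENT>>', 'company', '<<SEP>>', 'Bill', 'Gates', 'founded', 'Microsoft', '.']
```

The post-processing is the harder half: the model scores candidate spans against each label, so you need to map subword positions back to words and decode the span scores into entities yourself.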
Ha, I realized I already saw that thread when I first found GLiNER, and I even remember noticing your comment there. But I completely forgot about it, my bad.
> Is this something you'd be interested in looking into?
- I haven't even experimented with the Python version yet; it's on my shortlist of things to test.
- I'm far from an expert with ONNX/WASM stuff, but maybe in the future?
Thank you!
@xenova
I am interested in GLiNER to ONNX and have extensive experience with Transformers.js v3 in CFW. I am using it in ai-research-agent.
See here: https://airesearch.js.org/functions/getEmbeddingModel.html
https://airesearch.js.org/functions/addEmbeddingVectorsToIndex.html
What would be a good strategy to port the GLiNER code? The ONNX model is available, but the "predict_token" pipeline is a bit of a different story. It requires access to the tokenizer (which I think is fine), mapping tokens in and out (since it's bi-directional), and running against the model. I'd love to give this a try, but I'm not sure how to start. @xenova, any pointers? I would love to get this running in the browser.
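For what it's worth, this is roughly where I'd start: load the tokenizer with transformers.js and run the exported model with onnxruntime-web. This is only a sketch under several assumptions (the model was exported with the convert_to_onnx notebook linked above, the tokenizer files sit next to it, and the export exposes the usual `input_ids`/`attention_mask` inputs); the GLiNER-specific inputs (word masks, span indices, etc.) and the span decoding are deliberately left out.

```js
import { AutoTokenizer } from '@xenova/transformers';
import * as ort from 'onnxruntime-web';

// Assumed local paths; replace with wherever the exported model and tokenizer live.
const tokenizer = await AutoTokenizer.from_pretrained('path/to/gliner-tokenizer');
const session = await ort.InferenceSession.create('path/to/gliner.onnx');

// Tokenize the (already prompt-formatted) text.
const { input_ids, attention_mask } = await tokenizer('Bill Gates founded Microsoft.');

// Feed the tensors to the ONNX session. The GLiNER-specific inputs the export
// may also require (words_mask, span_idx, span_mask, ...) are omitted here.
const outputs = await session.run({
  input_ids: new ort.Tensor('int64', input_ids.data, input_ids.dims),
  attention_mask: new ort.Tensor('int64', attention_mask.data, attention_mask.dims),
});

console.log(Object.keys(outputs)); // inspect the output names before writing the span decoder
```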
For everyone interested in a JS version of GLiNER, here it is: https://github.com/Ingvarstep/GLiNER.js