rust-bert
Rust-native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Hello Team, I have the following error when running `cargo build --release`. I am pretty sure that libtorch is installed via `brew install pytorch`. I do not have...
If I want to use the `SequenceClassifier` pipeline for something like reranking, I am (sort of) able to do so using the exposed `forward_t` method. The problem is that I...
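The reranking step described above can be sketched independently of the model: score each document against the query, then sort by descending score. This is a minimal illustration; the `score_fn` closure is a hypothetical stand-in for whatever relevance score a classifier's forward pass produces, not rust-bert API.

```rust
// Sketch of reranking: score every document against the query, then sort
// by descending score. `score_fn` stands in for a model-produced score.
fn rerank<'a>(
    query: &str,
    docs: &[&'a str],
    score_fn: impl Fn(&str, &str) -> f32,
) -> Vec<(&'a str, f32)> {
    let mut scored: Vec<(&'a str, f32)> =
        docs.iter().map(|d| (*d, score_fn(query, d))).collect();
    // Descending sort; falls back to Equal if a score is NaN.
    scored.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap_or(std::cmp::Ordering::Equal));
    scored
}

fn main() {
    // Toy scorer: fraction of query tokens found in the document,
    // standing in for a real model's relevance logit.
    let overlap = |q: &str, d: &str| {
        let tokens: Vec<&str> = q.split_whitespace().collect();
        let hits = tokens.iter().copied().filter(|t| d.contains(*t)).count();
        hits as f32 / tokens.len() as f32
    };
    for (doc, score) in rerank(
        "rust nlp pipelines",
        &["python nlp tools", "rust nlp pipelines"],
        overlap,
    ) {
        println!("{score:.2}  {doc}");
    }
}
```

In practice the closure would wrap a batched forward pass so all documents are scored in one call rather than one at a time.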
I have converted the model into the two formats supported by rust-bert: one being the `.ot` extension, and the other the newly supported ONNX format. Despite my endeavors,...
Provide a space for users to request help and support related to using the DeBERTa model.
I am trying to extract keywords from sentences using the `all-MiniLM-L6-v2` model. When using this specific sentence (either alone or in combination with other sentences), the keyword extraction fails: `Up...
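Embedding-based keyword extraction of the kind described above boils down to comparing candidate vectors against a sentence vector. A minimal sketch of the comparison step follows; the vectors are toy stand-ins, and producing real embeddings from `all-MiniLM-L6-v2` is out of scope here.

```rust
// Cosine similarity between two vectors, the comparison step behind
// embedding-based keyword extraction. Returns 0.0 for zero-norm inputs.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

fn main() {
    // Toy 3-dimensional vectors standing in for model embeddings.
    let sentence = [0.8_f32, 0.6, 0.0];
    let candidates = [("keyword_a", [1.0_f32, 0.0, 0.0]), ("keyword_b", [0.8, 0.6, 0.0])];
    for (name, vec) in &candidates {
        println!("{}: {:.3}", name, cosine(&sentence, vec));
    }
}
```

Candidates are then ranked by this similarity; a failure like the one reported usually shows up earlier, in tokenization or candidate generation, rather than in this scoring step.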
Using the pipeline is slower than using the Python Hugging Face transformers library's generate function once the model file is loaded, when running in a CPU environment.
Hey! I'm trying to do a task with the T5 model, but the issue is that I can't put 600 MB of libtorch in my project. Question: Is it possible in this...
Hi, I'm wondering about building a utility that auto-downloads any missing model (and caches it), similar to the Python lib, but that also notices when there's no `ot` file, and...
As opposed to transformers, where labels are generated ad hoc: ``` [{'label': 'LABEL_0', 'score': 0.999602735042572}] ``` To resolve this, we might want to add a label mapping to `SequenceClassificationConfig` with some defaults, but...
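The fallback behavior suggested above can be sketched as a simple lookup with a transformers-style default. Treating `id2label` as a field of `SequenceClassificationConfig` is the proposal, not current rust-bert API; this sketch only shows the resolution logic.

```rust
use std::collections::HashMap;

// Resolve a class id to a human-readable label, defaulting to the
// transformers-style "LABEL_{id}" when no mapping is configured.
// The `id2label` name mirrors the field used in Hugging Face configs.
fn resolve_label(id2label: &HashMap<i64, String>, id: i64) -> String {
    id2label
        .get(&id)
        .cloned()
        .unwrap_or_else(|| format!("LABEL_{}", id))
}

fn main() {
    let mut id2label = HashMap::new();
    id2label.insert(0, "negative".to_string());
    id2label.insert(1, "positive".to_string());
    println!("{}", resolve_label(&id2label, 1)); // mapped label
    println!("{}", resolve_label(&id2label, 2)); // falls back to LABEL_2
}
```

With an empty map as the default, existing behavior is preserved while users who supply a mapping get meaningful label strings.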
Currently, converting an existing HF model requires having (1) a Rust environment ready, (2) the `rust-bert` repo available, and (3) a Python environment set up, just for the conversion. For the...