tranquanghust
> I want to use BERT for intent classification from text messages by considering the conversation history of n messages in a chat session. In addition, there are other features...
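One common way to fold conversation history into a single-encoder BERT setup is to concatenate the last n messages of the session with the `[SEP]` token and classify the resulting sequence; any extra non-text features can then be concatenated to the pooled `[CLS]` embedding before the classifier head. A minimal sketch of the input construction (the helper name and session data are illustrative assumptions, not from the thread):

```python
def build_bert_input(history, n=3, sep_token="[SEP]"):
    """Join the most recent n messages of a chat session into one BERT input string."""
    recent = history[-n:]  # keep only the last n turns of context
    return f" {sep_token} ".join(recent)

session = ["Hi, I need help", "My order is late", "Where is my package?"]
print(build_bert_input(session, n=2))
# -> "My order is late [SEP] Where is my package?"
```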
> I think this error is caused by faiss. You can use faiss on CPU instead of GPU. My data has 7000 queries and 17 keys; how long should it...
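For a sense of scale: exact inner-product search over 7000 queries and 17 keys is a single 7000 × 17 matrix multiply, so CPU search should finish in well under a second. A rough sketch with NumPy standing in for what a flat (exact) faiss index computes, assuming 768-dimensional embeddings (an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
queries = rng.standard_normal((7000, 768)).astype("float32")  # 7000 query vectors
keys = rng.standard_normal((17, 768)).astype("float32")       # 17 key vectors

scores = queries @ keys.T        # (7000, 17) inner-product score matrix
best = scores.argmax(axis=1)     # nearest key for each query
print(scores.shape, best.shape)  # (7000, 17) (7000,)
```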
> This script uses the Hugging Face Trainer to do fine-tuning, so you can use the hyperparameters listed on this page: https://huggingface.co/docs/transformers/main_classes/trainer#transformers.TrainingArguments How can I create a new task like llm embedded...
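The `TrainingArguments` page linked above lists all accepted fields; a few commonly tuned ones are sketched below. The field names match real `TrainingArguments` parameters, but the values are illustrative assumptions, not recommendations:

```python
# Hyperparameters commonly passed to the Hugging Face Trainer
# via transformers.TrainingArguments (values are examples only).
training_args = {
    "output_dir": "./output",             # where checkpoints are written
    "learning_rate": 2e-5,                # typical range for BERT-style fine-tuning
    "per_device_train_batch_size": 32,
    "num_train_epochs": 3,
    "warmup_ratio": 0.1,                  # fraction of steps for LR warmup
    "weight_decay": 0.01,
    "fp16": True,                         # mixed-precision training
}
print(sorted(training_args))
```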
Can anyone help me? Thanks.
> Hi, please try specifying `--dtype fp32` in the training script. After fine-tuning, I tested several cases. The positive samples scored around 0.9, while the negative samples scored around 0.84...
> You can use SetFit: https://github.com/huggingface/setfit and the Hugging Face text-classification script: https://github.com/huggingface/transformers/tree/main/examples/pytorch/text-classification. Is using "BGE en base" with "setfit" for multi-label classification better than using one BERT model and adding some fully...
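The "one BERT model plus a fully connected layer" alternative for multi-label classification usually means an independent sigmoid per label over the classifier logits, with a per-label decision threshold. A minimal sketch of that decision rule (the label names and logit values are made up for illustration):

```python
import math

def sigmoid(x):
    """Standard logistic function, mapping a raw logit to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

def predict_labels(logits, labels, threshold=0.5):
    """Return every label whose sigmoid probability clears the threshold.

    Unlike softmax classification, labels are decided independently,
    so zero, one, or several labels can fire for the same input.
    """
    return [lab for lab, z in zip(labels, logits) if sigmoid(z) >= threshold]

labels = ["billing", "shipping", "refund"]
logits = [2.1, -0.4, 0.9]  # raw scores from the classification head
print(predict_labels(logits, labels))
# -> ['billing', 'refund']
```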
> Yes. You need to add this instruction to your data before fine-tuning. I don't know how to add instructions, and I've also searched for documentation on fine-tuning BGE rerank...
> Hi, @Lahaina936, if you just want a reranking tool, you can fine-tune on your data without adding an instruction. The instruction is only useful when the model needs to perform...
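"Adding the instruction to your data" here just means prepending the model's query instruction to each training query before fine-tuning. A sketch, assuming the query instruction published for the English BGE embedding models (for a pure reranking fine-tune, as noted above, it can be omitted):

```python
# Query instruction published for the English BGE embedding models
# (an assumption for this sketch -- check your model card).
INSTRUCTION = "Represent this sentence for searching relevant passages: "

def add_instruction(queries, instruction=INSTRUCTION):
    """Prepend the retrieval instruction to every training query."""
    return [instruction + q for q in queries]

print(add_instruction(["how to reset my router"]))
# -> ['Represent this sentence for searching relevant passages: how to reset my router']
```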
> It's already included via in-batch negative sampling. Thanks ^^
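In-batch negative sampling works like this: for a batch of (query, positive passage) pairs, score every query against every passage in the batch; the diagonal entries are the positives and every off-diagonal passage serves as a "free" negative, so the loss is softmax cross-entropy with the diagonal as the target. A NumPy sketch with random embeddings standing in for encoder outputs:

```python
import numpy as np

rng = np.random.default_rng(0)
batch = 4
q = rng.standard_normal((batch, 64))  # query embeddings
p = rng.standard_normal((batch, 64))  # paired positive-passage embeddings

scores = q @ p.T  # (batch, batch): row i vs every passage in the batch

# Softmax cross-entropy where the correct passage for query i is passage i;
# off-diagonal entries act as in-batch negatives.
logz = np.log(np.exp(scores).sum(axis=1))       # log partition per row
loss = (logz - np.diag(scores)).mean()           # mean -log softmax(diagonal)
print(scores.shape, float(loss) > 0)
```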
> It's already included via in-batch negative sampling. How can I make the GliNER model biased towards my specific domain data? Because my data domain is prone to confusion with...