FlagEmbedding

Retrieval and Retrieval-augmented LLMs

622 FlagEmbedding issues (sorted by recently updated)

### **run code:**

```bash
CUDA_VISIBLE_DEVICES=0 nohup torchrun --nproc_per_node 1 --nnodes 1 --node_rank 0 --master_addr "localhost" --master_port 10003 \
    run_ds_cirr.py \
    --output_dir ./output/ \
    --bge_model_name_or_path ./BAAI/bge-m3 \
    --visual_model_name_or_path EVA02-CLIP-L-14 \
    --dataloader_num_workers...
```

Hi, I'm trying out RetroMAE pretraining of your model on my domain data. Do you make the encoder MLM head and decoder you used during the pretraining stage available...

My training script is as follows:

```bash
#!/bin/bash
# --------------------------
# Timestamp generation
# --------------------------
export TIMESTAMP=$(date +"%Y%m%d_%H%M%S")  # Timestamp format: YYYYMMDD_HHMMSS
# --------------------------
# Base path configuration
# --------------------------
export HOME_DIR="/inno-vepfs/languoxing"
export TRAIN_DATA="${HOME_DIR}/datasets/train_data_no_hn_dedup_shuffle456_202410_to_202503.jsonl"
export EVAL_DATA="${HOME_DIR}/datasets/dev_data_no_hn_dedup.jsonl"
# --------------------------
# Model parameters...
```

I have a mostly offline server. I have downloaded some datasets, but I don't know how to use them.
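One way to load already-downloaded data without network access, as a minimal sketch assuming the files are JSONL and the Hugging Face `datasets` library is installed (the path below is hypothetical):

```python
import os

# Real environment variables honored by huggingface_hub / datasets:
# they force both libraries to skip all network calls.
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["HF_DATASETS_OFFLINE"] = "1"

from datasets import load_dataset

# Point load_dataset at the local files instead of a hub dataset name.
# "/data/local/train_data.jsonl" is a hypothetical path.
train_data = load_dataset(
    "json",
    data_files={"train": "/data/local/train_data.jsonl"},
    split="train",
)
print(train_data[0])
```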

Added an inline comment to `setup.py` explaining its purpose, improving code clarity.

As per the title: https://arxiv.org/abs/2205.13147

What is the difference between the encoder_only script and the decoder_only script if we use `last_token` as the pooling method and `m3_kd_loss` as the loss? Can LoRA be used with encoder_only?...
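For reference, a minimal sketch of what `last_token` pooling typically computes, not necessarily this repository's exact implementation; it assumes right-padded inputs:

```python
import torch

def last_token_pool(hidden_states: torch.Tensor,
                    attention_mask: torch.Tensor) -> torch.Tensor:
    """Return the hidden state of each sequence's last non-padding token.

    hidden_states: (batch, seq_len, dim); attention_mask: (batch, seq_len).
    Assumes right padding; left-padded decoder inputs would instead take
    the final position directly.
    """
    last_idx = attention_mask.sum(dim=1) - 1  # index of last real token
    batch_idx = torch.arange(hidden_states.size(0), device=hidden_states.device)
    return hidden_states[batch_idx, last_idx]  # (batch, dim)
```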

Could you provide an evaluation example for the latest model at https://huggingface.co/BAAI/BGE-VL-v1.5-mmeb on the MMEB test set? The current evaluation code only covers CLIP-style models; there is nothing for MLLM-based ones.

The standard model trains fine, but training the m3 model raises an error. In FlagEmbedding/finetune/embedder/encoder_only/m3/modeling.py, append `.float()` after `teacher_scores = teacher_scores.view(q_dense_vecs.size(0), -1).detach()`; otherwise training fails with RuntimeError: "host_softmax" not implemented for 'Long'.
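A minimal sketch of the reported one-line change, using the variable names from the issue (whether casting is the right upstream fix is for the maintainers to confirm):

```python
# Casting to float gives the downstream softmax a supported dtype;
# without it, integer teacher scores trigger the 'Long' error above.
teacher_scores = teacher_scores.view(q_dense_vecs.size(0), -1).detach().float()
```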

Looking at the command, there are no arguments for saving training logs or trainer-state logs; there is only `report_to`.
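If the script uses Hugging Face `TrainingArguments`, the standard logging and checkpoint flags could be added alongside `report_to`; a hedged sketch with hypothetical values:

```python
from transformers import TrainingArguments

# Hypothetical values; in the actual command these would be CLI flags.
args = TrainingArguments(
    output_dir="./output",
    report_to="tensorboard",      # the only logging-related flag in the command
    logging_dir="./output/logs",  # where trainer logs are written
    logging_steps=50,             # log metrics every N steps
    save_strategy="steps",        # persist checkpoints and trainer state
    save_steps=500,
)
```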