FlagEmbedding
Retrieval and Retrieval-augmented LLMs
RuntimeError: CUDA error: CUBLAS_STATUS_EXECUTION_FAILED when calling `cublasLtMatmul( ltHandle, computeDesc.descriptor(), &alpha_val, mat1_ptr, Adesc.descriptor(), mat2_ptr, Bdesc.descriptor(), &beta_val, result_ptr, Cdesc.descriptor(), result_ptr, Cdesc.descriptor(), &heuristicResult.algo, workspace.data_ptr(), workspaceSize, at::cuda::getCurrentCUDAStream())`
Question about combining with a search engine
A quick question: dense embeddings can be used with vector-database search engines such as Milvus or Faiss. For sparse embeddings, is there a recommended search engine, or a fast retrieval approach worth considering?
Using Elasticsearch's script_score query, I can more or less implement sparse-vector retrieval, and the speed looks acceptable so far, with results returned in milliseconds. If anyone has other methods or experience, I'd welcome the discussion.
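A minimal sketch of one way to do this, assuming Elasticsearch 8.x and the `elasticsearch` Python client. Instead of a script_score painless script, it stores the BGE-M3 lexical weights in a `rank_features` field and sums per-token `rank_feature` clauses, which approximates the query-document dot product without scripting. The index name, field names, and sample text are illustrative, not from the thread.

```python
from elasticsearch import Elasticsearch
from FlagEmbedding import BGEM3FlagModel

es = Elasticsearch("http://localhost:9200")
model = BGEM3FlagModel("BAAI/bge-m3", use_fp16=True)

# Sparse weights live in a rank_features field; keys are BGE-M3
# token ids serialized as strings.
es.indices.create(
    index="sparse-demo",
    mappings={"properties": {
        "text": {"type": "text"},
        "sparse": {"type": "rank_features"},
    }},
)

doc = "BGE-M3 produces dense, sparse and multi-vector representations."
weights = model.encode([doc], return_sparse=True)["lexical_weights"][0]
es.index(
    index="sparse-demo",
    document={"text": doc,
              "sparse": {tok: float(w) for tok, w in weights.items()}},
    refresh=True,
)

# One rank_feature clause per query token; with the `linear` function each
# clause scores boost * stored_weight, so the bool/should sum approximates
# the dot product between query and document sparse vectors.
q = model.encode(["which representations does BGE-M3 produce?"],
                 return_sparse=True)["lexical_weights"][0]
query = {"bool": {"should": [
    {"rank_feature": {"field": f"sparse.{tok}",
                      "linear": {},
                      "boost": float(w)}}
    for tok, w in q.items()
]}}
print(es.search(index="sparse-demo", query=query)["hits"]["hits"])
```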
First of all, thanks to BGE for providing such a powerful model. In our previous RAG application we used BGE to convert enterprise documents into vectors and stored them in Milvus; at query time we recall documents from Milvus via dense retrieval, using LangChain as the framework. Is there any example we can refer to for doing dense+sparse retrieval with BGE? At the moment we have no idea how to proceed. Many thanks.
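A minimal sketch of dense+sparse hybrid retrieval with BGE-M3 and Milvus, assuming pymilvus >= 2.4 (which adds sparse-vector support and `hybrid_search`) and a local Milvus server; the collection name, field names, and sample text are illustrative.

```python
from pymilvus import (
    AnnSearchRequest, Collection, CollectionSchema, DataType,
    FieldSchema, RRFRanker, connections,
)
from FlagEmbedding import BGEM3FlagModel

connections.connect(uri="http://localhost:19530")
model = BGEM3FlagModel("BAAI/bge-m3", use_fp16=True)

fields = [
    FieldSchema("id", DataType.INT64, is_primary=True, auto_id=True),
    FieldSchema("text", DataType.VARCHAR, max_length=4096),
    FieldSchema("dense", DataType.FLOAT_VECTOR, dim=1024),
    FieldSchema("sparse", DataType.SPARSE_FLOAT_VECTOR),
]
col = Collection("bge_m3_demo", CollectionSchema(fields))
col.create_index("dense", {"index_type": "FLAT", "metric_type": "IP"})
col.create_index("sparse", {"index_type": "SPARSE_INVERTED_INDEX",
                            "metric_type": "IP"})

docs = ["BGE-M3 supports dense, sparse and multi-vector retrieval."]
out = model.encode(docs, return_dense=True, return_sparse=True)
# lexical_weights maps token-id strings to weights; Milvus expects int keys.
sparse_rows = [{int(k): float(v) for k, v in lw.items()}
               for lw in out["lexical_weights"]]
col.insert([docs, out["dense_vecs"].tolist(), sparse_rows])
col.load()

q = model.encode(["what retrieval modes does BGE-M3 support?"],
                 return_dense=True, return_sparse=True)
q_sparse = [{int(k): float(v) for k, v in q["lexical_weights"][0].items()}]
dense_req = AnnSearchRequest(q["dense_vecs"].tolist(), "dense",
                             {"metric_type": "IP"}, limit=10)
sparse_req = AnnSearchRequest(q_sparse, "sparse",
                              {"metric_type": "IP"}, limit=10)
# Fuse the dense and sparse candidate lists with reciprocal-rank fusion.
hits = col.hybrid_search([dense_req, sparse_req], RRFRanker(), limit=5,
                         output_fields=["text"])
print(hits[0])
```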
Hello, I saw your dense+sparse embedding example for BGE-M3. I updated pymilvus using `git+https://github.com/milvus-io/pymilvus`, but when I run the example script I get an error. ``` sparse_index =...
Problem when saving the model
Traceback (most recent call last):
  File "/opt/tiger/FlagEmbedding/FlagEmbedding/baai_general_embedding/finetune/run.py", line 114, in <module>
    main()
  File "/opt/tiger/FlagEmbedding/FlagEmbedding/baai_general_embedding/finetune/run.py", line 103, in main
    trainer.train()
  File "/usr/local/lib/python3.9/dist-packages/transformers/trainer.py", line 1624, in train
    return inner_training_loop(
  File "/usr/local/lib/python3.9/dist-packages/transformers/trainer.py", line 2049,...
Thanks for releasing the new reranker model, but I noticed the code at https://github.com/FlagOpen/FlagEmbedding/blob/25c30a853d93434724929ac83136cd8dda24291a/FlagEmbedding/flag_reranker.py#L345-L348 Considering that many users and projects still run on earlier Python versions (below 3.10), I believe adjusting the...
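A hypothetical illustration of the kind of change being requested, assuming the incompatibility is PEP 604 union syntax (`int | None`), which raises a SyntaxError before Python 3.10; the function name and signature below are made up for illustration and are not the actual code at flag_reranker.py#L345-L348.

```python
from typing import List, Optional

# Python >= 3.10 only (PEP 604 unions):
#   def rerank(pairs, max_length: int | None = None) -> list[float]: ...

# Equivalent spelling that also parses on Python 3.8/3.9:
def rerank(pairs: List[List[str]],
           max_length: Optional[int] = None) -> List[float]:
    ...  # body elided; only the annotation style matters here
```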
Could you please provide details about the process used to train the reranker model available at https://huggingface.co/BAAI/bge-reranker-v2-m3? Specifically, I'm interested in the pipeline employed for training, the dataset utilized, as...