drewskidang

Results: 22 issues by drewskidang

lora_target_modules='["query_key_value"]' "not part of this model"
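The error usually means the module name does not exist in the model being fine-tuned: `query_key_value` is the fused attention projection in BLOOM/Falcon-style models, while BERT-style encoders (like bge-base) expose separate `query`/`key`/`value` modules. A minimal sketch of an illustrative lookup table (the `LORA_TARGETS` dict and `targets_for` helper are hypothetical, not part of the repo; always confirm names against `model.named_modules()`):

```python
# Common attention-projection module names per architecture family.
# Illustrative and non-exhaustive: verify against your own checkpoint
# by printing the names from model.named_modules().
LORA_TARGETS = {
    "bloom": ["query_key_value"],       # fused QKV projection
    "falcon": ["query_key_value"],      # fused QKV projection
    "bert": ["query", "key", "value"],  # separate Q/K/V projections
    "llama": ["q_proj", "k_proj", "v_proj"],
    "mistral": ["q_proj", "k_proj", "v_proj"],
}

def targets_for(arch: str) -> list:
    """Return plausible LoRA target-module names for an architecture."""
    try:
        return LORA_TARGETS[arch]
    except KeyError:
        raise ValueError(
            f"unknown architecture {arch!r}; inspect model.named_modules()"
        )
```

So for a bge-base (BERT-family) model, passing `'["query_key_value"]'` fails because that fused module only exists in other architectures.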

I just finished pretraining BAAI/bge-base ... is it possible to use the llm-embedder training script on the same model, or is the embedder model different?

Can we append the model output to a JSON/JSONL file?

question

Is there multi-GPU support? I don't know how to set it up without running a script.

currently fixing
on roadmap
feature request

Not sure how to generate sparse embeddings when using LangChain.

Is there a way for the model to be uploaded to Hugging Face so we can use GPU-accelerated inference engines like vLLM?

enhancement

Is it possible to train on more than one GPU? https://huggingface.co/BAAI/bge-m3 is currently OOMing on this model. BGE allows loading in fp16, but I can't do that with SetFit.

Trying to get the original document together with the annotated LLM task in a single string; is that possible?

Is it possible to train Mixtral in this repo?

enhancement

Is there a way to pretrain the M3 models?