ray-llm
Support for Mistral-based embedding models
For example: https://huggingface.co/intfloat/e5-mistral-7b-instruct
Currently, it looks like these models do not produce the token_type_ids property, which the embedding warmup phase treats as mandatory: https://github.com/ray-project/ray-llm/blob/master/rayllm/backend/llm/embedding/embedding_engine.py#L112C14-L112C28
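One possible direction would be to build the warmup inputs from whatever the tokenizer actually returns instead of requiring token_type_ids. The sketch below uses the Hugging Face transformers API directly rather than rayllm's embedding_engine, and the variable names are illustrative:

```python
# Minimal sketch of a warmup forward pass that only supplies token_type_ids
# when the tokenizer actually produces them (Mistral-based models such as
# e5-mistral-7b-instruct do not). Not rayllm's actual warmup code.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "intfloat/e5-mistral-7b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

encoded = tokenizer(["warmup text"], padding=True, return_tensors="pt")
# BERT-style tokenizers emit token_type_ids; Mistral's tokenizer does not,
# so forward only the keys that were actually produced.
model_kwargs = {
    k: v
    for k, v in encoded.items()
    if k in ("input_ids", "attention_mask", "token_type_ids")
}
with torch.no_grad():
    _ = model(**model_kwargs)
```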
Are you open to PRs on this sort of stuff?