text-embeddings-inference
Model Request: long context gte models
Model description
We introduce the gte-v1.5 series, upgraded gte embedding models that support context lengths of up to 8192 tokens while further improving model performance. The models are built on the transformer++ encoder backbone (BERT + RoPE + GLU). The gte-v1.5 series achieves state-of-the-art scores on the MTEB benchmark within the same model size category and provides competitive results on the LoCo long-context retrieval tests.
- https://huggingface.co/Alibaba-NLP/gte-large-en-v1.5
- https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5
| Models | Language | Model Size (params) | Max Seq. Length | Dimension | MTEB-en | LoCo |
|---|---|---|---|---|---|---|
| gte-large-en-v1.5 | English | 434M | 8192 | 1024 | 65.39 | 86.71 |
| gte-base-en-v1.5 | English | 137M | 8192 | 768 | 64.11 | 87.44 |
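As a rough sketch of how embeddings from models like these are typically consumed downstream (a pooled sentence vector, L2-normalized so dot products become cosine similarities), here is a minimal numpy illustration; the random hidden states and the 1024-dim size (matching gte-large-en-v1.5 above) are stand-ins, not actual model outputs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a model's last_hidden_state: (batch, seq_len, dim).
# dim=1024 matches gte-large-en-v1.5; values here are random placeholders.
hidden_states = rng.standard_normal((3, 16, 1024))

# CLS-style pooling: take the first token's vector as the text embedding.
embeddings = hidden_states[:, 0, :]

# L2-normalize so the dot product of two vectors equals cosine similarity.
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)

# Pairwise similarity matrix between all texts in the batch.
scores = embeddings @ embeddings.T
print(scores.shape)                        # (3, 3)
print(np.allclose(np.diag(scores), 1.0))   # self-similarity is 1
```

In a retrieval setting (as in LoCo), the same normalized-dot-product score would rank long documents against a query embedding.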
Open source status
- [X] The model implementation is available
- [X] The model weights are available
Provide useful links for the implementation
No response