Currently, ollama-ipex-llm does not support GGUF-based inference for jina-embeddings-v3 or jina-reranker-v2-base-multilingual. Are there any plans to add this support in the future?
ollama-ipex-llm version: ollama-ipex-llm-2.2.0-ubuntu.tgz
Model URLs:
- embedding model: https://huggingface.co/jinaai/jina-embeddings-v3
- reranker model: https://huggingface.co/jinaai/jina-reranker-v2-base-multilingual

GGUF model URLs:
- embedding model: https://huggingface.co/AgainstEntropy/jina-embeddings-v3-gguf
- reranker model: https://huggingface.co/gpustack/jina-reranker-v2-base-multilingual-GGUF
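
For context, this is roughly what GGUF-based usage would look like if it were supported, as a minimal sketch using standard Ollama Modelfile syntax and the `/api/embeddings` endpoint; the local GGUF filename below is a placeholder for whichever file is actually downloaded from the repos above, and this is what fails today with ollama-ipex-llm:

```shell
# Download a GGUF file from the embedding model repo above, then point a
# Modelfile at it (the filename below is a placeholder, not an exact name).
cat > Modelfile <<'EOF'
FROM ./jina-embeddings-v3.gguf
EOF

# Register the model with the ollama-ipex-llm server.
./ollama create jina-embeddings-v3 -f Modelfile

# Request embeddings through the standard Ollama API.
curl http://localhost:11434/api/embeddings \
  -d '{"model": "jina-embeddings-v3", "prompt": "hello world"}'
```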