text-embeddings-inference
The 1.6 Docker image fails to start gte-multilingual-reranker-base
System Info
Started the container with `docker run -it --rm --gpus 'device=0' -p 8016:80 -v XXX ...inference:1.6 --model-id .../gte-multilingual-reranker-base`. It fails with:

```
Could not start Candle backend: could not start backend: cannot find tensor embedding.work_embdding.weight
Error: Could not create backend, caused by: could not start backend: Could not start a suitable backend
```
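For reference, a minimal reproduction sketch of the command above. The host path, mount target, and image name are assumptions standing in for the redacted values, not the original ones:

```shell
# Sketch only: /path/to/models, /data, and the image tag are placeholder
# assumptions, not the redacted values from the report above.
docker run -it --rm --gpus 'device=0' -p 8016:80 \
  -v /path/to/models:/data \
  ghcr.io/huggingface/text-embeddings-inference:1.6 \
  --model-id /data/gte-multilingual-reranker-base
```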
Information
- [X] Docker
- [ ] The CLI directly
Tasks
- [X] An officially supported command
- [ ] My own modifications
Reproduction
"It also cannot run on version 1.5."
Expected behavior
The gte-multilingual-reranker-base model should load and the server should start successfully with the 1.6 image, instead of failing with the Candle backend error shown above.