Ingest 404 when using ollama
Title
Ollama Embedding Provider returns 404 error during ingestion despite successful curl test
Description
When using Ollama as the embedding provider for R2R, the ingestion process fails with a 404 error. However, Ollama is running correctly and responds to curl requests as expected.
Steps to Reproduce
- Start Ollama locally (Ollama is running on http://localhost:11434)
- Configure R2R to use Ollama as the embedding provider (see configuration below)
- Attempt to ingest a document through the R2R interface (see the sketch after this list)
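For reference, here is a minimal sketch of the ingestion step, assuming the R2R Python SDK's ingest_files method (the method name, its parameters, and the default server URL are assumptions and may differ between R2R versions):

# Hypothetical sketch of the ingestion step; adjust the server URL and
# method names to match your R2R version.
from r2r import R2RClient

client = R2RClient("http://localhost:7272")  # assumed default R2R API address

# This call triggers the embedding step where the 404 appears.
response = client.ingest_files(file_paths=["./example.txt"])
print(response)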
Expected Behavior
R2R should successfully connect to Ollama and generate embeddings for the ingested document.
Actual Behavior
R2R detects Ollama as responsive, but the embedding requests then fail with a 404 error:
2024-09-15 13:29:51,205 - INFO - core.main.r2r - Starting R2R with version 3.0.8
**External Ollama instance detected and responsive.**
2024-09-15 13:31:59,086 - ERROR - core.providers.embeddings.ollama - Error getting embeddings: 404 page not found
2024-09-15 13:31:59,086 - WARNING - core.base.providers.embedding - Request failed (attempt 1): Error getting embeddings: 404 page not found
2024-09-15 13:32:00,091 - ERROR - core.providers.embeddings.ollama - Error getting embeddings: 404 page not found
2024-09-15 13:32:00,091 - WARNING - core.base.providers.embedding - Request failed (attempt 2): Error getting embeddings: 404 page not found
2024-09-15 13:32:00,094 - ERROR - core.base.pipeline.base_pipeline - Pipeline failed with error: Error getting embeddings: 404 page not found
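Ollama's HTTP server answers any unknown route with the literal body "404 page not found", so this error usually means the embedding request is being sent to a path or base URL the server does not expose, rather than Ollama being unreachable. Below is a small diagnostic sketch (not R2R code; it uses the public Ollama endpoints and the requests package) that reproduces both the working call and the failure mode:

# Compare the known-good embeddings endpoint with a deliberately wrong path
# to show that Ollama returns the literal text "404 page not found" for
# unknown routes: the same message that appears in the R2R log above.
import requests

BASE = "http://localhost:11434"
payload = {"model": "mxbai-embed-large",
           "prompt": "Llamas are members of the camelid family"}

ok = requests.post(f"{BASE}/api/embeddings", json=payload)
print(ok.status_code, len(ok.json().get("embedding", [])))  # expect 200 and a vector

bad = requests.post(f"{BASE}/api/embedding", json=payload)  # wrong path on purpose
print(bad.status_code, bad.text)  # expect 404 and "404 page not found"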
Configuration
Here's the relevant part of the TOML configuration used:
[embedding]
provider = "ollama"
base_model = "mxbai-embed-large"
base_dimension = 512
batch_size = 128
add_title_as_prefix = false
rerank_model = "None"
concurrent_request_limit = 256
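One thing worth double-checking alongside this configuration is where the provider takes Ollama's base URL from. If it is read from an environment variable such as OLLAMA_API_BASE (an assumption; consult the R2R docs for your version), a value with a trailing path like /api would make the provider post to a route Ollama does not serve and produce exactly this 404. A quick sanity check:

# Sanity-check sketch: confirm the assumed base URL points at the Ollama
# server root, which answers GET / with the text "Ollama is running".
# OLLAMA_API_BASE is an assumption; your R2R version may use another variable.
import os
import requests

base = os.environ.get("OLLAMA_API_BASE", "http://localhost:11434")
print("base URL:", base)

resp = requests.get(base.rstrip("/") + "/")
print(resp.status_code, resp.text)  # expect 200 and "Ollama is running"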
Environment
- R2R Version: 3.1.0 (installed locally with Poetry)
- Operating System: Ubuntu
Additional Information
Ollama is running correctly and responds to curl requests. Here's a successful curl test:
curl http://localhost:11434/api/embeddings -d '{
"model": "mxbai-embed-large",
"prompt": "Llamas are members of the camelid family"
}'
Response:
{"embedding":[0.5869068503379822,1.1740394830703735,0.6426560282707214,0.8012367486953735,-0.13272863626480103,0.646376371383667,-0.44366782903671265,0.8876819610595703,0.6493498682975769,-0.24569851160049438,0.6341419816017151,0.3442591726779938,-0.6307538151741028,0.1385183334350586,-0.521152138710022,0.6995735168457031,0.21664012968540192,-0.16669663786888123,-0.141....