Can't run Ollama on Docker
As an Ollama user without much money in my pocket, I'm curious about this issue, but searching for "ollama" or "docker" in the repo didn't lead me anywhere. What's the issue? How can I test or reproduce it?
Hey @teamolhuang, last time I tested, I couldn't get Ollama to work with the Docker deployment. Try following the Docker setup guide here: https://www.surfsense.net/docs/docker-installation with Ollama, and you should start seeing the issues in the logs.
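Roughly, that just means bringing the stack up and watching the backend logs. A minimal sketch, assuming the compose file from that guide and a backend service named `backend`:

# build and start the stack in the background
docker compose up -d --build

# follow the backend logs; Ollama/LLM connection errors should show up here
docker compose logs -f backend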
I hope this helps with a fresh install. I'm not an expert, but this is how I managed to get Ollama working. The only thing is that when I upload a PDF file, it will not upload ... https://i.imgur.com/7X8U1DK.png
backend .env

# Database Configuration
DATABASE_URL="postgresql+asyncpg://postgres:postgres@db:5432/surfsense"

# Secret Key
SECRET_KEY="put your key, i remove it"

# Frontend URL
NEXT_FRONTEND_URL="http://192.168.1.180:3008"

# Authentication Configuration
AUTH_TYPE="LOCAL"

# Google OAuth (empty for local auth)
GOOGLE_OAUTH_CLIENT_ID=""
GOOGLE_OAUTH_CLIENT_SECRET=""

# Embedding Model
EMBEDDING_MODEL="sentence-transformers/all-MiniLM-L6-v2"

# Reranker Configuration
RERANKERS_MODEL_NAME="ms-marco-MiniLM-L-12-v2"
RERANKERS_MODEL_TYPE="flashrank"

# LLM Configuration - OLLAMA ONLY
FAST_LLM="ollama/qwen3:32b"
STRATEGIC_LLM="ollama/deepseek-r1:32b"
LONG_CONTEXT_LLM="ollama/llama3.3:70b"

# TTS/STT Services - DISABLE OR USE OLLAMA
TTS_SERVICE=""
STT_SERVICE=""

# API Keys - EMPTY (we're using Ollama)
OPENAI_API_KEY=""
GEMINI_API_KEY=""
FIRECRAWL_API_KEY=""

# File Parser Service - DISABLE for now
ETL_SERVICE=""
UNSTRUCTURED_API_KEY=""
LLAMA_CLOUD_API_KEY=""

# LangSmith Observability - DISABLE
LANGSMITH_TRACING=false
LANGSMITH_ENDPOINT=""
LANGSMITH_API_KEY=""
LANGSMITH_PROJECT=""

# CRITICAL: Ollama API Base URLs
FAST_LLM_API_BASE="http://192.168.1.173:11434"
STRATEGIC_LLM_API_BASE="http://192.168.1.173:11434"
LONG_CONTEXT_LLM_API_BASE="http://192.168.1.173:11434"
TTS_SERVICE_API_BASE=""
STT_SERVICE_API_BASE=""

# CORS Settings
ALLOWED_ORIGINS=["http://192.168.1.180:3008","http://localhost:3008"]

# API Configuration
API_HOST=0.0.0.0
API_PORT=8008

# Enable basic file processing
ETL_SERVICE="UNSTRUCTURED"
# You can leave this empty for now - it might work without API key for basic files
UNSTRUCTURED_API_KEY=""
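Since the LLM base URLs point at a separate machine (192.168.1.173), I also check that Ollama is actually reachable before starting the stack. A quick sketch (the /api/tags endpoint just lists the installed models; the exec variant assumes curl is available inside the backend image):

# from the Docker host
curl http://192.168.1.173:11434/api/tags

# from inside the backend container, once it is running
docker compose exec backend curl http://192.168.1.173:11434/api/tags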
frontend/web .env

NEXT_PUBLIC_FASTAPI_BACKEND_URL=http://192.168.1.180:8008
NEXTAUTH_URL=http://192.168.1.180:3008
NEXT_PUBLIC_FASTAPI_BACKEND_AUTH_TYPE=LOCAL
NEXT_PUBLIC_ETL_SERVICE=UNSTRUCTURED_API_KEY
NEXTAUTH_SECRET=put your key, i remove it
main folder .env

# Frontend Configuration
FRONTEND_PORT=3008
NEXT_PUBLIC_API_URL=http://192.168.1.180:8008

# Backend Configuration
BACKEND_PORT=8008

# Database Configuration
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DB=surfsense
POSTGRES_PORT=5432

# pgAdmin Configuration
PGADMIN_PORT=5050
PGADMIN_DEFAULT_EMAIL=[email protected]
PGADMIN_DEFAULT_PASSWORD=surfsense
docker-compose.yml

version: '3.8'

services:
  db:
    image: ankane/pgvector:latest
    ports:
      - "${POSTGRES_PORT:-5432}:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
    environment:
      - POSTGRES_USER=${POSTGRES_USER:-postgres}
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD:-postgres}
      - POSTGRES_DB=${POSTGRES_DB:-surfsense}
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-postgres}"]
      interval: 30s
      timeout: 10s
      retries: 3

  backend:
    build:
      context: ./surfsense_backend
      dockerfile: Dockerfile
    ports:
      - "0.0.0.0:${BACKEND_PORT:-8008}:8000"
    volumes:
      - ./surfsense_backend:/app
      - backend_uploads:/app/uploads
    environment:
      - DATABASE_URL=postgresql+asyncpg://postgres:postgres@db:5432/surfsense
      - OLLAMA_BASE_URL=http://192.168.1.173:11434
      - API_HOST=0.0.0.0
      - API_PORT=8000
    env_file:
      - ./surfsense_backend/.env
    depends_on:
      db:
        condition: service_healthy
    restart: unless-stopped

  frontend:
    build:
      context: ./surfsense_web
      dockerfile: Dockerfile
    ports:
      - "0.0.0.0:${FRONTEND_PORT:-3008}:3000"
    volumes:
      - ./surfsense_web:/app
      - /app/node_modules
      - /app/.next
    environment:
      - NEXT_PUBLIC_API_URL=http://192.168.1.180:${BACKEND_PORT:-8008}
      - NEXTAUTH_URL=http://192.168.1.180:${FRONTEND_PORT:-3008}
    env_file:
      - ./surfsense_web/.env
    depends_on:
      - backend
    restart: unless-stopped

  pgadmin:
    image: dpage/pgadmin4
    ports:
      - "${PGADMIN_PORT:-5050}:80"
    environment:
      - PGADMIN_DEFAULT_EMAIL=${PGADMIN_DEFAULT_EMAIL:[email protected]}
      - PGADMIN_DEFAULT_PASSWORD=${PGADMIN_DEFAULT_PASSWORD:-surfsense}
    volumes:
      - pgadmin_data:/var/lib/pgadmin
    depends_on:
      - db

volumes:
  postgres_data:
  pgadmin_data:
  backend_uploads:
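Once the stack is up, I check that the containers are healthy and that the backend answers on the published port. A quick check, adjusting the IP/port to your own values (the /docs check assumes the FastAPI docs page is enabled):

docker compose ps

# backend should answer on the published port
curl -I http://192.168.1.180:8008/docs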
Hi @AbdullahAlMousawi, thanks for the help here. You aren't able to upload PDFs because you haven't configured any ETL service (the "File Parser Service - DISABLE for now" section). I'm guessing you need a fully local setup; I will be adding Docling support soon.
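In the meantime, if you want to test PDF uploads, the relevant knobs are already in your backend .env; something like this might be enough (just a sketch, I haven't verified whether UNSTRUCTURED handles basic files without an API key):

ETL_SERVICE="UNSTRUCTURED"
# optional, depending on whether basic parsing works without it
UNSTRUCTURED_API_KEY="your-unstructured-api-key"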
Can I be assigned to this?
@mimisavage Thanks for your interest. Right now the Docker setup might be broken due to recent changes, so you might need to fix that as well. Anyway, assigning it to you for now.
Docker + Ollama work perfectly for me ... but there is a bug in the code that should be fixed ... @MODSetter, do you want me to send you the fix?
@AbdullahAlMousawi Which OS are you running? I tried Docker on macOS yesterday, but it failed, so I need to look into that. Go ahead and raise the PR for the fix 👍
I use both WSL on Windows 11 and Unraid Linux ...
Well, if it's working on Linux then I don't know why it's failing on macOS... will try to fix that soon.
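One thing I'll check as a possible cause: on Docker Desktop for macOS, containers can't reach services on the host via localhost/127.0.0.1, so if Ollama is running on the Mac itself the base URLs in the backend .env would need host.docker.internal instead, e.g. (just a guess, key names taken from the .env above):

FAST_LLM_API_BASE="http://host.docker.internal:11434"
STRATEGIC_LLM_API_BASE="http://host.docker.internal:11434"
LONG_CONTEXT_LLM_API_BASE="http://host.docker.internal:11434"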