Add Docker configuration for containerized deployment
Description
This PR introduces a complete Docker-based deployment solution for Resume-Matcher, providing containerized environments for both development and production. The configuration eliminates the need for complex local environment setup by containerizing all services including the FastAPI backend, Next.js frontend, and Ollama AI service.
Relevant issue
https://github.com/srbhr/Resume-Matcher/issues/415
Type
- [ ] Bug Fix
- [x] Feature Enhancement
- [x] Documentation Update
- [ ] Code Refactoring
- [ ] Other (please specify):
Proposed Changes
- Added Dockerfile.backend for the FastAPI application with the uv package manager (an illustrative sketch follows this list)
- Added Dockerfile.frontend for Next.js application with standalone output
- Created docker-compose.yml for production environment setup
- Created docker-compose.dev.yml for development environment with hot-reload
- Introduced .dockerignore file to optimize Docker build context
- Added comprehensive DOCKER.md documentation
- Updated next.config.ts to support standalone builds and dynamic API URLs
- Added health checks for all services in docker-compose configurations
- Configured automated Ollama setup with gemma3:4b model initialization
- Implemented proper service dependencies and startup orchestration
- Added Docker volume persistence for data and AI models
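To make the first item concrete, here is a minimal sketch of what a multi-stage `Dockerfile.backend` could look like. The base images, the `apps/backend` layout, the `app.main:app` entrypoint, and the `/health` route used by the health check are assumptions for illustration, not the exact contents of this PR.

```dockerfile
# Builder stage: resolve dependencies with uv into a project-local .venv.
FROM python:3.12-slim AS builder
# The uv binary can be copied from its official distribution image
# (image path assumed; installing uv via pip works as well).
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
WORKDIR /app
COPY apps/backend/pyproject.toml apps/backend/uv.lock ./
RUN uv sync --frozen --no-dev --no-install-project
COPY apps/backend/ ./
RUN uv sync --frozen --no-dev

# Runtime stage: copy only the app and its .venv into a slim image.
FROM python:3.12-slim AS runtime
WORKDIR /app
RUN useradd --create-home appuser
COPY --from=builder /app /app
ENV PATH="/app/.venv/bin:$PATH"
USER appuser
EXPOSE 8000
# Health endpoint path is assumed; align it with the FastAPI app's route.
HEALTHCHECK --interval=30s --timeout=5s --retries=5 \
    CMD python -c "import urllib.request; urllib.request.urlopen('http://localhost:8000/health')"
# Assumes uvicorn is a project dependency and app.main exposes the FastAPI app.
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```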
Screenshots / Code Snippets (if applicable)
```yaml
# docker-compose.yml - Simple production setup
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama

  backend:
    build:
      context: .
      dockerfile: Dockerfile.backend
    ports:
      - "8000:8000"
    depends_on:
      ollama:
        condition: service_healthy

  frontend:
    build:
      context: .
      dockerfile: Dockerfile.frontend
    ports:
      - "3000:3000"
    depends_on:
      backend:
        condition: service_healthy

# Named volume declaration for the ollama_data mount above
volumes:
  ollama_data:
```

```bash
# Simple Docker commands - no scripts needed
docker-compose build                             # Build images
docker-compose up -d                             # Start production
docker-compose -f docker-compose.dev.yml up -d   # Start development
docker-compose down                              # Stop services
```
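The excerpt above leaves out the health checks and the one-shot model initialization that the `condition: service_healthy` dependencies and the automated gemma3:4b setup rely on. A hedged sketch of how those pieces could look in `docker-compose.yml` follows; the service name `ollama-init`, the intervals, and the backend's `/health` route are assumptions.

```yaml
services:
  ollama:
    healthcheck:
      # The CLI talks to the local server; equivalently, GET /api/tags.
      test: ["CMD-SHELL", "ollama list || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 10

  ollama-init:
    image: ollama/ollama:latest
    depends_on:
      ollama:
        condition: service_healthy
    environment:
      # Point the CLI at the ollama service instead of starting its own server.
      - OLLAMA_HOST=http://ollama:11434
    # One-shot job: pull the model through the running server, then exit.
    entrypoint: ["ollama", "pull", "gemma3:4b"]
    restart: "no"

  backend:
    healthcheck:
      # Assumes curl exists in the backend image; a python urllib one-liner
      # works if it does not.
      test: ["CMD-SHELL", "curl -fsS http://localhost:8000/health || exit 1"]
      interval: 15s
      timeout: 5s
      retries: 10
```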
How to Test
- Prerequisites: install Docker and Docker Compose
- Clone the repository: `git clone <repo> && cd Resume-Matcher`
- Build the images: `docker-compose build`
- Start production: `docker-compose up -d`
- Wait for initialization: allow 5-10 minutes for the AI model download on the first run
- Verify the services (see the smoke-test snippet after this list):
  - Frontend: http://localhost:3000
  - Backend: http://localhost:8000/docs (API documentation)
  - Ollama: http://localhost:11434
- Test development mode: `docker-compose down && docker-compose -f docker-compose.dev.yml up -d`
- Verify hot-reload: edit frontend/backend code and confirm the changes are reflected automatically
- Clean up: `docker-compose down -v` to remove containers and volumes
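Once the stack is up, a quick smoke test from the host could look like this; the backend `/health` path mirrors the health-check design above and may differ in the final code.

```bash
# All ports match docker-compose.yml.
curl -fsS http://localhost:11434/api/tags                             # Ollama: lists pulled models (expect gemma3:4b)
curl -fsS http://localhost:8000/health                                # Backend: assumed FastAPI health route
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8000/docs   # Swagger UI should return 200
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3000        # Next.js frontend should return 200

# Follow logs while the model downloads on the first run.
docker-compose logs -f ollama backend
```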
Checklist
- [x] The code compiles successfully without any errors or warnings
- [x] The changes have been tested and verified
- [x] The documentation has been updated (if applicable)
- [x] The changes follow the project's coding guidelines and best practices
- [x] The commit messages are descriptive and follow the project's guidelines
- [x] All tests (if applicable) pass successfully
- [ ] This pull request has been linked to the related issue (if applicable)
Additional Information
Key Benefits:
- Zero local dependencies: no need to install Python, Node.js, Ollama, or uv locally
- Simple deployment: a standard `docker-compose up -d` command
- Development-friendly: hot-reload support in development mode
- AI-ready: automated Ollama setup with the gemma3:4b model
- Cross-platform: works identically on Windows, macOS, and Linux
- Optimized: multi-stage Dockerfiles with minimal production images
- Production-ready: health checks, proper user permissions, and service dependencies
Technical Implementation:
- Backend: Python 3.12 + FastAPI + uv package manager
- Frontend: Node.js 18 + Next.js with standalone output
- AI: Ollama with automated gemma3:4b model download
- Database: SQLite with Docker volume persistence
- Architecture: Multi-service with proper dependency management
File Structure:
```text
├── Dockerfile.backend       # FastAPI container (Python + uv)
├── Dockerfile.frontend      # Next.js container (standalone build)
├── docker-compose.yml       # Production environment
├── docker-compose.dev.yml   # Development environment
├── .dockerignore            # Build context optimization
└── DOCKER.md                # Setup and usage documentation
```
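To illustrate the build-context optimization, a `.dockerignore` along these lines keeps dependency folders, local databases, and VCS metadata out of the image builds (the actual patterns in the PR may differ):

```text
# Dependency and build output
node_modules
.next
.venv
__pycache__
**/*.pyc

# Local data and environment files
*.db
.env
.env.*

# VCS and editor metadata
.git
.gitignore
.vscode
```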
Breaking Changes:
- None - this is purely additive functionality
- Existing local development workflow remains unchanged
Resource Requirements:
- Minimum: 8GB RAM, 10GB disk space
- Recommended: 16GB RAM, 20GB disk space
- First run: Additional time for AI model download (~4GB)
Summary by CodeRabbit
- New Features
  - Added comprehensive Docker support for both production and development, including Dockerfiles, Compose files, and a .dockerignore for optimized builds.
  - Introduced health checks and persistent storage for backend, frontend, and AI services.
  - Enhanced environment configuration and service orchestration for reliable startup and scaling.
  - Provided new npm scripts for streamlined Docker management.
- Documentation
  - Added a detailed Docker setup and troubleshooting guide.
  - Updated the README with Docker installation instructions and references to further documentation.
- Configuration
  - Improved frontend configuration for dynamic API URL handling and standalone output.
Walkthrough
A comprehensive Docker setup was introduced for the Resume Matcher project. This includes new Dockerfiles for backend and frontend, production and development Docker Compose files, a detailed Docker setup guide, and an extensive .dockerignore. The frontend Next.js configuration and project scripts were updated to support Dockerized workflows, with new documentation in the README.
Changes
| File(s) | Change Summary |
|---|---|
| `.dockerignore` | Added file listing patterns to exclude unnecessary files from the Docker build context. |
| `DOCKER.md` | New documentation detailing Docker setup, configuration, troubleshooting, performance, security, and scaling. |
| `Dockerfile.backend`, `Dockerfile.frontend` | Added Dockerfiles for the backend (Python) and frontend (Node.js/Next.js) with multi-stage builds and health checks. |
| `docker-compose.yml`, `docker-compose.dev.yml` | Added Compose files defining services, dependencies, volumes, and health checks for production and development. |
| `README.md` | Enhanced with Docker installation instructions, quick start, and references to the new Docker documentation. |
| `apps/frontend/next.config.ts` | Updated Next.js config: added `output: 'standalone'` and dynamic API URL rewrites for Docker compatibility (see the sketch below). |
| `package.json` | Added npm scripts for Docker build, up, down, dev, logs, and clean commands. |
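A hedged sketch of the `apps/frontend/next.config.ts` change described in the row above: `output: 'standalone'` for the Docker build plus a rewrite that proxies API calls to the backend container through an environment variable. The variable name `API_URL` follows the sequence diagram below; the `/api/:path*` prefix is an assumption.

```typescript
import type { NextConfig } from 'next';

const nextConfig: NextConfig = {
  // Emit a self-contained server bundle for the Docker image.
  output: 'standalone',
  async rewrites() {
    // Inside Compose the backend is reachable as http://backend:8000;
    // outside Docker the default below applies.
    const apiUrl = process.env.API_URL ?? 'http://localhost:8000';
    return [
      {
        source: '/api/:path*',
        destination: `${apiUrl}/api/:path*`,
      },
    ];
  },
};

export default nextConfig;
```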
Sequence Diagram(s)
```mermaid
sequenceDiagram
    participant Dev as Developer
    participant Docker as Docker Engine
    participant Compose as Docker Compose
    participant Ollama as Ollama AI Service
    participant Backend as Backend Service
    participant Frontend as Frontend Service

    Dev->>Docker: Build images (backend, frontend)
    Dev->>Compose: Start services (docker-compose up)
    Compose->>Ollama: Start Ollama container
    Compose->>Ollama: Health check /api/tags
    Compose->>Ollama-init: Wait for Ollama healthy, pull model
    Compose->>Backend: Start after Ollama & Ollama-init ready
    Compose->>Backend: Health check /health
    Compose->>Frontend: Start after Backend healthy
    Compose->>Frontend: Health check /
    Dev->>Frontend: Access web app (port 3000)
    Frontend->>Backend: API requests (rewritten via API_URL)
    Backend->>Ollama: AI model requests
```
Estimated code review effort
3 (Moderate) | ~20 minutes
Poem
Docker ships upon the sea,
Compose and files in harmony.
Backend, frontend, Ollama too,
All set sail with something new!
Scripts and docs now lead the way;
Rabbit's paws are here to say:
"Hop aboard, deploy today!"
Adding a relevant issue here
Users may use their existing Ollama instance. It may be better to just add an env variable specifying the Ollama endpoint.
> Users may use their existing Ollama instance. It may be better to just add an env variable specifying the Ollama endpoint.

+1 on this

Good point, thanks. I will fix it.
Being able to provide your own Ollama instance rather than having to spin up another Ollama container would be nice.
> Being able to provide your own Ollama instance rather than having to spin up another Ollama container would be nice.

Sure, I am working on that.
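One possible shape for that change, sketched here with assumptions: the backend reads the Ollama endpoint from an environment variable (name `OLLAMA_BASE_URL` assumed) that defaults to the bundled container, so an existing instance can be reused by overriding it at startup.

```yaml
# docker-compose.yml (sketch)
services:
  backend:
    environment:
      # Defaults to the bundled container; override to reuse an existing
      # instance, e.g. OLLAMA_BASE_URL=http://host.docker.internal:11434
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL:-http://ollama:11434}
```

Skipping the bundled `ollama` service entirely would also need a Compose profile or an override file, since `backend` currently depends on it being healthy.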
LGTM
I followed your instructions exactly, but I get an error with the backend failing to connect to the database. Here are the logs: pastebin.com/raw/BuE83b0X
@umidjon-2231

> I followed your instructions exactly, but I get an error with the backend failing to connect to the database. Here are the logs: pastebin.com/raw/BuE83b0X
> @umidjon-2231

Thanks for the report. I will fix it soon. I suspect the problem is that the SQLite file is not being created.
@umidjon-2231 are you still working on the issue? Can I lend a hand?

> @umidjon-2231 are you still working on the issue? Can I lend a hand?

Hi @mahimairaja, yes, I'm still working on it, but help is welcome. Let me know what part you'd like to take on so we can coordinate.
@umidjon-2231 the issue is that the SQLite file is not created: since no local SQLite file exists initially, Docker mounts app.db as a directory. Fixing it by binding the parent directory into the container or by creating a separate db storage directory.
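A hedged sketch of that directory-bind fix: mount the parent data directory instead of the `app.db` file, so Docker creates a directory on first run (rather than an empty `app.db` directory) and SQLite can then create the file inside it. The `./data` path, variable name, and connection-string format are assumptions.

```yaml
services:
  backend:
    volumes:
      # Bind the directory, not the SQLite file itself.
      - ./data:/app/data
    environment:
      # Exact variable name and URL scheme depend on the backend settings.
      - DATABASE_URL=sqlite:///./data/app.db
```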