
Add Docker configuration for containerized deployment

Open · umidjon-2231 opened this pull request 5 months ago · 14 comments

Description

This PR introduces a complete Docker-based deployment solution for Resume-Matcher, providing containerized environments for both development and production. The configuration eliminates the need for complex local environment setup by containerizing all services, including the FastAPI backend, Next.js frontend, and Ollama AI service.

Relevant issue

https://github.com/srbhr/Resume-Matcher/issues/415

Type

  • [ ] Bug Fix
  • [x] Feature Enhancement
  • [x] Documentation Update
  • [ ] Code Refactoring
  • [ ] Other (please specify):

Proposed Changes

  • Added Dockerfile.backend for FastAPI application with uv package manager
  • Added Dockerfile.frontend for Next.js application with standalone output
  • Created docker-compose.yml for production environment setup
  • Created docker-compose.dev.yml for development environment with hot-reload
  • Introduced .dockerignore file to optimize Docker build context
  • Added comprehensive DOCKER.md documentation
  • Updated next.config.ts to support standalone builds and dynamic API URLs
  • Added health checks for all services in docker-compose configurations
  • Configured automated Ollama setup with gemma3:4b model initialization
  • Implemented proper service dependencies and startup orchestration
  • Added Docker volume persistence for data and AI models

Screenshots / Code Snippets (if applicable)

# docker-compose.yml - Simple production setup
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama
  
  backend:
    build:
      context: .
      dockerfile: Dockerfile.backend
    ports:
      - "8000:8000"
    depends_on:
      ollama:
        condition: service_healthy
  
  frontend:
    build:
      context: .
      dockerfile: Dockerfile.frontend
    ports:
      - "3000:3000"
    depends_on:
      backend:
        condition: service_healthy

# Simple Docker commands - no scripts needed
docker-compose build                              # Build images
docker-compose up -d                              # Start production
docker-compose -f docker-compose.dev.yml up -d   # Start development
docker-compose down                               # Stop services
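
For development mode, docker-compose.dev.yml bind-mounts the source tree so code changes reload without rebuilding the images. A minimal sketch of that idea is below; the mount paths and commands are assumptions, not copied from the PR's actual file.

# docker-compose.dev.yml - hot-reload sketch (illustrative)
services:
  backend:
    build:
      context: .
      dockerfile: Dockerfile.backend
    volumes:
      - ./apps/backend:/app              # mount backend source for live reload (path assumed)
    command: uv run uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload   # module path assumed
    ports:
      - "8000:8000"

  frontend:
    build:
      context: .
      dockerfile: Dockerfile.frontend
    volumes:
      - ./apps/frontend:/app             # mount frontend source for Next.js fast refresh (path assumed)
    command: npm run dev
    ports:
      - "3000:3000"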

How to Test

  1. Prerequisites: Install Docker and Docker Compose
  2. Clone repository: git clone <repo> && cd Resume-Matcher
  3. Build images: docker-compose build
  4. Start production: docker-compose up -d
  5. Wait for initialization: Allow 5-10 minutes for AI model download on first run
  6. Verify services:
    • Frontend: http://localhost:3000
    • Backend: http://localhost:8000/docs (API documentation)
    • Ollama: http://localhost:11434
  7. Test development mode:
    docker-compose down
    docker-compose -f docker-compose.dev.yml up -d
    
  8. Verify hot-reload: Edit frontend/backend code and confirm changes reflect automatically
  9. Clean up: docker-compose down -v to remove containers and volumes

Checklist

  • [x] The code compiles successfully without any errors or warnings
  • [x] The changes have been tested and verified
  • [x] The documentation has been updated (if applicable)
  • [x] The changes follow the project's coding guidelines and best practices
  • [x] The commit messages are descriptive and follow the project's guidelines
  • [x] All tests (if applicable) pass successfully
  • [ ] This pull request has been linked to the related issue (if applicable)

Additional Information

Key Benefits:

  • 🐳 Zero local dependencies: No need to install Python, Node.js, Ollama, or uv locally
  • πŸš€ Simple deployment: Standard docker-compose up -d command
  • πŸ”„ Development-friendly: Hot-reload support in development mode
  • πŸ€– AI-ready: Automated Ollama setup with gemma3:4b model
  • 🌐 Cross-platform: Works identically on Windows, macOS, and Linux
  • πŸ“¦ Optimized: Multi-stage Dockerfiles with minimal production images
  • πŸ”’ Production-ready: Health checks, proper user permissions, and service dependencies

Technical Implementation:

  • Backend: Python 3.12 + FastAPI + uv package manager
  • Frontend: Node.js 18 + Next.js with standalone output
  • AI: Ollama with automated gemma3:4b model download
  • Database: SQLite with Docker volume persistence
  • Architecture: Multi-service with proper dependency management

File Structure:

├── Dockerfile.backend          # FastAPI container (Python + uv)
├── Dockerfile.frontend         # Next.js container (standalone build)
├── docker-compose.yml          # Production environment
├── docker-compose.dev.yml      # Development environment
├── .dockerignore               # Build context optimization
└── DOCKER.md                  # Setup and usage documentation

Breaking Changes:

  • None - this is purely additive functionality
  • Existing local development workflow remains unchanged

Resource Requirements:

  • Minimum: 8GB RAM, 10GB disk space
  • Recommended: 16GB RAM, 20GB disk space
  • First run: Additional time for AI model download (~4GB)

Summary by CodeRabbit

  • New Features

    • Added comprehensive Docker support for both production and development, including Dockerfiles, Compose files, and a .dockerignore for optimized builds.
    • Introduced health checks and persistent storage for backend, frontend, and AI services.
    • Enhanced environment configuration and service orchestration for reliable startup and scaling.
    • Provided new npm scripts for streamlined Docker management.
  • Documentation

    • Added a detailed Docker setup and troubleshooting guide.
    • Updated the README with Docker installation instructions and references to further documentation.
  • Configuration

    • Improved frontend configuration for dynamic API URL handling and standalone output.

umidjon-2231 · Jul 24 '25

Walkthrough

A comprehensive Docker setup was introduced for the Resume Matcher project. This includes new Dockerfiles for backend and frontend, production and development Docker Compose files, a detailed Docker setup guide, and an extensive .dockerignore. The frontend Next.js configuration and project scripts were updated to support Dockerized workflows, with new documentation in the README.

Changes

File(s) | Change Summary
.dockerignore | Added file listing patterns to exclude unnecessary files from Docker build context.
DOCKER.md | New documentation detailing Docker setup, configuration, troubleshooting, performance, security, and scaling.
Dockerfile.backend, Dockerfile.frontend | Added Dockerfiles for backend (Python) and frontend (Node.js/Next.js) with multi-stage builds and health checks.
docker-compose.yml, docker-compose.dev.yml | Added Compose files defining services, dependencies, volumes, and health checks for production and development.
README.md | Enhanced with Docker installation instructions, quick start, and references to new Docker documentation.
apps/frontend/next.config.ts | Updated Next.js config: added output: 'standalone' and dynamic API URL rewrites for Docker compatibility.
package.json | Added npm scripts for Docker build, up, down, dev, logs, and clean commands.

Sequence Diagram(s)

sequenceDiagram
    participant Dev as Developer
    participant Docker as Docker Engine
    participant Compose as Docker Compose
    participant Ollama as Ollama AI Service
    participant Backend as Backend Service
    participant Frontend as Frontend Service
    participant OllamaInit as Ollama Init Service

    Dev->>Docker: Build images (backend, frontend)
    Dev->>Compose: Start services (docker-compose up)
    Compose->>Ollama: Start Ollama container
    Compose->>Ollama: Health check /api/tags
    Compose->>OllamaInit: Wait for Ollama healthy, pull model
    Compose->>Backend: Start after Ollama and OllamaInit ready
    Compose->>Backend: Health check /health
    Compose->>Frontend: Start after Backend healthy
    Compose->>Frontend: Health check /
    Dev->>Frontend: Access web app (port 3000)
    Frontend->>Backend: API requests (rewritten via API_URL)
    Backend->>Ollama: AI model requests

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Poem

Docker ships upon the sea,
Compose and files in harmony.
Backend, frontend, Ollama too,
All set sail with something new!
Scripts and docs now lead the way;
Rabbit's paws are here to say:
"Hop aboard, deploy today!" 🐇🚢

coderabbitai[bot] · Jul 24 '25

Adding a relevant issue here

vaddisrinivas · Jul 25 '25

Adding a relevant issue here

Thanks

umidjon-2231 · Jul 25 '25

Users may use their existing Ollama instance. It may be better to just add an env variable specifying the Ollama endpoint.

arsaboo · Jul 25 '25

Users may use their existing Ollama instance. It may be better to just add an env variable specifying the Ollama endpoint.

➕1 on this

vaddisrinivas · Jul 25 '25

Good point, thanks. I will fix it.
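
For context, a change along those lines could look roughly like the sketch below. The variable name OLLAMA_BASE_URL is a placeholder, not something defined in this PR, and the actual setting the backend reads may differ.

# Pointing the backend at an existing Ollama instance (variable name is hypothetical)
services:
  backend:
    environment:
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL:-http://ollama:11434}   # falls back to the bundled container
# Example with an external instance, skipping the bundled ollama service:
#   OLLAMA_BASE_URL=http://host.docker.internal:11434 docker-compose up -d backend frontend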


umidjon-2231 · Jul 25 '25

Being able to point to your own Ollama instance rather than having to spin up another Ollama container would be nice.

smit-io · Jul 25 '25

Being able to point to your own Ollama instance rather than having to spin up another Ollama container would be nice.

Sure, I am working on that.

umidjon-2231 · Jul 25 '25

LGTM

mahimairaja · Jul 25 '25

I followed your instructions to the letter, and I get an error with the backend not connecting to the database. Here are the logs: pastebin.com/raw/BuE83b0X

@umidjon-2231

Noobzik · Jul 31 '25

I followed your instructions to the letter, and I get an error with the backend not connecting to the database. Here are the logs: pastebin.com/raw/BuE83b0X

@umidjon-2231

Thanks for the report. I will fix it soon. I guess the problem is that the SQLite file is not created.

umidjon-2231 · Jul 31 '25

@umidjon-2231 Are you still working on this issue? Can I lend a hand?

mahimairaja · Aug 13 '25

@umidjon-2231 Are you still working on this issue? Can I lend a hand?

Hi @mahimairaja, yes, I'm still working on it, but help is welcome. Let me know what part you'd like to take on so we can coordinate.

umidjon-2231 · Aug 14 '25

@umidjon-2231 The issue is that the SQLite file is not created: since no local SQLite file exists initially, Docker mounts app.db as a directory. It can be fixed by bind-mounting the parent directory into the container, or by creating a separate database storage directory.
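
A minimal sketch of that directory-based fix; the ./data path and the DATABASE_URL setting name are illustrative assumptions, not taken from this PR.

# Mount the parent data directory rather than the app.db file itself (illustrative)
services:
  backend:
    volumes:
      - ./data:/app/data                     # SQLite can create app.db inside this directory
    environment:
      - DATABASE_URL=sqlite:///./data/app.db   # hypothetical setting name; adjust to the app's config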

harmeetsingh-work · Oct 09 '25