
feat: Set up Python testing infrastructure with Poetry and pytest

Open • llbbl opened this issue 6 months ago • 0 comments

Set Up Python Testing Infrastructure

Summary

This PR establishes a complete testing infrastructure for the DNSChef project, migrating from Pipenv to Poetry for modern dependency management and setting up pytest as the testing framework.

Changes Made

Package Management

  • Migrated to Poetry: Created pyproject.toml with Poetry configuration
  • Preserved existing dependencies: Migrated dnslib from requirements.txt
  • Maintained compatibility: kept the existing Pipfile so the previous Pipenv workflow still works

Testing Dependencies

Added the following development dependencies (the resulting pyproject.toml sections are sketched after this list):

  • pytest (^7.4.0) - Core testing framework
  • pytest-cov (^4.1.0) - Coverage reporting
  • pytest-mock (^3.11.0) - Mocking utilities
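
A rough sketch of those pyproject.toml sections, assuming the newer dependency-group syntax; the package metadata values and the dnslib constraint are illustrative placeholders, not copied from the repository:

[tool.poetry]
name = "dnschef"
version = "0.0.0"                  # placeholder; the real version lives in the repository
description = "DNS proxy for penetration testers and malware analysts"
authors = ["..."]                  # placeholder

[tool.poetry.dependencies]
python = "^3.7"                    # matches the Python 3.7+ note below
dnslib = "*"                       # carried over from requirements.txt; the real pin may differ

[tool.poetry.group.dev.dependencies]
pytest = "^7.4.0"
pytest-cov = "^4.1.0"
pytest-mock = "^3.11.0"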

Testing Configuration

Configured comprehensive pytest settings in pyproject.toml (a sketch follows this list):

  • Test discovery: Configured patterns for finding test files
  • Coverage settings:
    • 80% coverage threshold
    • HTML and XML report generation
    • Branch coverage enabled
    • Excluded non-source files from coverage
  • Custom markers: Added unit, integration, and slow test markers
  • Strict mode: Enabled strict markers and configuration
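
A sketch of what these settings could look like in pyproject.toml; the coverage target (dnschef) and the exact option and marker wording are assumptions rather than a copy of the real file:

[tool.pytest.ini_options]
minversion = "7.0"
testpaths = ["tests"]
addopts = [
    "--strict-markers",
    "--strict-config",
    "--cov=dnschef",               # coverage target is an assumption; point it at the real module/package
    "--cov-report=term-missing",
    "--cov-report=html",
    "--cov-report=xml",
    "--cov-fail-under=80",
]
markers = [
    "unit: fast, isolated unit tests",
    "integration: tests that exercise sockets or the full server",
    "slow: long-running tests (deselect with -m 'not slow')",
]

[tool.coverage.run]
branch = true
omit = ["tests/*"]                 # keep non-source files out of the coverage numbers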

Directory Structure

Created organized testing structure:

tests/
├── __init__.py
├── conftest.py          # Shared fixtures
├── test_setup_validation.py  # Infrastructure validation
├── unit/
│   └── __init__.py
└── integration/
    └── __init__.py

Shared Fixtures (conftest.py)

Created comprehensive fixtures for common testing needs (two of them are sketched after this list):

  • temp_dir and temp_file - Temporary file handling
  • mock_config - DNS server configuration mocking
  • mock_dns_query - DNS query object mocking
  • mock_socket - Network socket mocking
  • mock_logger - Logging mocking
  • dns_server_config - Sample configurations
  • sample_dns_records - Test DNS records
  • reset_environment - Environment cleanup
  • capture_stdout - Output capturing
  • mock_args - Command-line argument mocking
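
The full set lives in tests/conftest.py; as a flavour, two of these fixtures might look roughly like this (the sample query and socket return values are made up for illustration):

# Illustrative excerpt only -- the actual fixtures are defined in tests/conftest.py.
import socket
from unittest.mock import MagicMock

import pytest
from dnslib import DNSRecord


@pytest.fixture
def mock_socket():
    """A MagicMock standing in for a UDP socket, so tests never touch the network."""
    sock = MagicMock(spec=socket.socket)
    sock.recvfrom.return_value = (b"", ("127.0.0.1", 53535))  # hypothetical client address
    return sock


@pytest.fixture
def mock_dns_query():
    """A minimal dnslib query object for an A lookup of example.com."""
    return DNSRecord.question("example.com", "A")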

Development Workflow

Updated .gitignore with:

  • Testing artifacts (.pytest_cache/, .coverage, htmlcov/, etc.)
  • Claude settings (.claude/*)
  • Virtual environments
  • IDE files
  • Build artifacts
  • Note: poetry.lock is intentionally gitignored; DNSChef is distributed as a tool whose users resolve their own dependency versions, not a deployed application that needs an exactly pinned lock file

Validation

Created test_setup_validation.py to verify the following (sketched after the list):

  • All testing dependencies are properly installed
  • Fixtures are available and functional
  • Custom markers work correctly
  • The main module can be imported
  • Coverage tracking is enabled
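
A sketch of the kind of checks this file performs; the module name dnschef and the fixture return types are assumptions, so the real file will differ in detail:

# Illustrative sketch of tests/test_setup_validation.py-style checks.
import importlib

import pytest


def test_testing_dependencies_importable():
    """pytest, pytest-cov and pytest-mock should all be installed."""
    for module in ("pytest", "pytest_cov", "pytest_mock"):
        assert importlib.import_module(module) is not None


def test_main_module_importable():
    """The DNSChef code itself can be imported (module name assumed to be 'dnschef')."""
    assert importlib.import_module("dnschef") is not None


@pytest.mark.unit
def test_fixtures_and_markers_work(temp_dir):
    """Custom markers pass --strict-markers and shared fixtures resolve."""
    assert temp_dir is not None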

How to Use

Install Dependencies

poetry install

Run Tests

You can use either command; how these aliases can be wired up in pyproject.toml is sketched below:

poetry run test
poetry run tests
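
For these aliases to exist, pyproject.toml needs script entries; one plausible wiring (the pytest:main target is an assumption, not confirmed from the repository) is:

[tool.poetry.scripts]
# Hypothetical aliases: both names call pytest's main entry point,
# so extra arguments (e.g. -m unit) are passed straight through to pytest.
test = "pytest:main"
tests = "pytest:main"

With entries like these, poetry run test -m "not slow" should behave the same as poetry run pytest -m "not slow".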

Run Specific Tests

poetry run pytest tests/unit/
poetry run pytest -m unit
poetry run pytest -m "not slow"

Generate Coverage Reports

Tests automatically generate coverage reports:

  • Terminal output with missing lines
  • HTML report in htmlcov/ directory
  • XML report as coverage.xml

Notes

  • The project requires Python 3.7+ due to testing dependency requirements
  • Coverage threshold is set to 80% - tests will fail if coverage drops below this
  • The infrastructure is ready for developers to start writing actual unit and integration tests
  • No actual unit tests for the codebase were written - only infrastructure setup and validation

Next Steps

Developers can now:

  1. Write unit tests in tests/unit/ (a minimal example is sketched after this list)
  2. Write integration tests in tests/integration/
  3. Use the provided fixtures for common testing scenarios
  4. Run tests with coverage to ensure code quality
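
For example, a first unit test dropped into tests/unit/ could look like this; the fixture contents are assumptions, so the assertions are deliberately loose:

# tests/unit/test_example.py -- illustrative starting point, not part of this PR.
import pytest


@pytest.mark.unit
def test_sample_records_fixture_is_populated(sample_dns_records):
    """Shared fixtures from conftest.py are available without imports."""
    assert sample_dns_records  # exact structure depends on the fixture definition


@pytest.mark.integration
@pytest.mark.slow
def test_full_server_roundtrip_placeholder():
    """Integration tests live in tests/integration/ and can combine markers."""
    pytest.skip("real integration tests still need to be written")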

llbbl • Jun 24 '25 04:06