
feat: Set up comprehensive Python testing infrastructure

llbbl opened this issue 3 months ago · 0 comments

Set up Python Testing Infrastructure

Summary

This PR establishes a complete testing infrastructure for the Home Assistant Tion BTLE custom component project using Poetry as the package manager.

Changes Made

  • Package Management: Configured Poetry with pyproject.toml including project metadata and dependencies

  • Testing Dependencies: Added core testing packages:

    • pytest (^8.0.0) - Main testing framework
    • pytest-cov (^4.0.0) - Coverage reporting
    • pytest-mock (^3.12.0) - Mocking utilities
    • pytest-asyncio (^0.23.0) - Async testing support
  • Testing Configuration: Comprehensive pytest configuration in pyproject.toml:

    • Test discovery patterns for files, classes, and functions
    • Coverage settings with 80% threshold requirement
    • HTML and XML coverage report generation
    • Custom test markers: unit, integration, slow
    • Async testing mode set to auto
  • Directory Structure:

    tests/
    ├── __init__.py
    ├── conftest.py          # Shared fixtures
    ├── test_validation.py   # Infrastructure validation tests
    ├── unit/
    │   └── __init__.py
    └── integration/
        └── __init__.py
    
  • Shared Fixtures: Created comprehensive fixtures in conftest.py:

    • temp_dir - Temporary directory for test files
    • mock_config_entry - Mock Home Assistant config entry
    • mock_hass - Mock Home Assistant instance
    • mock_ble_device - Mock Bluetooth device
    • mock_tion_device - Mock Tion device with realistic responses
    • sample_device_data - Realistic test data
  • Development Scripts: Poetry commands for easy testing:

    • poetry run test - Run all tests
    • poetry run tests - Alternative command (both work identically)
  • Git Configuration: Updated .gitignore with:

    • Testing artifacts (.pytest_cache/, .coverage, htmlcov/, coverage.xml)
    • Claude Code settings (.claude/*)
    • Python build artifacts and virtual environments
    • IDE and OS specific files

Instructions for Running Tests

  1. Install dependencies:

    poetry install
    
  2. Run all tests:

    poetry run test
    # or
    poetry run tests
    
  3. Run specific test categories:

    poetry run pytest -m unit          # Unit tests only
    poetry run pytest -m integration   # Integration tests only
    poetry run pytest -m "not slow"    # Exclude slow tests
    
  4. Generate coverage reports:

    poetry run pytest --cov-report=html  # HTML report in htmlcov/
    poetry run pytest --cov-report=xml   # XML report for CI
    
  5. Run with different verbosity:

    poetry run pytest -v     # Verbose
    poetry run pytest -vv    # Very verbose
    poetry run pytest -q     # Quiet
    

Validation

The setup includes validation tests (tests/test_validation.py) that verify:

  • ✅ Project structure exists correctly
  • ✅ Basic imports work (where dependencies are available)
  • ✅ Pytest configuration is working
  • ✅ Custom test markers function properly
  • ✅ Shared fixtures are accessible
  • ✅ Async testing capabilities work
  • ✅ Coverage reporting generates properly

Test Results: All 9 validation tests pass, confirming the testing infrastructure is ready for use.
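The async validation item above can be illustrated with a minimal coroutine test. With `asyncio_mode` set to `auto`, pytest-asyncio collects and awaits it without an explicit marker (the test name is invented, not from the actual validation suite):

```python
import asyncio

async def test_async_support():
    # pytest-asyncio (auto mode) awaits this coroutine like a normal test.
    await asyncio.sleep(0)
    assert True
```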

Configuration Choices

  • Poetry over pip: Chosen for better dependency management and lock file support
  • Coverage threshold: Set to 80% to ensure good test coverage without being overly restrictive
  • Test markers: Added unit, integration, and slow for flexible test categorization
  • Async support: Enabled automatic async test handling to match Home Assistant's async architecture
  • Multiple output formats: Coverage reports in both HTML (development) and XML (CI) formats

Next Steps

Developers can now:

  1. Write unit tests in tests/unit/
  2. Write integration tests in tests/integration/
  3. Use the provided fixtures for consistent test setup
  4. Run tests locally with immediate feedback
  5. Generate coverage reports to identify untested code
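As a sketch of step 3, a unit test consuming one of the shared fixtures might look like the following. The fixture is re-declared inline so the snippet is self-contained; the real one lives in tests/conftest.py, and the attribute values shown are assumptions, not taken from the project.

```python
import pytest
from unittest.mock import MagicMock

@pytest.fixture
def mock_ble_device():
    # Stand-in for the `mock_ble_device` fixture from tests/conftest.py;
    # the address and name are illustrative values only.
    device = MagicMock()
    device.address = "AA:BB:CC:DD:EE:FF"
    device.name = "Tion Breezer"
    return device

@pytest.mark.unit
def test_device_has_mac_address(mock_ble_device):
    # A MAC address has six octets separated by five colons.
    assert mock_ble_device.address.count(":") == 5
```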

The testing infrastructure is ready for day-to-day use and follows established Python testing conventions.

llbbl · Sep 03 '25 23:09