lock_code_manager
Fix startup code flapping issue
Summary
- Prevents startup code flapping and reduces network churn by gating syncs until data is ready and locks are reachable.
- Removes Tenacity dependency and uses internal connection checks + rate limiting so offline locks don't spam retries.
- Adds retry scheduling + availability guards for in-sync entities, limits multi-slot startup sync to one call per slot, aligns Lovelace resource handling with single global registration, and cleans Lovelace resource assertions.
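For illustration, the one-call-per-slot startup behavior amounts to deduplicating the pending out-of-sync slots before dispatch. This is a hypothetical sketch (the function name and call-tuple shape are ours, not the integration's):

```python
def plan_startup_syncs(out_of_sync_slots):
    """Collapse duplicates so startup issues at most one sync call per slot."""
    seen = set()
    calls = []
    for slot in out_of_sync_slots:
        if slot in seen:
            continue  # this slot already has a sync queued
        seen.add(slot)
        calls.append(("sync", slot))
    return calls
```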
Root Cause
- In-sync entity could evaluate before coordinator/dep states were ready, causing redundant set/clear operations at startup and leaving out-of-sync state when locks were offline.
- Tenacity-based retries were unnecessary after adding provider-level rate limiting and connection checks.
- Lovelace resource handling needed a YAML-mode-safe path and had to remain global (once registered, the resource works for all entries).
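The readiness gate described above can be sketched as a pure predicate (hypothetical names; the real checks live inside the in-sync binary sensor and consult Home Assistant state objects):

```python
# States that mean a dependent entity is not yet usable.
UNREADY = {None, "unknown", "unavailable"}

def should_sync(coordinator_data, states, dep_entity_ids):
    """Allow a sync only once coordinator data exists and every dependent
    entity (PIN/name/active/code) reports a real state."""
    if not coordinator_data:
        return False  # coordinator has not completed its first refresh
    return all(states.get(eid) not in UNREADY for eid in dep_entity_ids)
```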
Changes
- Keep initial-load guard: in-sync entity sets state once dependencies exist but skips operations until after first load.
- Ignore irrelevant/unavailable state-change events; wait for dependent entities (PIN/name/active/code) and coordinator data before syncing.
- On `LockDisconnected`, schedule a 10s retry via `async_call_later`; cancel pending retries on success; keep `last_update_success` gating to avoid spinning when offline.
- `BaseLock` now performs explicit connection checks before any get/set/clear/refresh, still serializes operations with 2s spacing, and no longer depends on Tenacity (removed from manifest/requirements). Connection failures do not advance the rate-limit timer.
- Lovelace resource handling stays global: register the strategy once, remember whether we auto-registered, and skip removal when HA is in YAML mode (mirrors the ha_scrypted safeguards).
- Tests updated/added for missing PIN state gating, entity availability, retry-on-reconnect for set/clear, one-time per-slot startup sync, rate-limit timing after connection failures, YAML-mode resource unload guard, and Lovelace resource assertions now check resources instead of logs.
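To make the retry and rate-limit rules above concrete, here is a minimal self-contained asyncio model. It is a sketch, not the integration's code: `BaseLockSketch`, `op`, and the shortened delays are our stand-ins, and `loop.call_later` plays the role of HA's `async_call_later`:

```python
import asyncio
import time

RETRY_DELAY = 0.05  # stands in for the 10s reconnect retry
RATE_LIMIT = 0.05   # stands in for the 2s spacing between lock operations

class LockDisconnected(Exception):
    """Raised when the lock cannot be reached."""

class BaseLockSketch:
    """Models the retry/rate-limit behavior; not the integration's real class."""

    def __init__(self, is_connected):
        self._is_connected = is_connected
        self._last_success = 0.0  # rate-limit timer; only advances on success
        self._retry_handle = None

    async def _call(self, op):
        # Explicit connection check before any get/set/clear/refresh.
        if not self._is_connected():
            raise LockDisconnected("lock unreachable")
        # Serialize operations: wait out the remaining rate-limit window.
        wait = self._last_success + RATE_LIMIT - time.monotonic()
        if wait > 0:
            await asyncio.sleep(wait)
        result = await op()
        # A connection failure raises before this line, so failed attempts
        # never advance the rate-limit timer.
        self._last_success = time.monotonic()
        return result

    async def set_usercode(self, op):
        try:
            result = await self._call(op)
        except LockDisconnected:
            # Schedule one retry; HA's async_call_later plays this role.
            loop = asyncio.get_running_loop()
            self._retry_handle = loop.call_later(
                RETRY_DELAY,
                lambda: asyncio.ensure_future(self.set_usercode(op)),
            )
            return None
        # Success: cancel any pending retry so it cannot double-fire.
        if self._retry_handle is not None:
            self._retry_handle.cancel()
            self._retry_handle = None
        return result
```

The key design point mirrored here is that the rate-limit timestamp is written only after a successful operation, so an offline lock neither spams retries nor pushes back the next legitimate call.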
Test Plan
- [x] `source venv/bin/activate && pytest tests/test_binary_sensor.py -q`
- [x] `source venv/bin/activate && pytest tests/_base/test_provider.py -k disconnected -q`
- [x] `source venv/bin/activate && pytest tests/test_init.py -q`
- [x] `source venv/bin/activate && pytest tests/_base/test_provider.py -k connection_failure_does_not_rate_limit_next_operation -q`
- [x] `source venv/bin/activate && pytest tests/test_binary_sensor.py -k startup_out_of_sync_slots_sync_once -q`
Related Issues
- Startup code flapping on load
- Sync drift when locks are temporarily offline
- Network churn from redundant sync calls
- Lovelace resource cleanup safety when HA uses YAML resources
Codecov Report
:x: Patch coverage is 95.30201% with 7 lines in your changes missing coverage. Please review.
:white_check_mark: Project coverage is 91.07%. Comparing base (f0a525c) to head (50fa443).
:warning: Report is 41 commits behind head on main.
Additional details and impacted files
```diff
@@            Coverage Diff             @@
##             main     #532      +/-   ##
==========================================
+ Coverage   90.39%   91.07%   +0.68%
==========================================
  Files          20       20
  Lines        1280     1367      +87
==========================================
+ Hits         1157     1245      +88
+ Misses        123      122       -1
```
| Files with missing lines | Coverage Δ | |
|---|---|---|
| custom_components/lock_code_manager/config_flow.py | 100.00% <100.00%> (ø) | |
| custom_components/lock_code_manager/coordinator.py | 100.00% <100.00%> (+13.04%) | :arrow_up: |
| ...components/lock_code_manager/providers/zwave_js.py | 36.93% <ø> (ø) | |
| ...om_components/lock_code_manager/providers/_base.py | 85.14% <97.22%> (+2.36%) | :arrow_up: |
| custom_components/lock_code_manager/__init__.py | 95.45% <25.00%> (-1.31%) | :arrow_down: |
| ...stom_components/lock_code_manager/binary_sensor.py | 98.57% <97.14%> (+1.25%) | :arrow_up: |
@copilot review
do you need any testers? 👀
sorry I missed this, but yes! You don't have to ask for permission to test things like this haha. Hopefully I can get back to a place where I have the capacity to do that myself, but I just haven't had time :(