feat(metrics): collect the DOCKER_HOST environment variable path
Which issue(s) does this change fix?
Why is this change necessary?
It gives SAM CLI a proxy measurement of which tools users are using as its container backend. This allows the SAM CLI team to prioritize testing and features for the various tools users may be using to provide their DOCKER_HOST.
How does it address the issue?
This change collects only the last segment of the DOCKER_HOST URI/path, in the hope that this segment sheds light on which tool the user is running with SAM CLI. For example, if a user is using a local installation of Docker, this new metric would collect docker.sock. Only the last path segment is checked so that we do not needlessly collect what may be private URLs to remote Docker hosts that users might be using.
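As a rough illustration of the approach (the helper name here is hypothetical and not the exact code in samcli/lib/telemetry/metric.py), the collection boils down to parsing DOCKER_HOST and keeping only the basename of its path:

```python
# Hypothetical sketch of the approach -- illustrative only, not the exact
# implementation in samcli/lib/telemetry/metric.py.
import os
from urllib.parse import urlparse


def _docker_host_suffix() -> str:
    """Return only the last path segment of DOCKER_HOST (e.g. "docker.sock")."""
    docker_host = os.environ.get("DOCKER_HOST", "")
    if not docker_host:
        return ""
    parsed = urlparse(docker_host)
    # e.g. unix:///var/run/docker.sock -> parsed.path == "/var/run/docker.sock"
    # Only the basename is recorded, so private remote-host URLs are not collected.
    return str(os.path.basename(parsed.path)) if parsed.path else ""
```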
What side effects does this change have?
None?
Mandatory Checklist
PRs will only be reviewed after the checklist is complete
- [N/A] Add input/output type hints to new functions/methods
- [N/A] Write design document if needed (Do I need to write a design document?)
- [N/A] Write/update unit tests
- [N/A] Write/update integration tests
- [N/A] Write/update functional tests if needed
- [x] `make pr` passes
- [N/A] `make update-reproducible-reqs` if dependencies were changed
- [N/A] Write documentation
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
Thanks for the contribution! There are a few errors in `make pr` (and in other tests), if you can take a look.
At least in `make pr`:
samcli/lib/telemetry/metric.py:495: error: Missing positional argument "path" in call to "exists" [call-arg]
samcli/lib/telemetry/metric.py:497: error: Returning Any from function declared to return "str" [no-any-return]
samcli/lib/telemetry/metric.py:500: error: Returning Any from function declared to return "str" [no-any-return]
The first one is because of `if os.path.exists():` (I imagine there's a variable missing there, probably `parsed.path`).
The other ones might just be typing errors with `os.path.basename` 🤔. You might be able to wrap them in `str(...)` to fix those.
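For reference, a sketch of what applying both suggestions could look like (hypothetical helper and variable names; the actual code around samcli/lib/telemetry/metric.py:495-500 isn't shown in this thread):

```python
# Hypothetical sketch applying both suggestions above; not the actual code in
# samcli/lib/telemetry/metric.py.
import os
from urllib.parse import urlparse


def _docker_host_basename(docker_host: str) -> str:
    parsed = urlparse(docker_host)
    if os.path.exists(parsed.path):                # pass the missing path argument
        return str(os.path.basename(parsed.path))  # str(...) resolves [no-any-return]
    return str(os.path.basename(parsed.path or docker_host))
```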
@valerena, is the failing Windows smoke test related to this change? If so, I'm not sure how, and I could use some guidance on how to fix it.
There are some failing tests (related to the `logs` CLI command), although they don't seem to be related to the changes in this PR.
This PR introduces a new metric dimension to the data. We need to update the data schema before merging this PR.
Ran `pytest -vv tests/integration/telemetry` and confirmed success:
pytest -vv tests/integration/telemetry --tb=short
⋮
↳ Purpose: Run all integration telemetry tests
============================= test session starts ==============================
platform darwin -- Python 3.11.12, pytest-8.4.1, pluggy-1.5.0 -- /Users/vichym/.pyenv/versions/3.11.12/envs/aws-aws-sam-cli-env-3.11/bin/python
cachedir: .pytest_cache
hypothesis profile 'default'
Test order randomisation NOT enabled. Enable with --random-order or --random-order-bucket=<bucket_type>
...
tests/integration/telemetry/test_telemetry_contract.py::TestTelemetryContract::test_must_not_send_metrics_if_disabled_using_envvar
-------------------------------- live log call ---------------------------------
INFO werkzeug:_internal.py:97 127.0.0.1 - - [27/Aug/2025 23:58:52] "POST /metrics HTTP/1.1" 200 -
INFO werkzeug:_internal.py:97 127.0.0.1 - - [27/Aug/2025 23:58:52] "POST /metrics HTTP/1.1" 200 -
PASSED [ 92%]
tests/integration/telemetry/test_telemetry_contract.py::TestTelemetryContract::test_must_send_metrics_if_enabled_via_envvar
-------------------------------- live log call ---------------------------------
INFO werkzeug:_internal.py:97 127.0.0.1 - - [27/Aug/2025 23:58:56] "POST /metrics HTTP/1.1" 200 -
INFO werkzeug:_internal.py:97 127.0.0.1 - - [27/Aug/2025 23:58:56] "POST /metrics HTTP/1.1" 200 -
PASSED [100%]
======================== 12 passed, 2 skipped in 36.17s ========================
Thanks for the fixes, @vicheey. LGTM