N8N integration
What does this PR do?
Motivation
Review checklist (to be filled by reviewers)
- [ ] Feature or bugfix MUST have appropriate tests (unit, integration, e2e)
- [ ] Add the `qa/skip-qa` label if the PR doesn't need to be tested during QA.
- [ ] If you need to backport this PR to another branch, you can add the `backport/<branch-name>` label to the PR and it will automatically open a backport PR once this one is merged.
Codecov Report
:x: Patch coverage is 82.00000% with 18 lines in your changes missing coverage. Please review.
:white_check_mark: Project coverage is 89.02%. Comparing base (9e039f7) to head (127f8c5).
:warning: Report is 4 commits behind head on master.
Additional details and impacted files
⚠️ The `qa/skip-qa` label has been added with shippable changes
The following files, which will be shipped with the agent, were modified in this PR, and the `qa/skip-qa` label has been added.
You can ignore this if you are sure the changes in this PR do not require QA. Otherwise, consider removing the label.
List of modified files that will be shipped with the agent
n8n/datadog_checks/n8n/__about__.py
n8n/datadog_checks/n8n/__init__.py
n8n/datadog_checks/n8n/check.py
n8n/datadog_checks/n8n/config_models/__init__.py
n8n/datadog_checks/n8n/config_models/defaults.py
n8n/datadog_checks/n8n/config_models/instance.py
n8n/datadog_checks/n8n/config_models/shared.py
n8n/datadog_checks/n8n/config_models/validators.py
n8n/datadog_checks/n8n/data/conf.yaml.example
n8n/datadog_checks/n8n/metrics.py
n8n/changelog.d/21835.added
n8n/pyproject.toml
n8n/hatch.toml
So a few things here that we should clean up:
- For counter metrics, on the raw side we'll need to remove the `_total` suffix, otherwise it won't match.
- On the Datadog side, the metric will come appended with the `.count` suffix. We should update that in the metadata.
- Your current unit test has an e2e test in it. It uses the `dd_agent_check` fixture, which doesn't work in unit tests. Chances are it's being skipped.
- Your e2e test doesn't assert metrics. The e2e-looking test in `test_unit` won't run because we derive the test type from the filename, so it won't look in that file for an e2e test.
The combination of points 3 and 4 is what's causing the tests to pass. I think you should either assert metrics in a unit test and mock the HTTP response, or move the metric checking to the e2e test. Rough sketches of both below.
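For the first two points, a minimal sketch of what I mean in `metrics.py`, using a hypothetical `n8n_workflow_executions_total` counter (the real names in this integration will differ):

```python
# metrics.py -- hypothetical metric name, shown only to illustrate the renaming.
# For OpenMetrics counters, map the raw name WITHOUT the `_total` suffix so the
# lookup matches; the submitted metric then gets a `.count` suffix on the
# Datadog side, which is what the metadata should list.
METRIC_MAP = {
    'n8n_workflow_executions': 'workflow.executions',  # submitted as n8n.workflow.executions.count
}
```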
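For the tests, a sketch of the split I have in mind. This assumes the check class is `N8nCheck`, that `instance` comes from the integration's `conftest.py`, and it reuses the same made-up metric name as above; it's not meant as the exact code to ship.

```python
# test_unit.py -- assert metrics here, against a mocked HTTP response.
import pytest

from datadog_checks.n8n import N8nCheck


def test_check(dd_run_check, aggregator, instance, mock_http_response):
    # Serve a canned /metrics payload instead of hitting a live endpoint
    # (fixture path is hypothetical).
    mock_http_response(file_path='tests/fixtures/metrics.txt')
    check = N8nCheck('n8n', {}, [instance])
    dd_run_check(check)

    # Hypothetical metric; counters show up with the `.count` suffix.
    aggregator.assert_metric('n8n.workflow.executions.count')
    aggregator.assert_all_metrics_covered()


# test_e2e.py -- only e2e tests live here, so the dd_agent_check fixture actually runs.
@pytest.mark.e2e
def test_e2e(dd_agent_check, instance):
    aggregator = dd_agent_check(instance, rate=True)
    aggregator.assert_metric('n8n.workflow.executions.count')
```

That way the metric assertions sit in a test that really runs, and the e2e test still verifies the check against a live environment.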