# Test results plugin
As a user, I want to be able to integrate my test suites into Code PushUp and aggregate the results in my report.
## User story examples
- A company runs multiple tools in their pipelines and their own environment. They can provide us with result artifacts.
- A smaller project does not have a CI setup and runs its unit tests locally before committing.
- A chaotic project has issues with test stability. The tests have side effects, and errors are expected.
- A company already has a plugin/dashboard in place. They have certain expectations regarding test information when trying out Code PushUp (e.g. test trends).
## Research
- [x] Research has been done on test result formats that need to be supported (JUnit, XUnit, NUnit, ...).
- [ ] Design additional `details` properties required for this plugin to contain all necessary information.
- [ ] (can be a separate issue) Research has been done on how to set up blob storage for videos or screenshots.
- [ ] A better way of displaying additional information:
  - An error message format that does not just display raw text with the full call stack.
  - Test suite name and test case name are provided separately.
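For reference, the standard JUnit XML format already separates the suite name, test case name, and failure message from the raw call stack (the stack trace is the body of the `failure` element); the attribute values below are made-up sample data:

```xml
<testsuite name="LoginForm" tests="2" failures="1" time="0.42">
  <testcase classname="LoginForm" name="submits valid credentials" time="0.21" />
  <testcase classname="LoginForm" name="rejects empty password" time="0.21">
    <failure message="expected error to be shown" type="AssertionError">
AssertionError: expected error to be shown
    at login.spec.ts:42:5
    </failure>
  </testcase>
</testsuite>
```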
## Acceptance criteria
- [ ] There is one plugin that aggregates all tools and their test results.
- [ ] At least one testing tool is supported in the MVP version (Vitest/Jest/Cypress).
- [ ] An audit is a specific test result from a given tool (successor of test results format spike).
- [ ] An issue is a failing test with information about the test (name, file, error message, screenshot?).
- [ ] The score and value reflect the proportion of passing tests (range 0-1 and 0-100, respectively).
- [ ] This plugin is fully tested.
- [ ] This plugin has sufficient documentation for both configuration and expansion.
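The score/value mapping above can be sketched as follows (the type and function names are illustrative, not the actual Code PushUp API; treating skipped tests as excluded from the total is an assumption, not a decision from this issue):

```typescript
type TestSummary = { passed: number; failed: number; skipped: number };

// Hypothetical helper: map a test summary to the audit's score (0-1)
// and value (0-100). Skipped tests are excluded from the total here.
function toScoreAndValue({ passed, failed }: TestSummary) {
  const total = passed + failed;
  const score = total === 0 ? 1 : passed / total;
  return { score, value: Math.round(score * 100) };
}

console.log(toScoreAndValue({ passed: 18, failed: 2, skipped: 1 }));
// -> { score: 0.9, value: 90 }
```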
## Implementation details
- Base flow: `run_tests?(command)` → `parse_results(format)` → `populate_summary(results)` → `populate_issues(issue_config)` → `auditOutput`
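The base flow above could be sketched like this (all types, names, and function bodies are hypothetical placeholders, not the actual plugin implementation; a real `parseResults` would read JUnit/XUnit/NUnit output instead of accepting pre-parsed results):

```typescript
type TestCaseResult = { suite: string; name: string; passed: boolean; message?: string };
type Issue = { message: string; severity: 'error' };
type AuditOutput = { slug: string; score: number; value: number; issues: Issue[] };

// Hypothetical parser stand-in: a real plugin would parse a results file here.
function parseResults(results: TestCaseResult[]): TestCaseResult[] {
  return results;
}

// Summarize pass counts for score calculation.
function populateSummary(results: TestCaseResult[]) {
  const passed = results.filter(r => r.passed).length;
  return { passed, total: results.length };
}

// Turn each failing test into an issue with suite, name, and error message.
function populateIssues(results: TestCaseResult[]): Issue[] {
  return results
    .filter(r => !r.passed)
    .map(r => ({
      message: `${r.suite} > ${r.name}: ${r.message ?? 'failed'}`,
      severity: 'error' as const,
    }));
}

function runAudit(slug: string, raw: TestCaseResult[]): AuditOutput {
  const results = parseResults(raw);
  const { passed, total } = populateSummary(results);
  const score = total === 0 ? 1 : passed / total;
  return { slug, score, value: Math.round(score * 100), issues: populateIssues(results) };
}
```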
- [ ] Decide on the plugin hierarchy approach:
  - Either there is a group for every testing tool (which may offer different types of testing). For example, a Cypress group would contain component and E2E test audits, while a Jest group would contain unit and integration test audits. There would be a separate user story per group.
  - Or there is one generic audit with configuration for every tool (but then we would still need to somehow insert the icon, documentation, and descriptions that are relevant only to that tool, and maintain a set of supported ones).
  - 🗨️ I think we should create building blocks that every new tool integration would build on - parsing common formats, converting results into issues, handling different kinds of additional information. Creating a new group/audit would then not be as difficult.
- [ ] Decide on whether (or when) we should offer to run the testing tool:
- Pros: For unit tests it could be useful (fast results).
- Cons:
- For E2E tests we could be running it for hours.
- We would need to handle orchestration (behaviour on error during tests/parallelism/environment/cross-platform) ourselves. Even if we just triggered their pipelines in CI, we would have to wait for it. ⛔
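To illustrate the group-per-tool option from the hierarchy decision above, a sketch of a Cypress group referencing its audits might look like this (the slugs and object shapes are illustrative, not the actual Code PushUp config schema):

```typescript
type Audit = { slug: string; title: string };
type Group = { slug: string; title: string; refs: { slug: string; weight: number }[] };

// Hypothetical audits: one per test type that the tool offers.
const cypressAudits: Audit[] = [
  { slug: 'cypress-component-tests', title: 'Cypress component tests' },
  { slug: 'cypress-e2e-tests', title: 'Cypress E2E tests' },
];

// One group per testing tool, referencing that tool's audits with equal weight.
const cypressGroup: Group = {
  slug: 'cypress',
  title: 'Cypress',
  refs: cypressAudits.map(({ slug }) => ({ slug, weight: 1 })),
};
```

A Jest group would follow the same shape with unit and integration test audits, which is what makes shared building blocks attractive.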
## Known issues
- If an audit is the test result of a full test suite, we cannot track a trend for an individual test scenario (unless we expand the issues view).