Spike: Backstop the local Dataverse QA process
We will be augmenting the Dataverse QA process with developer cycles.
Proposed:
Requirements: (already part of the process)
- The definition of done is stated in the issue itself.
- The definition of how to test is stated in the PR.
For review: (already part of the process)
- Dev A develops the code
- Dev B does the review
QA:
- The QA person will update a stale branch from develop as long as there are no conflicts (see the sketch after this list).
- No one should QA their own pull requests
- Testing will be done in the "Dataverse internal" environment.
- The environment will be shared across the dev team.
- Start with each person on the team rotating through QA duty.
- To start, each person will take just a single issue.
- Leonid will go first; then we will rotate around.
- First roster:
- Leonid
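As a reference for the branch-update step above, here is a minimal sketch of the usual git flow, wrapped in Python since the team already maintains Python scripts. The branch name is a hypothetical placeholder, and a real run would happen in a local clone of the repository.

```python
# Sketch of the "update a stale branch from develop" step.
# Assumes git is on PATH and the current directory is a local clone;
# the branch name below is a hypothetical placeholder.
import subprocess

def update_branch(branch: str = "1234-some-feature") -> None:
    def git(*args: str) -> None:
        subprocess.run(["git", *args], check=True)

    git("fetch", "origin")
    git("checkout", branch)
    # Merge rather than rebase so the PR history stays intact.
    # check=True stops the script if the merge hits a conflict,
    # in which case the branch goes back to the developer.
    git("merge", "origin/develop")
    git("push", "origin", branch)

if __name__ == "__main__":
    update_branch()
```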
Environment:
- Assume that we are not changing the QA environment.
- Assume devs are familiar with the QA environment.
Leonid
- Volunteers to keep the internal environment operating as needed
- Volunteers to keep the perf cluster operating as needed.
Follow-on discussions
- Automated tests
- Get a start prior to Kevin getting back (a minimal smoke-test sketch follows below).
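As a possible starting point, a smoke test against the internal environment could be as small as the sketch below. It assumes Python 3 with the `requests` library; the host name is a hypothetical placeholder, while `/api/info/version` and `/api/search` are endpoints in Dataverse's native API.

```python
# Minimal smoke test against a Dataverse instance.
# The BASE_URL below is a hypothetical placeholder.
import requests

BASE_URL = "https://dataverse-internal.example.edu"

def test_version() -> None:
    # /api/info/version is part of Dataverse's native API.
    r = requests.get(f"{BASE_URL}/api/info/version", timeout=30)
    r.raise_for_status()
    body = r.json()
    assert body["status"] == "OK"
    print("Dataverse version:", body["data"]["version"])

def test_search() -> None:
    # The Search API requires a `q` parameter; `*` matches everything.
    r = requests.get(f"{BASE_URL}/api/search", params={"q": "*"}, timeout=30)
    r.raise_for_status()
    assert r.json()["status"] == "OK"

if __name__ == "__main__":
    test_version()
    test_search()
    print("Smoke tests passed.")
```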
What does Kevin do that does not happen during review?
- The types of things that won't show up in the definition of test for a particular feature.
- Can we get a checklist for people who are standing in?
What about having the community help with the regression testing?
- Reach out to the community and ask for help.
Jim Meyers raised:
- Globus and 3B PRs: with Kevin out, how do we get someone else the credentials to test with the other team? Integration testing requires extra VPN access. Tom is the right contact for this.
For releases
- The process is documented.
- There are some specifics around Python scripts.
- There is performance testing that needs to be done.
The catches will be the smaller details that are easy for the team to miss. These details are covered in the document.
R. has a GitHub project with the performance testing in it.
- Kevin will send/document the performance script run.
- Kevin mentioned that R.'s test has additional functionality built in that is not currently run (extra credit).
Phil mentioned that this performance testing could become part of an automated test; a rough sketch follows below.
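If it does, the harness would not need to start out large; the sketch below times repeated requests and fails when the 95th-percentile latency regresses. It is an illustration only, not R.'s GitHub project; the host and latency budget are placeholder assumptions.

```python
# Rough latency probe for a Dataverse endpoint; a sketch only.
# Host, run count, and threshold below are placeholder assumptions.
import statistics
import time

import requests

BASE_URL = "https://dataverse-internal.example.edu"  # hypothetical host
ENDPOINT = "/api/info/version"  # cheap, real Dataverse endpoint
RUNS = 20
P95_BUDGET_SECONDS = 2.0  # placeholder latency budget

def measure() -> list[float]:
    timings = []
    for _ in range(RUNS):
        start = time.perf_counter()
        requests.get(BASE_URL + ENDPOINT, timeout=30).raise_for_status()
        timings.append(time.perf_counter() - start)
    return timings

if __name__ == "__main__":
    t = sorted(measure())
    p95 = t[int(0.95 * (len(t) - 1))]
    print(f"median={statistics.median(t):.3f}s p95={p95:.3f}s")
    # Fail loudly if latency regresses past the budget, so this
    # could run in CI as Phil suggested.
    assert p95 <= P95_BUDGET_SECONDS, f"p95 {p95:.3f}s over budget"
```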
From Gustavo
- There will be issues that are ready for QA.
- Devs will potentially pick up something from QA before looking for new work in the sprint.
- I.e., there is no fixed rotation of who works on QA issues.