Bot updates for maintenance tasks
Workflow to run maintenance tasks and bot updates, e.g.,

```r
# WQPWQXRefTables.R
TADA_UpdateWQXCharValRef()
TADA_UpdateMeasureUnitRef()
TADA_UpdateDetCondRef()
TADA_UpdateDetLimitRef()
TADA_UpdateActivityTypeRef()
TADA_UpdateCharacteristicRef()
TADA_UpdateMeasureQualifierCodeRef()
TADA_UpdateMonLocTypeRef()
TADA_UpdateWQPOrgProviderRef()
```
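For reference, a minimal sketch of what the scheduled job could look like. The job name, action versions, and the `Rscript` entry point are assumptions, not the actual workflow file:

```yaml
# Hypothetical GitHub Actions job that runs the reference table updates.
jobs:
  update-ref-tables:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: r-lib/actions/setup-r@v2
      - name: Run reference table updates
        run: Rscript -e 'devtools::load_all(); TADA_UpdateWQXCharValRef()'
```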
Unknowns: Are the source materials for each of these updated:
- (A) on a set schedule (e.g., file updated nightly), allowing these to be a scheduled task, or
- (B) intermittently (e.g., as data are added to WQX), meaning these would need to be triggered by a failed check (i.e., pre-commit)
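If it turns out to be (B), a cheap change check could gate the heavier update: compare a checksum of the current source file against a cached one and only trigger the update functions when they differ. A self-contained sketch (file names and the "upstream adds a row" step are illustrative):

```shell
#!/bin/sh
# Sketch for option (B): detect that a source file changed by comparing
# its checksum against a cached value, using throwaway files.
set -e
dir=$(mktemp -d)
printf 'CharacteristicName\nNitrate\n' > "$dir/source.csv"
# Cache the checksum as of the last successful update.
sha256sum "$dir/source.csv" | awk '{print $1}' > "$dir/cached.sha256"
# ...later, upstream adds a row...
printf 'Ammonia\n' >> "$dir/source.csv"
new=$(sha256sum "$dir/source.csv" | awk '{print $1}')
old=$(cat "$dir/cached.sha256")
if [ "$new" != "$old" ]; then
  echo "reference data changed; update needed"
fi
```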
Assuming (A): How much oversight do we want on the commits? E.g., should a maintainer review each update as a PR, or should the bot be allowed to PR straight to develop?
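If maintainer review is wanted, the workflow can open a PR instead of committing directly; a commonly used community action for this is `peter-evans/create-pull-request`. The branch name and title below are illustrative:

```yaml
      # Hypothetical step: open a PR for review instead of pushing to develop.
      - name: Open PR with updated reference tables
        uses: peter-evans/create-pull-request@v6
        with:
          branch: bot/ref-table-updates
          base: develop
          title: "Bot: update WQX reference tables"
```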
Should updating the example data also be included in the list of scheduled maintenance tasks?
It could be. I started with the reference files because those seemed to have fewer unknowns for me. Is the example data used in doc-string examples, vignettes, or elsewhere?
Yes, the example data should also be updated as a scheduled maintenance task, and it is used in the vignettes. I have also considered whether the example data and vignettes should live in a separate repository, or somehow be excluded from the package build eventually sent to CRAN; that would also reduce overall package size. We also have intermittent issues with vignettes that run functions pulling from the WQP or ATTAINS services, which can be finicky.
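If excluding the vignettes from the CRAN build is the route taken, the standard mechanism is `.Rbuildignore`, whose entries are Perl-style regexes matched against file paths. The specific paths below are assumptions about the layout:

```
# Hypothetical .Rbuildignore entries (Perl regexes matched against paths)
^vignettes$
^doc$
```

Note that CRAN normally expects vignettes to build with the package, so excluding them effectively means moving them out of the package proper (e.g., to a pkgdown site or a separate repo).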
I have a decent working example where the branch was behind develop (including files that needed updating). You can see run 1, where it committed updated files, and run 2, where there were no updates and therefore no commit.
If we add the example data, should that be part of the same commit or split into its own? We'll want to be sure that updated data gets tested in the vignettes (sometimes someone adds weird data and it breaks things) before the commit is allowed to happen (it won't happen if the workflow fails).
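One way to enforce that ordering inside the workflow is to run the checks first and only commit when they pass and something actually changed. A minimal self-contained sketch using a throwaway repo (file names, the check stand-in, and the commit message are made up):

```shell
#!/bin/sh
# Sketch: commit only when checks pass AND the file actually changed.
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q
git config user.email bot@example.invalid
git config user.name tada-bot
git commit -q --allow-empty -m "init"
echo "updated example data" > example.csv
true            # stand-in for "vignette build / R CMD check passed"
git add example.csv
if git diff --cached --quiet; then
  echo "no changes; skipping commit"
else
  git commit -q -m "Bot: update example data"
fi
```

Because `set -e` aborts the script on any failing step, a failed check would stop execution before the `git commit` is ever reached.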
Currently, once merged, it would run on manual dispatch, on a daily schedule at 2:01 AM, and on push/pull request to develop. I have low confidence it will be able to commit to develop without us explicitly granting it write permissions and possibly adding a bot exception to the branch merge settings.
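The triggers and permissions described could look roughly like this in the workflow file (cron and branch name taken from the description above; note GitHub Actions cron runs in UTC, and `permissions: contents: write` is the piece the bot needs in order to push):

```yaml
on:
  workflow_dispatch:
  schedule:
    - cron: "1 2 * * *"   # daily at 02:01 (UTC on GitHub-hosted runners)
  push:
    branches: [develop]
  pull_request:
    branches: [develop]

permissions:
  contents: write   # lets the workflow's GITHUB_TOKEN commit/push
```

Branch protection rules on develop are separate from this and would still need a bot exception if direct pushes are blocked.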
IMHO the example-data update should be a separate commit, and maybe have a separate trigger (e.g., run as part of building the docs/vignettes). The trigger (`on`) is defined at the workflow level, though, so a separate trigger would mean another workflow file to maintain.
The latest run updates the example data after the reference file updates and commits it separately from within the same workflow.
I suggest Data_R5_TADAPackageDemo() be tucked somewhere as an internal function like the others, so it can be maintained separately from the workflow. Any suggestions where?
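One conventional spot would be a file under R/ alongside the other update helpers, marked internal via roxygen2 so it ships with the package but is not exported. The file name below is an assumption, and the body is elided:

```r
# R/utils_example_data.R (hypothetical location)

#' Rebuild the R5 demo dataset used by the maintenance workflow
#'
#' @keywords internal
#' @noRd
Data_R5_TADAPackageDemo <- function() {
  # ...existing body moved here from the workflow script...
}
```

The workflow would then call it the same way as the TADA_Update*Ref() functions, keeping the workflow file itself free of data-generation logic.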