Maintenance: Add caching for build
Summary of the new feature / enhancement
Right now, builds take 15-20 minutes on average, which seems primarily due to compiling and building the various projects. If some of those builds and downloads could be cached, or parallelized, we could probably get the build down to 5-10 minutes.
Proposed technical implementation details (optional)
I think we have the following opportunities for improving build times:
- [ ] Cache the downloads for external crates
- [ ] Cache the builds for internal crates and only rebuild if their code changed
- [ ] Cache the downloads for PowerShell modules
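As a sketch of what the caching items could look like with `actions/cache`, keyed so the caches invalidate when the lockfile (or sources) change. The paths and key patterns below are illustrative assumptions, not our actual workflow:

```yaml
# Hypothetical steps for the build job (assumes Linux runners and
# cargo's default directory layout).
steps:
  - uses: actions/checkout@v3
  # 1. Cache downloaded external crates (registry + git sources).
  - uses: actions/cache@v3
    with:
      path: |
        ~/.cargo/registry/index
        ~/.cargo/registry/cache
        ~/.cargo/git/db
      key: cargo-deps-${{ runner.os }}-${{ hashFiles('**/Cargo.lock') }}
  # 2. Cache compiled internal crates; rebuilt only when sources change.
  - uses: actions/cache@v3
    with:
      path: target
      key: cargo-target-${{ runner.os }}-${{ hashFiles('**/Cargo.lock', '**/*.rs') }}
  # 3. Cache installed PowerShell modules (path is a Linux-runner assumption).
  - uses: actions/cache@v3
    with:
      path: ~/.local/share/powershell/Modules
      key: psmodules-${{ runner.os }}-${{ hashFiles('**/*.psd1') }}
  - run: cargo build
```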
Possibly, we could:

- [ ] Parallelize the builds for internal crates into groups based on their dependencies:
```mermaid
---
title: Dependency Map
---
flowchart LR
    dsc_lib --> dsc
    dsc_lib --> test_group_resource
    ntstatuserror --> ntreg
    ntstatuserror --> ntuserinfo
    ntstatuserror --> registry
    ntreg --> registry
    ntuserinfo --> ntreg
    osinfo
    pal --> registry
    process
    y2j
```

```mermaid
---
title: Build Jobs
---
flowchart LR
    osinfo
    process
    y2j
    dsc_lib ---> dsc
    dsc_lib ---> test_group_resource
    ntstatuserror --> ntreg+ntuserinfo
    ntreg+ntuserinfo --> registry
    pal ---> registry
```

- Build job for `dsc_lib` is a dependency for `dsc` and `test_group_resource`
- Build job for `ntstatuserror` is a dependency for a job that builds both `ntuserinfo` and `ntreg`
- Build jobs for `pal` and `ntreg+ntuserinfo` are dependencies for `registry`
- `osinfo`, `process`, and `y2j` are independent builds
- All build jobs are a dependency for the acceptance tests
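The grouping above could translate into a workflow whose `needs:` edges mirror the build-jobs diagram. This is a job-graph sketch only; `runs-on`/`steps` are omitted from every job for brevity (so it is not runnable as-is), and the job names are made up:

```yaml
jobs:
  osinfo:                      # independent
  process:                     # independent
  y2j:                         # independent
  dsc_lib:
  dsc:
    needs: dsc_lib
  test_group_resource:
    needs: dsc_lib
  ntstatuserror:
  ntreg_ntuserinfo:            # one job building both ntreg and ntuserinfo
    needs: ntstatuserror
  pal:
  registry:
    needs: [pal, ntreg_ntuserinfo]
  acceptance_tests:            # fan-in: runs only after every build job
    needs: [osinfo, process, y2j, dsc, test_group_resource, registry]
```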
While this sounds very complicated, in practice we can use GitHub artifacts to pass builds between jobs and reuse prior builds if the code didn't change. There's some prior art we can use for this. I'd be willing to prototype the workflow.
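For passing builds between jobs, the upload/download artifact actions could move the compiled output from a producer job to its dependents; the artifact name and paths here are hypothetical:

```yaml
# In the dsc_lib job, after `cargo build`: publish the build output.
- uses: actions/upload-artifact@v3
  with:
    name: dsc_lib-build          # hypothetical artifact name
    path: target/release/

# In a downstream job (e.g. dsc): restore it before building.
- uses: actions/download-artifact@v3
  with:
    name: dsc_lib-build
    path: target/release/
```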
That said, I think we should definitely start with the initial caching and move on to parallelization from there if build times are still high.