ScubaGear
Mapping product names
🗣 Description
Added product names to the table with tenant licensing information. Closes #993
💭 Motivation and context
This helps users understand which licenses they have.
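For context, a minimal sketch of the kind of SKU-to-product-name lookup this adds (illustrative only; the actual implementation, CSV column names, and file path in this PR may differ):

```powershell
# Illustrative sketch only -- not the code in this PR. Assumes a vendored CSV
# derived from Microsoft's "Product names and service plan identifiers for
# licensing" reference, with GUID and Product_Display_Name columns.
$ProductNames = Import-Csv -Path '.\ScubaGear\ProductNames.csv'

# Build a SKU GUID -> friendly name lookup table.
$NameLookup = @{}
foreach ($Row in $ProductNames) {
    $NameLookup[$Row.GUID] = $Row.Product_Display_Name
}

# Translate a tenant's subscribed SKU IDs (hypothetical GUID shown) into readable names.
@('c7df2760-2c81-4ef7-b578-5b5392b571df') | ForEach-Object { $NameLookup[$_] }
```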
🧪 Testing
Tested with different tenants.
✅ Pre-approval checklist
- [x] This PR has an informative and human-readable title.
- [x] PR targets the correct parent branch (e.g., main or release-name) for merge.
- [x] Changes are limited to a single goal - eschew scope creep!
- [x] Changes are sized such that they do not touch an excessive number of files.
- [x] All future TODOs are captured in issues, which are referenced in code comments.
- [x] These code changes follow the ScubaGear content style guide.
- [x] Related issues these changes resolve are linked preferably via closing keywords.
- [x] All relevant type-of-change labels added.
- [x] All relevant project fields are set.
- [x] All relevant repo and/or project documentation updated to reflect these changes.
- [x] Unit tests added/updated to cover PowerShell and Rego changes.
- [x] Functional tests added/updated to cover PowerShell and Rego changes.
- [x] All relevant functional tests passed.
- [x] All automated checks (e.g., linting, static analysis, unit/smoke tests) passed.
✅ Pre-merge checklist
- [x] PR passed smoke test check.
- [ ] Feature branch has been rebased against changes from parent branch, as needed. Use the Rebase branch button below or use this reference to rebase from the command line.
- [ ] Resolved all merge conflicts on branch.
- [ ] Notified merge coordinator that PR is ready for merge via comment mention.
✅ Post-merge checklist
- [ ] Feature branch deleted after merge to clean up repository.
- [ ] Verified that all checks pass on parent branch (e.g., main or release-name) after merge.
I am unsure where to put the product names CSV file; currently it is in the ScubaGear folder. If it needs to be moved to a different location, please reply to this comment.
I am guessing that Microsoft updates the licensing plan data regularly. What is the plan to keep this information up to date in ScubaGear? Right now the new code is referencing a static CSV file.
Given that a direct copy is used, we can likely generalize David Bui's GitHub action to check for updates to this file as well. Probably just compare hashes for changes and update if so, for example (see the sketch below)?
Suggest we file an issue for that after this is committed.
Separately, please ping when the test cases have been updated and I will re-review for approval.
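For what it's worth, a rough sketch of the hash-comparison step such an action could run (the URL and paths are placeholders, not the actual workflow or Microsoft download location):

```powershell
# Rough sketch of a hash-comparison check an automation script could run.
# $SourceUri and $LocalCsv are placeholders for illustration only.
$SourceUri = 'https://example.com/product-names-and-service-plans.csv'
$LocalCsv  = '.\ScubaGear\ProductNames.csv'
$TempCsv   = Join-Path ([System.IO.Path]::GetTempPath()) 'ProductNames.csv'

# Download the latest upstream copy to a temporary location.
Invoke-WebRequest -Uri $SourceUri -OutFile $TempCsv

# Compare hashes; only overwrite (and open a PR) when the upstream file changed.
$OldHash = (Get-FileHash -Path $LocalCsv -Algorithm SHA256).Hash
$NewHash = (Get-FileHash -Path $TempCsv  -Algorithm SHA256).Hash
if ($OldHash -ne $NewHash) {
    Copy-Item -Path $TempCsv -Destination $LocalCsv -Force
    Write-Output 'Product names CSV changed; commit and open a PR.'
}
```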
GitHub automation is a good idea. My other thought was to add a new cmdlet to the tool that would reach out, grab the new version of the file directly, and incorporate it into ScubaGear. That way, users with the necessary connectivity could update the file outside the release cycle when needed. So I'd say: how about both as TODOs, captured beyond this issue?
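A hypothetical sketch of what such a cmdlet could look like (the name, default URL, and destination path are assumptions for illustration, not part of this PR):

```powershell
# Hypothetical cmdlet sketch: refresh the vendored product names CSV on demand.
function Update-ScubaGearProductNames {
    [CmdletBinding()]
    param(
        # Placeholder URI; the real source would be Microsoft's published licensing CSV.
        [string]$SourceUri = 'https://example.com/product-names-and-service-plans.csv',
        # Assumed location of the vendored copy inside the installed module.
        [string]$Destination = (Join-Path $PSScriptRoot 'ProductNames.csv')
    )
    # Requires outbound connectivity; lets users refresh the file between releases.
    Invoke-WebRequest -Uri $SourceUri -OutFile $Destination
    Write-Verbose "Refreshed product names CSV at $Destination"
}
```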
Two comments/questions about this:
- Currently the CSV file is not a direct copy of the file from Microsoft. The original was about 1 MB, so to reduce the size I removed the last few columns that we don't use. If size is not a concern, we can use a direct copy instead.
- If we use GitHub automation, couldn't we set it up to run nightly, similar to the Nightly Testing workflow? Then we would always have the newest version of the file and it wouldn't be tied to the release cycle.
Ok, makes sense. I think it's fine to drop columns we don't need so long as we do it in a standardized way (e.g., Import-Csv | Select-Object | Export-Csv or the like in the automation script, as sketched below) and not a manual process.
Yes, if we do automation we can run it on a cron schedule and just update main periodically. That's how I would think to do it anyway.
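To make that concrete, a minimal sketch of a standardized trimming step (the column names are examples from Microsoft's licensing CSV and should be adjusted to whatever ScubaGear actually consumes):

```powershell
# Example trimming step for the automation script -- column names and paths
# are assumptions, not confirmed against the ScubaGear report code.
$DownloadedCsv = Join-Path ([System.IO.Path]::GetTempPath()) 'ProductNames.csv'

Import-Csv -Path $DownloadedCsv |
    Select-Object -Property Product_Display_Name, String_Id, GUID -Unique |
    Export-Csv -Path '.\ScubaGear\ProductNames.csv' -NoTypeInformation
```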
@nanda-katikaneni Ready to merge!
