feat: HiFa v1.1.0 schema development
Pull Request Description
The following is happening:
- resolves #1980
- start a HiFa JSON v1.1.0 schema, but do not release it
- allow for defining a default schema version for the different schemas that `pyhf` supports in the current implementation
- provide a utility to migrate (upgrade/downgrade) provided specifications (workspace, patchset) to a different version (see the sketch after this list)
- provide a command-line interface for upgrading/downgrading
- flesh out the type hints for `pyhf.schema`
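As a rough, non-authoritative sketch of what the migration utility could look like for a workspace specification, assume a single entry point that copies the spec and bumps its declared schema version. The function name `upgrade_workspace` and the transformation details below are illustrative only and are not the final API of this PR:

```python
import copy


def upgrade_workspace(spec: dict, to_version: str = "1.1.0") -> dict:
    """Hypothetical helper: return a copy of ``spec`` migrated to ``to_version``.

    Only the declared version is bumped here; a real migrator would also apply
    any structural transformations that differ between schema versions.
    """
    new_spec = copy.deepcopy(spec)
    if new_spec.get("version") != to_version:
        new_spec["version"] = to_version
    return new_spec


# Minimal (structurally incomplete) workspace spec, for illustration only.
workspace_spec = {
    "version": "1.0.0",
    "channels": [],
    "observations": [],
    "measurements": [],
}
print(upgrade_workspace(workspace_spec)["version"])  # -> 1.1.0
```

A command-line wrapper around the same entry point would then provide the upgrade/downgrade CLI mentioned above; the exact subcommand name and flags are still to be decided in this PR.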
To-Do:
- agree on how upgrade/downgrade works internally: separate classes, a single class as suggested by @matthewfeickert in the comments, or a more functional form similar to what `nbconvert` does (see my comment below: https://github.com/scikit-hep/pyhf/pull/1978#issuecomment-1481459551); a sketch of the alternatives follows this list
- determine whether we allow workspace/model and patchset to have different versions or require them to always be synced (this changes what we do for `pyhf.schema.versions` vs `pyhf.schema.version` depending...)
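To make the first To-Do item more concrete, here is a hedged sketch contrasting the two internal designs under discussion; every class, function, and registry name below is hypothetical and only meant to show the shapes, not to fix the API:

```python
# Option A: one migrator class per version pair, each encapsulating its own
# transformation logic (hypothetical names).
class WorkspaceUpgrade_1_0_to_1_1:
    def apply(self, spec: dict) -> dict:
        return {**spec, "version": "1.1.0"}


# Option B: a functional form, closer in spirit to how nbconvert chains small
# transformation steps, with a registry mapping version pairs to step lists.
MIGRATION_STEPS = {
    ("1.0.0", "1.1.0"): [lambda spec: {**spec, "version": "1.1.0"}],
}


def migrate(spec: dict, to_version: str) -> dict:
    for step in MIGRATION_STEPS[(spec["version"], to_version)]:
        spec = step(spec)
    return spec
```

Both shapes allow chaining migrations across multiple versions; the trade-off is mostly whether per-version state and validation live in a class or in standalone steps.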
Schema updates:
- [x] Update from JSON Schema Draft 6 to the 2020-12 draft instead (see the validator sketch after this list)
- [ ] Support serialization of custom function (another PR)
- [ ] Support serialization of multiple POIs (another PR)
- [ ] Support serialization of bin-wise fixed/constant parameters (another PR)
- [ ] Support HS3's HistFactory schema (another PR)
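On the first (completed) item above: moving from Draft 6 to the 2020-12 draft essentially means the schemas declare the 2020-12 meta-schema and are validated with a matching validator. With the `jsonschema` package (version 4 or newer) that is `Draft202012Validator`; the toy schema below is illustrative and is not the actual HiFa schema:

```python
import jsonschema

# Toy schema declaring the 2020-12 meta-schema (not the real HiFa schema).
toy_schema = {
    "$schema": "https://json-schema.org/draft/2020-12/schema",
    "type": "object",
    "properties": {"version": {"type": "string"}},
    "required": ["version"],
}

validator = jsonschema.Draft202012Validator(toy_schema)
validator.validate({"version": "1.1.0"})  # raises ValidationError if invalid
```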
Checklist Before Requesting Reviewer
- [ ] Tests are passing
- [ ] "WIP" removed from the title of the pull request
- [ ] Selected an Assignee for the PR to be responsible for the log summary
Before Merging
For the PR Assignees:
- [ ] Summarize commit messages into a comprehensive review of the PR
@kratsg you have this slated for v0.7.1 at the moment. Does this mean that you feel we have a problem that needs to be patched? If not, we should probably place this in v0.8.0.
It's going into the next release though. I don't know if we want to start juggling development HiFa v1.0.0 and v1.1.0 at the same time? But this will need to go in, so that we can deal with multiple schema versions.
Okay, then I've switched it to v0.8.0. After the issues with getting v0.7.0 out while also accepting PRs, I think it makes sense (once I figure out how to do release branches) to reserve patch releases for when we actually have a bug we need to go back and patch, and to make other releases minor (until we hit v1.0.0).
Codecov Report
Base: 98.30% // Head: 98.30% // No change to project coverage :thumbsup:
Coverage data is based on head (2b168b7) compared to base (2b168b7). Patch has no changes to coverable lines.
:exclamation: Current head 2b168b7 differs from pull request most recent head 178b3db. Consider uploading reports for the commit 178b3db to get more accurate results
Additional details and impacted files
@@ Coverage Diff @@
## main #1978 +/- ##
=======================================
Coverage 98.30% 98.30%
=======================================
Files 69 69
Lines 4531 4531
Branches 645 645
=======================================
Hits 4454 4454
Misses 45 45
Partials 32 32
| Flag | Coverage Δ | |
|---|---|---|
| contrib | 97.88% <0.00%> (ø) | |
| doctest | 61.15% <0.00%> (ø) | |
| unittests-3.10 | 96.31% <0.00%> (ø) | |
| unittests-3.8 | 96.33% <0.00%> (ø) | |
| unittests-3.9 | 96.35% <0.00%> (ø) | |
Flags with carried forward coverage won't be shown.
:umbrella: View full report at Codecov.
(Ignore the docs failing as this is Springer's fault: https://twitter.com/HEPfeickert/status/1600915112403304449?s=20&t=kjTW5TNbU_Jukxb0ALd0uQ. We can either fix this up locally in another PR or just wait for them to fix it.)