jwst
fix ModelContainer.get_crds_parameters
`ModelContainer.get_crds_parameters` fails if called on a `ModelContainer` not read from an association.
Examples of failures can be found in the errors in this test run: https://github.com/spacetelescope/stpipe/actions/runs/8008830118/job/21876144742 where an attempt was made to update stpipe to use `ModelContainer.get_crds_parameters`.
This PR updates `ModelContainer.get_crds_parameters` to not assume that an association was used to construct the `ModelContainer`.
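The shape of the fix can be sketched as follows. This is an illustrative sketch only, not the actual jwst implementation: the names `container_crds_parameters`, `asn_table`, and `FakeModel` are assumptions made for the example, and the association branch is elided.

```python
# Illustrative sketch only: the names used here (container_crds_parameters,
# asn_table, FakeModel) are assumptions, not the exact jwst/stpipe API.

class FakeModel:
    """Stand-in for a datamodel exposing CRDS selector metadata."""
    def __init__(self, params):
        self._params = params

    def get_crds_parameters(self):
        return dict(self._params)

def container_crds_parameters(models, asn_table=None):
    """Return CRDS selector metadata for a container of models.

    The bug: code of this shape assumed asn_table was always present and
    crashed when the container was built from a plain list of models.
    The fix is to handle the asn_table-is-None case explicitly.
    """
    if asn_table is None:
        # Container built from a model list: use the first model's metadata.
        return models[0].get_crds_parameters()
    # Container built from an association: unchanged association path
    # (elided in this sketch).
    raise NotImplementedError("association path not sketched here")

models = [FakeModel({"meta.instrument.name": "NIRCAM"})]
print(container_crds_parameters(models))  # -> {'meta.instrument.name': 'NIRCAM'}
```

The key point is only the `asn_table is None` branch; the association path is untouched.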
To demonstrate the change this will allow, a draft PR was opened in stpipe that switches it to use `ModelContainer.get_crds_parameters`: https://github.com/spacetelescope/stpipe/pull/144
Regtests show 1 error: https://plwishmaster.stsci.edu:8081/job/RT/job/JWST-Developers-Pull-Requests/1230/
`jwst.associations.tests.test_level3_duplicate.test_duplicate_names` [stable-deps] failed due to a missing log message. This test commonly fails during regression runs, likely due to stpipe's handling of loggers. Note that this is not a regtest and it passed in the CI runs.
Checklist for maintainers
- [ ] added entry in `CHANGES.rst` within the relevant release section
- [ ] updated or added relevant tests
- [ ] updated relevant documentation
- [ ] added relevant milestone
- [ ] added relevant label(s)
- [ ] ran regression tests, post a link to the Jenkins job below. How to run regression tests on a PR
- [ ] Make sure the JIRA ticket is resolved properly
Codecov Report
Attention: Patch coverage is 0% with 1 line in your changes missing coverage. Please review.
Project coverage is 75.22%. Comparing base (4cc0ac1) to head (40a279c). Report is 25 commits behind head on master.
:exclamation: Current head 40a279c differs from the pull request's most recent head 12d058e. Consider uploading reports for the commit 12d058e to get more accurate results.
Files | Patch % | Lines |
---|---|---|
jwst/datamodels/container.py | 0.00% | 1 Missing :warning: |
Additional details and impacted files
```
@@            Coverage Diff             @@
##           master    #8310      +/-   ##
==========================================
+ Coverage   75.15%   75.22%   +0.07%
==========================================
  Files         470      474       +4
  Lines       38604    38803     +199
==========================================
+ Hits        29014    29191     +177
- Misses       9590     9612      +22
```
Flag | Coverage Δ | *Carryforward flag |
---|---|---|
nightly | 77.38% <ø> (-0.01%) :arrow_down: | Carriedforward from 4cc0ac1 |
*This pull request uses carryforward flags.
@nden @jdavies-st Any final comments or suggestions before we merge this?
I still think this is probably a bug in the current pipeline, but we don't see it because the Level 3 pipeline opens an association by passing `asn_exptypes=['science']`. It's also probably true that the "interesting" metadata that `get_crds_parameters` provides is in most cases identical between different association exptypes.
@hbushouse Is there a use case where a non-science exposure in an association file will have different metadata from the science exposure? And by "metadata" the only interest is in those keywords used as selectors in CRDS.
If the input to ModelContainer is not an association, opening the first model should be fine.
I would imagine a level 2 association with target acquisitions would qualify - perhaps also coronagraphic associations? Not sure if the background, science and PSF exposures have differing exp_types, but they certainly might.
After some discussion with @nden I'd like to delay this change until after the release (and will mark this PR as a draft). The main motivation is to address the above comments and change `get_crds_parameters` to use the first science exposure type. Some updates that may be added are:
- modifying `get_crds_parameters` to use exposure type when loaded from an association, and hopefully also from a model list
- modifying steps that explicitly use the first-index model to instead select by exposure type
- tests for the above behavior, including associations where the first listed model is not science and has different metadata
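The "select by exposure type instead of the first index" idea from the list above can be sketched like this. It is a hypothetical helper, not jwst code: the members here are plain dicts standing in for association members, and the `exptype` key and `first_science` name are assumptions for illustration.

```python
# Hypothetical helper (not the jwst API): pick the first science member
# instead of blindly taking index 0. Members are plain dicts standing in
# for association members, with an 'exptype' key.

def first_science(members):
    """Return the first member marked as science, falling back to the
    first member overall (i.e. the old index-0 behavior)."""
    for member in members:
        if member.get("exptype", "").lower() == "science":
            return member
    return members[0]

members = [
    {"exptype": "target_acquisition", "filter": "F212N"},
    {"exptype": "science", "filter": "F444W"},
]
print(first_science(members)["filter"])  # -> F444W
```

With this shape of selection, an association whose first listed member is a target acquisition (or a PSF/background exposure with different selector metadata) would still yield science-exposure metadata for CRDS.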
The above will expand the scope of this PR and I'll bring it back out of draft and re-request reviews when it's ready.
Thanks all for the discussion and comments.
@nden and @tapastro having example associations with models with varying metadata would be excellent! Please feel free to send me links/files.