jwst
outlier_detection blotting needs to properly scale based on input and output pixel scales
Issue JP-794 was created by James Davies:
With https://github.com/spacetelescope/jwst/pull/3651, the scaling between the median image and the blotted image is set to 1, which is the default. Ideally it should instead scale based on the pixel scale (CDELTn) of the input and output images, in case a user specifies a smaller pixel scale for the median.
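As a rough sketch of what that could look like (the helper name blot_scale_factor and the area-ratio convention are assumptions here, not jwst's actual API), the factor would come from the CDELTn pixel scales rather than the hard-coded default of 1:

import numpy as np

def blot_scale_factor(median_cdelt, input_cdelt):
    """Pixel-area ratio between the resampled median image and an input image.

    median_cdelt, input_cdelt : (cdelt1, cdelt2) in degrees per pixel.
    If the median is resampled onto smaller pixels, each blotted input
    pixel covers several median pixels, so the blot flux must be rescaled
    by the area ratio instead of the default factor of 1.
    """
    median_area = abs(np.prod(median_cdelt))
    input_area = abs(np.prod(input_cdelt))
    return input_area / median_area

For example, a median built with half the input pixel scale gives a factor of 4; the exact convention (area vs. linear ratio, and which image goes in the numerator) would need to match what blot actually expects.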
Spectral data no longer have CDELTn keywords, so a general function for computing the pixel scale of both imaging and spectral data needs to be written and used here.
There are a couple of other places in the pipeline where a pixel scale is computed, so perhaps some of that code can be generalized and reused here. Ideally, we should have one function that computes this in a consistent way for all our data.
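A minimal sketch of such a shared helper, assuming the WCS is a callable pixel-to-world transform whose first two outputs are RA and Dec in degrees (the function name local_pixel_scale is made up): measure the plate scale numerically so it works even when no CDELTn keywords exist.

import numpy as np

def local_pixel_scale(wcs, x, y):
    """Approximate local pixel scale (degrees per pixel) at pixel (x, y).

    Works whenever the first two world outputs are RA and Dec in degrees;
    extra outputs such as wavelength are ignored.
    """
    ra0, dec0 = wcs(x, y)[:2]
    ra1, dec1 = wcs(x + 1, y)[:2]
    ra2, dec2 = wcs(x, y + 1)[:2]
    cosd = np.cos(np.deg2rad(dec0))
    # Angular step along each pixel axis; RA differences are compressed
    # by cos(Dec) on the sky.
    sx = np.hypot((ra1 - ra0) * cosd, dec1 - dec0)
    sy = np.hypot((ra2 - ra0) * cosd, dec2 - dec0)
    # Geometric mean of the two axis scales.
    return np.sqrt(sx * sy)

For spectral data this would need more care (e.g., the celestial coordinates may only vary along the slit, not the dispersion axis), which is exactly the part that still needs to be designed.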
Possibly it could be a method on the GWCS object too?
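If it did live on the WCS object, a purely speculative version (reusing the local_pixel_scale sketch above; pixel_scale is not part of the real GWCS API) could be a thin subclass:

from gwcs import WCS

class WCSWithScale(WCS):
    # Hypothetical convenience method, not real GWCS API.
    def pixel_scale(self, x, y):
        # Delegate to the shared helper so every instrument mode
        # computes its pixel scale the same way.
        return local_pixel_scale(self, x, y)

so callers could write wcs.pixel_scale(x, y) without importing a pipeline-internal helper.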
Comment by Karl Gordon: Is this something that is straightforward to implement/test? If so, then I can look at doing this in the PR I'm working on for this step.
Original JP ticket has been withdrawn.