
Superbias step does more than superbias subtraction

Open kammerje opened this issue 2 years ago • 8 comments

Data before and after the superbias subtraction step of the stage 1 pipeline was compared. The difference between the two should simply be the superbias itself. According to the JWST pipeline manual, the superbias is subtracted from every group and every integration, so the subtracted frame should be exactly the same for every group and every integration: subtracting, e.g., the superbias inferred from group 0 from the one inferred from group 1 should give zero everywhere. Instead, residuals of the PSF can be seen in that difference, so the superbias step is doing more than just the superbias subtraction. The screenshot below illustrates the issue.

[Screenshot: Screen Shot 2021-08-27 at 4 20 40 AM]
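
For concreteness, here is a minimal sketch of the check described above; the filenames are hypothetical placeholders for the same exposure's products immediately before and after the superbias step:

```python
from astropy.io import fits
import numpy as np

# Hypothetical filenames for the same exposure before and after the
# superbias step; any pair of stage 1 intermediate products works.
before = fits.getdata("jw_before_superbias.fits", ext=("SCI", 1)).astype(float)
after = fits.getdata("jw_after_superbias.fits", ext=("SCI", 1)).astype(float)

# The frame removed from each group of integration 0 is the inferred superbias.
removed = before[0] - after[0]

# If the step subtracts a single 2-D superbias from every group, the
# inferred frames are identical and this difference is zero everywhere.
print(np.abs(removed[1] - removed[0]).max())
```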

kammerje avatar Aug 27 '21 08:08 kammerje

@stscijgbot-jp

hbushouse avatar Aug 27 '21 11:08 hbushouse

This issue is tracked on JIRA as JP-2265.

stscijgbot-jp avatar Aug 27 '21 11:08 stscijgbot-jp

Comment by Howard Bushouse on JIRA:

The superbias subtraction step in the pipeline is a trivial operation. 95% of the work is simply loading the science exposure and the superbias reference file from CRDS and doing some error checking on each. Once that's done, the entire algorithm comes down to one line of code: https://github.com/spacetelescope/jwst/blob/master/jwst/superbias/bias_sub.py#L80, where output.data is the 4-D SCI array of the science exposure to be corrected (a copy of the input science exposure) and bias.data is the 2-D SCI array of the superbias reference file. The operation uses NumPy array broadcasting to subtract the 2-D superbias image from every group and integration of the 4-D science array.

So I don't see how the algorithm itself - being so simple - could be producing unexpected results, at least in terms of over- or under-subtracting source signal. I would think this might be something related to the data contained in the superbias ref file itself and the way that it was processed to create the reference file.
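
A minimal sketch of that one-line operation, using synthetic stand-ins for the pipeline's data models and the 4-D/2-D shapes described above:

```python
import numpy as np

# Synthetic stand-ins: a 4-D science ramp (integrations, groups, rows, cols)
# and a 2-D superbias reference image.
rng = np.random.default_rng(0)
sci = rng.normal(10000.0, 50.0, size=(2, 5, 32, 32))
bias = rng.normal(9000.0, 20.0, size=(32, 32))

# The step's core operation: NumPy broadcasting subtracts the same
# 2-D bias frame from every group of every integration.
output = sci - bias
assert output.shape == sci.shape
```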

stscijgbot-jp avatar Aug 30 '21 13:08 stscijgbot-jp

Comment by Howard Bushouse on JIRA:

Assigned to Alicia Canipe to discuss within the DMS WG and decide whether a change to the algorithm might be warranted.

stscijgbot-jp avatar Aug 30 '21 13:08 stscijgbot-jp

Comment by Alicia Canipe on JIRA:

Hmm... my first thought would be (assuming they're using simulations) that they're testing superbias using a different reference file than the one Mirage used to create the simulation. I think we've done pretty thorough checks of superbias, but I don't think I've checked recently because it's such a straightforward step (Bryan Hilbert, have you?). I'll dig into it a little more.

stscijgbot-jp avatar Sep 01 '21 13:09 stscijgbot-jp

Comment by Alicia Canipe on JIRA:

I am not able to reproduce these results. I checked a Mirage image before and after superbias subtraction, and I also checked the reference file. Then I computed the difference array between the data before and after superbias and inspected the difference images (all attached). I don't see the same feature.

[Four attached images: before/after superbias frames, the superbias reference file, and the difference images]

stscijgbot-jp avatar Sep 01 '21 13:09 stscijgbot-jp

Comment by Bryan Hilbert on JIRA:

I remember checking superbias subtraction long ago and not finding any problems. A mismatch between the bias used by Mirage to create the data and the one in the superbias reference file is possible, since Mirage works from a single dark current exposure when building the simulated data, so the bias level in that dark exposure most likely differs from the superbias reference file. But even with that, (group1_after_superbias_sub - group1_before_superbias_sub) - (group0_after_superbias_sub - group0_before_superbias_sub) should be zero in every pixel: any difference in bias between the reference file and what Mirage uses subtracts out.
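
A quick numerical illustration of that cancellation, with a deliberately mismatched simulator bias (all values synthetic):

```python
import numpy as np

rng = np.random.default_rng(1)
ny = nx = 32
bias_mirage = rng.normal(9000.0, 20.0, size=(ny, nx))         # bias baked into the simulation
bias_ref = bias_mirage + rng.normal(0.0, 5.0, size=(ny, nx))  # mismatched reference-file superbias

# An accumulating 4-group ramp, plus the simulator's bias level.
signal = rng.normal(100.0, 10.0, size=(4, ny, nx)).cumsum(axis=0)
before = signal + bias_mirage
after = before - bias_ref  # what the superbias step produces

# The step removes the same 2-D frame from every group, so the
# difference-of-differences cancels despite the bias mismatch.
residual = (after[1] - before[1]) - (after[0] - before[0])
assert np.allclose(residual, 0.0)
```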

stscijgbot-jp avatar Sep 01 '21 14:09 stscijgbot-jp

Comment by Alicia Canipe on JIRA:

I also thought I wrote a unit test for superbias that would catch a discrepancy like this if it popped up, but it turns out I didn't. I'll add a label to this ticket to remind myself to add one.
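
A pytest-style sketch of such a test, using a plain-NumPy stand-in for the step's core operation (this is not the step's actual test suite):

```python
import numpy as np

def subtract_superbias(sci, bias):
    # Stand-in for the step's core operation: broadcast subtraction.
    return sci - bias

def test_superbias_preserves_group_differences():
    # The invariant from this thread: the step must not change
    # group-to-group differences within an integration.
    rng = np.random.default_rng(2)
    sci = rng.normal(10000.0, 50.0, size=(1, 3, 16, 16))
    bias = rng.normal(9000.0, 20.0, size=(16, 16))
    out = subtract_superbias(sci, bias)
    np.testing.assert_allclose(out[0, 1] - out[0, 0], sci[0, 1] - sci[0, 0])
```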

stscijgbot-jp avatar Sep 01 '21 14:09 stscijgbot-jp

Closing, as this was seen only with simulations and could not be reproduced.

nden avatar Apr 01 '23 20:04 nden