Piff
Add a normalization option
Right now our PSF models are all normalized to have unit integral.
@erykoff has explained that this isn't the right thing to do if you expect to get accurate photometry at the end. Here is the procedure he laid out for doing the aperture correction:
1. Take a model.
2. Apply it to all the stars on a CCD.
3. Measure the PSF flux with the "unnormalized" PSF.
4. Measure the aperture flux in the reference aperture.
5. The normalization is the median/mean/clipped mean/favorite statistic of the ratio between the unnormalized PSF flux and the reference aperture flux.
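The last step might look something like the sketch below. This is not Piff API, just an illustration of the statistic: `aperture_correction` is a hypothetical helper, and the per-star flux arrays are assumed inputs from whatever measurement code is used.

```python
import numpy as np

def aperture_correction(psf_flux, ap_flux, nsigma=3.0, max_iter=5):
    """Estimate the normalization as a sigma-clipped mean of the ratio
    of unnormalized PSF flux to reference-aperture flux.

    psf_flux, ap_flux: per-star flux measurements (same stars, same order).
    """
    ratio = np.asarray(psf_flux, dtype=float) / np.asarray(ap_flux, dtype=float)
    mask = np.ones(ratio.shape, dtype=bool)
    # Simple iterative sigma clip around the mean; a median or other
    # robust statistic could be substituted here.
    for _ in range(max_iter):
        mean, std = ratio[mask].mean(), ratio[mask].std()
        new_mask = np.abs(ratio - mean) <= nsigma * std
        if np.array_equal(new_mask, mask):
            break
        mask = new_mask
    return ratio[mask].mean()
```

In practice one would also restrict the input stars (e.g. to the brightest fraction, as discussed below) before computing the statistic.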
My proposal is to do this at the very end of the fitting process of an exposure on all the stars used for fitting. This will set an overall normalization number for that exposure, which can be saved at the PSF level and applied when drawing the PSF using the final model.
This seems independent of any particular PSF type, so I think this can be a top-level field, which could look like this in the config:
normalization:
    type: Aperture
    diameter: 22.22  # radius also allowed
    units: pixels    # arcsec also allowed
@beckermr @esheldon @brianyanny
Further info from Eli on Slack:
As an update to above, step 5 will break if you have systematic background errors. Which we do. So you want to take the brightest x% when you do this, because those are less affected by backgrounds. Notably, the pure-psf-model version is immune to this and so might actually be better!
We could also consider a different normalization type that does the pure psf model version, which Eli thinks should work, but Lupton thinks won't. Then it could be relatively easy to compare and see which is better.
The other possible algorithm looks something like:
- Take a given Piff model for a specific location.
- Draw it at fine resolution (or maybe just don't care about the accuracy issues due to finite pixel scale.)
- Integrate the profile within the aperture. This gives a number modestly less than 1. Call this f_ap.
- Renormalize full profile by 1/f_ap.
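A minimal sketch of the f_ap computation, assuming the model is available as a callable surface-brightness profile that integrates to 1 over the full plane (the function name and signature are hypothetical, not Piff API):

```python
import numpy as np

def model_aperture_fraction(profile, ap_diameter, pixel_scale, oversample=8):
    """Integrate a PSF profile inside a circular aperture on a fine grid.

    profile: callable profile(x, y) -> surface brightness (unit integral).
    ap_diameter: aperture diameter, same units as the coordinates.
    pixel_scale: native pixel scale; the grid step is pixel_scale / oversample.
    """
    step = pixel_scale / oversample
    half = ap_diameter / 2.0
    # Grid of sub-pixel centers covering the aperture bounding box.
    coords = np.arange(-half + step / 2, half, step)
    x, y = np.meshgrid(coords, coords)
    in_ap = x**2 + y**2 <= half**2
    # Riemann sum of the profile over the aperture.
    f_ap = np.sum(profile(x, y)[in_ap]) * step**2
    return f_ap  # renormalize the full profile by 1 / f_ap
```

The `oversample` parameter addresses the finite-pixel-scale concern above; setting it to 1 corresponds to "just don't care about the accuracy issues".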
More useful information about this here: https://project.lsst.org/meetings/law/sites/lsst.org.meetings.law/files/Project%20Pland%20for%20Photometric%20Calibration%20-%20Eli%20Rykoff.pdf#page=37
> We could also consider a different normalization type that does the pure psf model version, which Eli thinks should work, but Lupton thinks won't.
LOL
> normalization is the median/mean/clipped mean/favorite statistic of the ratio between the unnormalized psf flux and reference aperture flux
Is the norm supposed to be constant per exposure, or should it be allowed to vary in some way?
It should at least be per-CCD, not per-exposure. In the Rubin stack it can vary at sub-CCD scales; in SExtractor/PSFEx it can't.
Sorry, yes, I meant CCD.
I am not convinced that allowing it to vary at sub-CCD scales is particularly useful overall, since it creates more degrees of freedom for things to go wrong. It's also something that might be more important for HSC than for DECam.