eleanor
PSF rewrite: Zernike model issues, optimization framework, background
PSF models are not yet robust enough to merge into master, but there are some things worth looking at/discussing and a couple of other important updates:
- PSF optimization has been fully switched to scipy+pytorch, meaning it uses automatic differentiation like the old tf1 interface did. I think it could still be made more efficient - tf1's static computation graph approach seems to have been completely dropped from tensorflow and all similar packages, but there's likely still a way to provide that - but it's comparable to the version on master now, more able to find the true optimum than the "tftoscipy" branch approach, and compatible with Python 3.8. The bounds on optimization parameters got dropped in this process, but that doesn't seem to affect the goodness of fit, and if need be we can use `torch.clamp` (a sketch follows this list).
- Pointing and background information are now incorporated: instead of specifying the xc and yc of targets, it suffices to provide the parameter `bkg_mag_cutoff` to `psf_rewrite`, and the xc and yc are found by applying the source's pointing model to a Vizier query cut to only include stars brighter than the cutoff (see the query sketch after the list). In principle the magnitudes also give an expected average flux from each target, but I haven't had favorable results using this.
- Airy and Zernike models: the Airy model weirdly doesn't work at all, but that should be a simple bugfix if desired. The Zernike model works well for a single target. In a crowded field, it still gives a better MSE loss than the Gaussian but a less visibly coherent light curve/higher CDPP. This suggests MSE loss alone may not be the best metric (some sort of cross-correlation with the fluxes? a sketch follows the list), and also that some of the simplifying assumptions I made to keep the Zernike fit fast (e.g. that the Zernikes don't change with xshift/yshift beyond what can be captured with a regularizing exponential) might not hold - experimentation with this to come in the next few days!
- Refactor so that it's easier to add models and optimizers now, with less redundant code in both `models.py` and `targetdata.py:psf_lightcurve`.
- The addition of `prf.py`, mostly from Ben's notebook from online.tess.science - it's called by the Zernike model, but I think the ideal would be for each model to fit to it and use that as its starting point, and/or for us to force model parameters not to drift too far from it (see the anchoring sketch after the list).
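
To make the scipy+pytorch point concrete, here's a minimal sketch of the coupling: PyTorch computes the loss and its gradient via automatic differentiation, `scipy.optimize.minimize` drives the fit, and `torch.clamp` stands in for the dropped parameter bounds. The toy Gaussian model, parameter names, and data below are placeholders for illustration, not the branch's actual API.

```python
import numpy as np
import torch
from scipy.optimize import minimize

# Placeholder "data": one background-subtracted cutout (pure illustration).
data = torch.rand(13, 13, dtype=torch.float64)
yy = torch.arange(13, dtype=torch.float64).reshape(-1, 1)
xx = torch.arange(13, dtype=torch.float64).reshape(1, -1)

def psf_model(params):
    """Toy Gaussian PSF; stands in for any differentiable model."""
    flux, xc, yc, sigma = params
    sigma = torch.clamp(sigma, min=0.5)  # soft bound in place of scipy bounds
    r2 = (xx - xc) ** 2 + (yy - yc) ** 2
    return flux * torch.exp(-r2 / (2 * sigma ** 2))

def loss_and_grad(theta):
    """scipy callback: MSE loss and its gradient from autograd."""
    params = torch.tensor(theta, dtype=torch.float64, requires_grad=True)
    loss = torch.mean((psf_model(params) - data) ** 2)
    loss.backward()
    return loss.item(), params.grad.numpy()

x0 = np.array([1.0, 6.0, 6.0, 1.5])  # flux, xc, yc, sigma
result = minimize(loss_and_grad, x0, jac=True, method="L-BFGS-B")
print(result.x)
```

With `jac=True`, scipy expects the callback to return the `(loss, gradient)` pair, so each evaluation is a single forward+backward pass through the model.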
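On the `bkg_mag_cutoff` bullet, here's a rough sketch of what a magnitude-cut Vizier query could look like with astroquery; the catalog ID, column names, and the `pointing_model` call are illustrative assumptions, not necessarily what the branch does.

```python
import astropy.units as u
from astropy.coordinates import SkyCoord
from astroquery.vizier import Vizier

def query_bright_sources(ra, dec, bkg_mag_cutoff, radius=0.1 * u.deg):
    """Return catalog stars brighter than bkg_mag_cutoff near (ra, dec) in degrees.

    The catalog ID and column names below are assumptions for illustration
    (TIC v8 as hosted on Vizier); the branch may use different ones.
    """
    v = Vizier(columns=["RAJ2000", "DEJ2000", "Tmag"],
               column_filters={"Tmag": f"<{bkg_mag_cutoff}"},
               row_limit=-1)
    coord = SkyCoord(ra=ra * u.deg, dec=dec * u.deg)
    tables = v.query_region(coord, radius=radius, catalog="IV/38/tic")
    return tables[0] if tables else None

# The xc/yc for the PSF fit would then come from applying the source's
# pointing model to these sky positions, e.g. (hypothetical call):
#   xc, yc = source.pointing_model(table["RAJ2000"], table["DEJ2000"])
```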
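On the "MSE alone may not be the best metric" thought, one candidate is a normalized cross-correlation term; the sketch below is just an illustration of the idea, not code from the branch.

```python
import torch

def ncc_loss(model_image, data_image, eps=1e-8):
    """Negative normalized cross-correlation between model and data.

    Unlike MSE, this rewards matching the *shape* of the flux distribution
    rather than its absolute scale; minimizing it drives the correlation
    toward 1. A sketch only - not the metric used on the branch.
    """
    m = model_image - model_image.mean()
    d = data_image - data_image.mean()
    ncc = (m * d).sum() / (torch.sqrt((m ** 2).sum() * (d ** 2).sum()) + eps)
    return 1.0 - ncc

# Could be blended with the existing objective, e.g.:
#   total = mse + weight * ncc_loss(model_image, data_image)
```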
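And for the `prf.py` idea - fit each model to the PRF first, then keep parameters from drifting - one simple way to express that is a quadratic penalty anchored at the PRF-fit parameters. Everything below (the two-stage workflow, `prf_image`, `strength`) is a hypothetical sketch, not the branch's implementation.

```python
import torch
from scipy.optimize import minimize

def fit(loss_and_grad, x0):
    """Thin wrapper around scipy's L-BFGS-B with autograd gradients."""
    return minimize(loss_and_grad, x0, jac=True, method="L-BFGS-B").x

def make_loss(target_image, model, anchor=None, strength=0.0):
    """Build a scipy-compatible loss; optionally anchor params near `anchor`."""
    def loss_and_grad(theta):
        params = torch.tensor(theta, dtype=torch.float64, requires_grad=True)
        loss = torch.mean((model(params) - target_image) ** 2)
        if anchor is not None:
            loss = loss + strength * torch.sum((params - anchor) ** 2)
        loss.backward()
        return loss.item(), params.grad.numpy()
    return loss_and_grad

# Hypothetical two-stage workflow:
#   prf_params = fit(make_loss(prf_image, model), x0)      # 1. fit model to prf.py output
#   best = fit(make_loss(data_image, model,                 # 2. fit to data, anchored to PRF fit
#                        anchor=torch.tensor(prf_params), strength=0.1),
#              prf_params)
```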
Cool! Let me know when it's ready for me to take a look at it, and what parts in particular. And if you have a test notebook that demonstrates the changes, that's even better!