Standardize the correction for atmospheric path length for solar channels
At the RGB workshop 2025 it was proposed to settle on the correction method to recommend for normalizing solar channels by the atmospheric path length. I therefore decided to dig into this and prepare some examples to give an overview. I am putting the results of my analysis in this issue so that we can also discuss the best approach within our community.
In short we have two corrections available:
- `sunz_corrected` - standard `1/cos(sza)` correction: https://github.com/pytroll/satpy/blob/main/satpy/modifiers/angles.py#L534
- `effective_solar_pathlength_corrected` - following Li and Shibata (2006): https://github.com/pytroll/satpy/blob/main/satpy/utils.py#L270
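For reference, these two raw correction factors boil down to the following (a minimal numpy sketch for illustration; the linked satpy code is the authoritative implementation):

```python
import numpy as np


def sunz_correction(sza_deg):
    # Standard 1/cos(sza) correction; undefined at SZA = 90 degrees.
    return 1.0 / np.cos(np.deg2rad(sza_deg))


def li_shibata_correction(sza_deg):
    # Li and Shibata (2006) effective solar path length parameterization,
    # as used by effective_solar_pathlength_corrected; stays finite at 90 degrees.
    cos_sza = np.cos(np.deg2rad(sza_deg))
    return 24.35 / (2.0 * cos_sza + np.sqrt(498.5225 * cos_sza**2 + 1.0))
```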
Without any modification, the "raw" versions of the correction formulas look like this:
Setting `correction_limit` to a high value (120 degrees), beyond which we have no meaningful VIS/NIR data anyway, is just a hack to disable any reduction/modification of the correction and thus get the "raw" version.
Looking at the Cloud Phase RGB, this results in the following imagery (no Rayleigh correction)
To tackle the fact that 1/cos(sza) is undefined at SZA=90 and not meaningful beyond 90 degrees, we have functionality to reduce the correction starting at a given angle (`correction_limit`) until it reaches 0.0 at `max_sza`. This reduction is applied by default in satpy for both `sunz_corrected` and `effective_solar_pathlength_corrected`, using `correction_limit=88` and `max_sza=95`. However, it results in an unnaturally sharp transition at `correction_limit` (88 degrees), as shown here:
and results in this imagery
Although the left image using `sunz_corrected` now looks much better and retains information beyond 90 degrees, a bright line is introduced parallel to the terminator (commonly visible in animations as well). The same bright line is introduced for `effective_solar_pathlength_corrected`, where in addition some information at very high angles is lost compared to the raw version of `effective_solar_pathlength_corrected` above.
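For illustration, the default reduction amounts to something like the following (a simplified linear sketch reusing the raw correction functions above, not the exact satpy formula):

```python
import numpy as np


def reduced_correction(sza_deg, corr_func, correction_limit=88.0, max_sza=95.0):
    # corr_func is one of the raw correction factors sketched above. Up to
    # correction_limit the raw correction is used; beyond it, the value at
    # correction_limit is ramped down to 0.0 at max_sza. The kink at
    # correction_limit is what shows up as the sharp transition in the imagery.
    sza_deg = np.asarray(sza_deg, dtype=float)
    ramp = np.clip((max_sza - sza_deg) / (max_sza - correction_limit), 0.0, 1.0)
    return np.where(sza_deg <= correction_limit,
                    corr_func(sza_deg),
                    corr_func(correction_limit) * ramp)
```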
The bright line is introduced by the sharp transition at 88 degrees seen above. This can be mitigated by instead applying a constant correction beyond a given SZA, which is achieved by setting `max_sza` to `None`; the correction at `correction_limit` is then used for all larger angles. With this approach and some testing, `correction_limit=86` turned out to work quite well in the sense of not producing bright artefacts close to the terminator:
resulting in this imagery.
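In code, the `max_sza=None` behaviour essentially amounts to clipping the solar zenith angle at `correction_limit` before applying the correction (again only a sketch, not the actual satpy implementation):

```python
import numpy as np


def constant_beyond_limit(sza_deg, corr_func, correction_limit=86.0):
    # max_sza=None case: the correction at correction_limit is reused for all
    # larger angles, avoiding both the blow-up of 1/cos(sza) at 90 degrees and
    # the kink introduced by the default reduction.
    sza_clipped = np.minimum(np.asarray(sza_deg, dtype=float), correction_limit)
    return corr_func(sza_clipped)
```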
The same imagery but for the True Color RGB (including default Rayleigh correction) looks like this:
So to me the `effective_solar_pathlength_corrected` correction clearly performs better than `sunz_corrected` and can also be used as is. This is not in line with our current implementation, where the reduction between 88 and 95 degrees is applied by default. `sunz_corrected` can still be used, but I think the best results are achieved with a constant correction beyond `correction_limit`.
Based on this I proposed adding the following text, to which no-one objected, to the best practices document that will be one of the outputs from the workshop:
Solar channels should be normalized by the atmospheric path length. The Li and Shibata (2006) parameterization is recommended for this purpose:
correction_factor = 24.35 / (2*cos(SZA) + sqrt(498.5225*cos(SZA)^2 + 1))
If the alternative normalization using 1/cos(SZA) is used, a constant value should be applied beyond ~86 degrees to avoid artefacts and a sharp cutoff at 90 degrees.
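As a quick numeric sanity check of the two normalizations (a small snippet using the formula above; the printed values are approximate):

```python
import numpy as np

for sza in (0.0, 60.0, 85.0, 90.0):
    cos_sza = np.cos(np.deg2rad(sza))
    li_shibata = 24.35 / (2.0 * cos_sza + np.sqrt(498.5225 * cos_sza**2 + 1.0))
    one_over_cos = np.inf if sza >= 90.0 else 1.0 / cos_sza
    print(f"SZA={sza:4.0f}  1/cos(SZA)={one_over_cos:7.2f}  Li&Shibata={li_shibata:6.2f}")

# Roughly: 1.00 vs 1.00 at 0 degrees, 2.00 vs 1.99 at 60 degrees,
# 11.47 vs 10.31 at 85 degrees, and infinity vs 24.35 at 90 degrees,
# i.e. the Li and Shibata factor stays finite across the terminator.
```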
Talking to other colleagues at the workshop (Meteo-France and CIRA), I also found out that they use the Li and Shibata parameterization without any reduction/modification.
Hence, I propose the following changes in Satpy:
- Remove the default reduction applied to `effective_solar_pathlength_corrected`. We can keep the functionality, but it should not be done by default.
- Start transitioning from `sunz_corrected` to `effective_solar_pathlength_corrected` (without reduction) in the composite YAML recipes.
- Possibly change the default `max_sza` for `sunz_corrected` from 95 to `None` and possibly reduce the `correction_limit` to 86 - not sure about this one though; it might need more testing.
A while back I proposed refactoring the code for the atmospheric path length corrections (see #2955), so I suggest tackling the first point together with that one. The second point can probably be done step by step by multiple developers working on the RGB recipes.
Thoughts?
Nice write up! Nice images! I wonder if this is controversial enough (I think I'm fine with all of it) that we might want the choice of atmospheric path length method to be controlled by a `satpy.config` value, defaulting to Li and Shibata of course. But doing something like this then raises the question: do we also make it easy to configure the correction limit and max sza with `satpy.config` options?
If the change to the Li and Shibata method by default does not seem controversial (or not very), then I think just changing the default everywhere, and even having it called "sunz_corrected", would be fine with me. Or is there a more generic name for this operation? "Correction of atmospheric path length"?
We could make a modifier wrapper class that:
- Wraps both sunz and li/shibata modifiers.
- Checks `satpy.config` or a YAML configuration keyword argument for which method to use.
- Could also take correction limit and max sza kwargs from YAML or `satpy.config`, and/or only apply them to the "sunz" method whether they are provided or not.
- Calls the appropriate method and returns the result.
This way, users wanting to change which method is used wouldn't need to change every recipe or touch the YAML at all.
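To make this a bit more concrete, here is a very rough sketch of what such a wrapper could look like. The class name and the `path_length_correction` config key are made up for illustration, and the way the delegate modifier is constructed and called is simplified compared to satpy's actual modifier machinery:

```python
import satpy
from satpy.modifiers import (EffectiveSolarPathLengthCorrector, ModifierBase,
                             SunZenithCorrector)

_METHODS = {
    "1/cos(sza)": SunZenithCorrector,
    "li_shibata": EffectiveSolarPathLengthCorrector,
}


class ConfigurablePathLengthCorrector(ModifierBase):
    """Dispatch to 1/cos(sza) or Li and Shibata depending on configuration."""

    def __call__(self, projectables, **info):
        # Method from the YAML recipe ("method" keyword) or, as a fallback,
        # from a hypothetical satpy.config key, defaulting to Li and Shibata.
        method = self.attrs.get("method") or satpy.config.get(
            "path_length_correction", "li_shibata")
        # Forward correction_limit/max_sza only when they were explicitly given,
        # e.g. so they can stay restricted to the "1/cos(sza)" method.
        kwargs = {key: self.attrs[key]
                  for key in ("correction_limit", "max_sza") if key in self.attrs}
        delegate = _METHODS[method](
            name=self.attrs["name"],
            optional_prerequisites=self.attrs.get("optional_prerequisites", []),
            **kwargs)
        return delegate(projectables, **info)
```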
To make my usual comment: just as long as this is only applied as an enhancement/modifier when saving RGBs! We need to make sure that users who want the actual reflectance data either get it without SZA correction, or with the cos(SZA) correction. :)
(edit) For what it's worth, I agree with Johan's first two proposals. For the third one (`max_sza` and `correction_limit`) I also agree in principle but want to note that a `max_sza` of 95 is pointless anyway; aside from VIIRS/DNB, none of the instruments satpy supports are sensitive enough to measure a signal at that SZA in any case ;)
Good point(s) @simonrp84. I think that is all doable if people approve of my idea. The composites could basically have 3 defined modifiers (if we'd like): sunz, pathlength, and config_pathlength. So all previous advice to users to request `sunz_corrected` in a `DataQuery` to get a 1/cos(SZA)-style reflectance would still be accurate.
Great investigative work @strandgren! I'm in favour of replacing the sunz correction with the solar pathlength one. However, redefining what "sunz_corrected" means goes a bit too far imo, because I can think of at least one standard format where the sunz correction (the 1/cos(theta) one) is already included in the data and is flagged as sunz_corrected.
What I also proposed in the refactoring issue #2955 is to refactor the `satpy.modifiers.SunZenithCorrector` class to support both variants and then define the corrections like this:
```yaml
sunz_correction:
  modifier: !!python/name:satpy.modifiers.SunZenithCorrector
  method: "1/cos(sza)"  # Probably need a better name.
  optional_prerequisites:
    - solar_zenith_angle

effective_solar_pathlength_corrected:
  modifier: !!python/name:satpy.modifiers.SunZenithCorrector
  method: "li_shibata"  # Default?
  optional_prerequisites:
    - solar_zenith_angle
```
This wouldn't break anything, and we'd just need to update the modifier name in the YAML recipes where we think it's appropriate. I probably know too little about the design to know how this would fit with the proposed config option.
I also see the problem with data coming with the 1/cos(sza) normalization already applied. Not sure how to deal with that. Either keep as is or first de-normalize and then do the Li and Shibata normalization 😄
@simonrp84 I agree that we should not do this by default, only in RGBs or if the user explicitly asks for it when loading data. But I don't see that we necessarily need 1/cos(sza) for a reflectance? This parameterization was derived for RTM simulations and GCMs with the idea of better accounting for Earth curvature and atmospheric refraction when estimating the atmospheric path length, compared to the simple 1/cos(sza), which clearly becomes unphysical when approaching and exceeding 90 degrees.
@strandgren For quantitative use of the data I think it's important that the result becomes unphysical as the sun reaches the horizon: this indicates to the user that they're doing something wrong. Reflectance as a quantity makes no sense at very high SZA because you see a much larger proportion of diffuse rather than direct radiation (at least in the VIS channels). Having the "reflectance" explode at least lets users know that there's a problem. If we applied a correction that looks nicer at these zenith angles, non-expert users might not be aware of the issue and might use the reflectance data for quantitative use cases where it really makes no sense. Another point: reflectance is well understood to be either the ratio of outgoing to incoming radiance, or that ratio corrected by cos(sza). It'd be incredibly difficult, even if it did make physical sense, to convince scientists that the Li and Shibata correction still yields a reflectance!
Just to stress, I'm strongly in favour of applying this updated correction as you describe for RGBs (it looks great), but strongly against doing so for any quantitative use of the data at all. For that we need to stick to either no SZA correction or the simple cos(sza) correction.
Thanks for explaining @simonrp84, makes sense 🙂
I agree with @simonrp84 in that data that is not for visualisation should not use the effective pathlength for normalisation.
I’m just wondering how we can accommodate the two use cases in the same code. The method keyword you propose is good for pre-configured RGBs, but for interactive use, should we have a config item for that (defaulting to one over cos)?
> (edit) For what it's worth, I agree with Johan's first two proposals. For the third one (`max_sza` and `correction_limit`) I also agree in principle but want to note that a `max_sza` of 95 is pointless anyway; aside from VIIRS/DNB, none of the instruments satpy supports are sensitive enough to measure a signal at that SZA in any case ;)
In the case of high thunderclouds I think 95 degrees is still plausibly within sunlight. It might be useful to see those clouds and at least not mask them out in some cases.
> I agree with @simonrp84 in that data that is not for visualisation should not use the effective pathlength for normalisation. I’m just wondering how we can accommodate the two use cases in the same code. The `method` keyword you propose is good for pre-configured RGBs, but for interactive use, should we have a config item for that (defaulting to one over cos)?
@mraspaud what do you mean by interactive use? Like this: `scn.load(dataset, modifiers=["sunz_corrected"])`?
> > (edit) For what it's worth, I agree with Johan's first two proposals. For the third one (`max_sza` and `correction_limit`) I also agree in principle but want to note that a `max_sza` of 95 is pointless anyway; aside from VIIRS/DNB, none of the instruments satpy supports are sensitive enough to measure a signal at that SZA in any case ;)
>
> In the case of high thunderclouds I think 95 degrees is still plausibly within sunlight. It might be useful to see those clouds and at least not mask them out in some cases.
Yes, and I don't really see the need for phasing out the correction with the default reduction that we have now. If the correction factor is kept constant (as in my experimental example above), the corrected signal will fade to zero anyway as the underlying signal decreases.
> @mraspaud what do you mean by interactive use? Like this: `scn.load(dataset, modifiers=["sunz_corrected"])`?
yes
> > @mraspaud what do you mean by interactive use? Like this: `scn.load(dataset, modifiers=["sunz_corrected"])`?
>
> yes
I assumed that this pre-configured modifier was used in that case: https://github.com/pytroll/satpy/blob/248be12975132c98bfce5bc43b5aaa0905d49254/satpy/etc/composites/visir.yaml#L9 (or an instrument-specific one, if available)? In that case it should be possible to define a `method` in the YAML, no?
yes, that should be fine.
I’m now just wondering if `method` is really needed for the sunz_correction case: the 1/cos(sza) correction is so widespread that it can probably be its own thing…
Hey guys! Is that the reason why I get really different results, e.g. in GeoColor for GOES? This is my version
and this is what I get from CIRA slider (bounds are not exactly the same)
@guidocioni No, I don't think so. The topic in this issue only affects the imagery at very high solar zenith angles (> 85 degrees), and this scene seems to be well illuminated. I think the reason is rather the different methods used to simulate the green band, which is not available on GOES/ABI.
Related PR (although only for day cloud type and day cloud phase): https://github.com/pytroll/satpy/pull/3032
For information: at DWD, we have been using Li and Shibata with a constant correction since April 2023 for all our single-channel solar imagery and daytime-only RGBs.
I added some day cloud type and day cloud phase images with Li and Shibata and the constant correction (`max_sza: !!null`), varying only `correction_limit`, at https://github.com/pytroll/satpy/pull/3032#issuecomment-2880345934.