
Cleaning up codebase and restructuring OpticalThickness calculation

Open AndrewILWilliams opened this issue 3 years ago • 3 comments

This PR fixes a couple of the issues in #16 related to unused code and global instances.

Changes:

  1. The use_numba option is deprecated. It may be possible to restore it in future, but for now the numba functionality is certainly not robust.

  2. Lots of tidying in Absorption_Crosssections_HITRAN2016:

  • Removed ClimateUtilities import, added import numpy instead
  • Total internal partition functions QGenericLin and QGenericNonLin have been replaced with one function which takes the power law exponent as an input. I've made this exponent part of the molecules dictionary which is defined near the top.
  • All global instances removed! This makes the code much more readable and easier to debug, and is also one step towards restoring the numba functionality, although not quite there yet...
  • Removed Ray P's old plotting and analysis functions
  • Removed long comments (from Ray I think) which give way too much detail
  3. Call loadSpectralLines() BEFORE the main loops in OpticalThickness.py
  • Now it's only called once per calculation, as required.
  4. Deleted ClimateGraphics.py, ClimateGraphics_MPL.py, DummyGraphics.py and ClimateUtilities.py
  • I had a look through and none of it was being used anywhere, even though some of it was occasionally imported.
  5. Removed unused functions in phys.py
  • Lots of bloat here that was inherited from the original PyTran script I think.
  6. Removed object_helpers.py
  • Again, seems to be legacy code from PyTran.
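
The partition-function change in item 2 could look roughly like the following. This is a minimal sketch assuming the standard power-law temperature scaling (exponent 1 for linear molecules, 1.5 for nonlinear ones); the names `Q_generic` and `Q_exponent` and the dictionary layout are illustrative, not PyRADS's actual API:

```python
# Illustrative molecules dictionary; the exponent n sets Q(T) ~ T**n
# (n = 1 for linear molecules, n = 1.5 for nonlinear ones).
molecules = {
    "CO2": {"Q_exponent": 1.0},
    "H2O": {"Q_exponent": 1.5},
}

def Q_generic(T, Q_ref, n, T_ref=296.0):
    """Total internal partition function via a single power law,
    Q(T) = Q(T_ref) * (T / T_ref)**n, replacing the separate
    linear/nonlinear variants with one function."""
    return Q_ref * (T / T_ref)**n
```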

The tests all run okay. I'd recommend running them yourself on your platform first, though, and checking that you get the same numbers as the base version before merging, just as a sanity check! :)

Also, as far as the user is concerned there aren't any changes to the API; it's just cleaner behind the scenes!

AndrewILWilliams avatar Mar 04 '21 17:03 AndrewILWilliams

@ddbkoll I think you're correct about speeding up the computeAbsorption() function using numpy as well, I'll look into that tomorrow!

On that note though, could you explain what's happening towards the end of computeAbsorption() and computeAbsorption_fixedCutoff()? Particularly these lines?

iw = int(len(waveGrid)*(n-waveStart)/(waveEnd-waveStart))
nsum = int(nWidths*gam/dWave)
i1 = max(0,iw-nsum)
i2 = min(len(waveGrid)-1,iw+nsum)
if i2>0:
    dn = numpy.arange(i1-iw,i2-iw)*dWave
    abs = S*gam/(math.pi*( dn**2 + gam**2))
    absGrid[i1:i2] += abs

AndrewILWilliams avatar Mar 04 '21 19:03 AndrewILWilliams

abs = S*gam/(math.pi*( dn**2 + gam**2)) evaluates the absorption cross-section due to a single line as a function of wavenumber, assuming a Lorentz line shape. This should actually be the computationally more complicated Voigt line shape (add this to the pyrads to-do list), but for the purposes of computing OLR a Lorentz line shape seems to do pretty well.

absGrid[i1:i2] += abs is then the net absorption, computed by adding up lots of individual lines.

The rest just sets up a temporary wavegrid dn which is centered on the line center, doesn't run out of the bounds of the overall waveGrid (hence the min/max), and extends "nWidths*gam" cm-1 away from the line center.
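
For reference, the quoted logic can be sketched as a standalone function, assuming a uniform waveGrid. Names mirror the snippet above, but the default nWidths and the packaging as a function are illustrative:

```python
import numpy as np

def lorentz_line(waveGrid, n0, S, gam, nWidths=25.0):
    """Sketch: add one Lorentz line of strength S, centered at wavenumber n0
    with half-width gam (cm^-1), to a uniform grid. Mirrors the windowing
    logic in computeAbsorption(); nWidths here is an illustrative default."""
    waveStart, waveEnd = waveGrid[0], waveGrid[-1]
    dWave = waveGrid[1] - waveGrid[0]               # assumes a uniform grid
    absGrid = np.zeros_like(waveGrid)

    iw = int(len(waveGrid) * (n0 - waveStart) / (waveEnd - waveStart))  # index of line center
    nsum = int(nWidths * gam / dWave)               # half-window in grid points
    i1 = max(0, iw - nsum)                          # clip window to grid bounds
    i2 = min(len(waveGrid) - 1, iw + nsum)
    if i2 > 0:
        dn = np.arange(i1 - iw, i2 - iw) * dWave                 # offsets from line center
        absGrid[i1:i2] += S * gam / (np.pi * (dn**2 + gam**2))   # Lorentz profile
    return absGrid
```

Since the Lorentz profile integrates to S over all wavenumbers, summing the result times dWave recovers most of S (the tails beyond nWidths half-widths are truncated).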

danielkoll avatar Mar 12 '21 12:03 danielkoll

Ok, thanks! That makes it a lot clearer now. Under what conditions would i2>0 not hold, then? It seems like as long as you have a non-trivial wavegrid it will always be true.

I'll have a think about whether it's feasible to use numpy to remove the loop in this function; it might be a bit tricky/memory-intensive to set up all of the temporary wavegrids at once, though.
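
A fully broadcast version (dropping the per-line windowing) would look something like this sketch; the function name and argument layout are hypothetical, and the (n_lines x n_grid) intermediate array is exactly the memory concern mentioned:

```python
import numpy as np

def compute_absorption_vectorized(waveGrid, lineCenters, strengths, gams):
    """Sketch of a loop-free Lorentz sum: broadcast every line against the
    whole grid at once. Drops the per-line window, so the temporary array
    is n_lines x len(waveGrid) -- fast for few lines, memory-hungry for many."""
    dn = waveGrid[None, :] - lineCenters[:, None]    # (n_lines, n_grid) offsets
    prof = strengths[:, None] * gams[:, None] / (np.pi * (dn**2 + gams[:, None]**2))
    return prof.sum(axis=0)                          # accumulate all lines
```

A middle ground would be chunking the lines, or keeping the window but scatter-adding each windowed profile, which bounds the temporary memory.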

AndrewILWilliams avatar Mar 12 '21 12:03 AndrewILWilliams