Add Stüve, Emagram plots
This PR enables Stüve and Emagram plots, which are thermodynamic diagrams similar to SkewT.
Description Of Changes
Two new classes, named Stuve and Emagram, are constructed as child classes of the SkewT parent class. They create Stüve diagrams and Emagrams, respectively. As child classes, they inherit plotting capabilities from SkewT (e.g. data, wind barbs, dry and saturated adiabats, and mixing ratio lines).
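As a rough illustration of the subclassing pattern described above, a child class can override only the coordinate setup while inheriting every plotting method unchanged. This is a stdlib-only sketch, not MetPy's actual code: `SkewTLike` and `StuveLike` are hypothetical stand-ins for `SkewT` and `Stuve`.

```python
# Stdlib-only sketch of the inheritance pattern described above.
# SkewTLike/StuveLike are hypothetical stand-ins, not MetPy's real classes.

class SkewTLike:
    """Stand-in for metpy.plots.SkewT."""

    rotation = 30  # degrees: SkewT draws isotherms skewed to the right

    def plot_dry_adiabats(self):
        # Inherited unchanged by child classes.
        return 'dry adiabats'

    def plot_mixing_lines(self):
        return 'mixing ratio lines'


class StuveLike(SkewTLike):
    """Stand-in for the PR's Stuve class.

    A Stüve diagram keeps the temperature axis unrotated and scales
    pressure as p**kappa, so only the coordinate setup differs; all
    plotting methods come from the parent.
    """

    rotation = 0  # unskewed temperature axis
```

In the actual PR, `Stuve` and `Emagram` would analogously override only the axes/transform setup, so calls such as `plot_barbs` and `plot_dry_adiabats` work exactly as they do on `SkewT`.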
I confirmed that the Advanced Sounding example works with the new Stuve and Emagram classes as drop-in replacements for SkewT.
Closes #837 and addresses the Stüve request in #418.
Checklist
- [x] Closes #837
- [ ] Tests added. The tests in `tests/plots/test_skewt.py` could be repeated for `Stuve` and `Emagram`, but this has not been done.
- [x] Fully documented
Thanks for the PR, and congrats on your first contribution here! This is super helpful stuff. We can start to review the implementation in the coming days/weeks. We will need new tests for these classes; do let us know if that's a pain point for you.
Also, if it's helpful, most of our linting headaches can be automatically fixed or made known before committing by following the Code Style steps within the MetPy dev guide!
@dcamron, I added pytest files for the new classes. The CI tests pass on most platforms and Python versions, but not all. The failed test artifacts that I checked showed a couple of single-pixel shifts, nothing a user would ever notice. From a user perspective, all of the test results look fine to me. This is the limit of what I can provide, so I hope that someone more familiar with `pytest --mpl` can look at this or relax the test failure thresholds.
I created the baseline images with Python 3.12.5, Matplotlib 3.9.2, and latest MetPy (with modifications) on MacOS.
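For reference, this is roughly the pytest-mpl workflow for regenerating and checking baseline images (`--mpl-generate-path` and `--mpl` are pytest-mpl's documented flags; the paths here are illustrative assumptions, not necessarily MetPy's layout):

```shell
# Regenerate baseline images with pytest-mpl (paths are illustrative):
pytest --mpl-generate-path=tests/plots/baseline tests/plots/test_skewt.py

# Then run the tests, comparing results against the baselines:
pytest --mpl tests/plots/test_skewt.py
```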
Here is an example of a test result that was classified as a failure, but looks fine to me. This is Conda Tests / macOS 3.10.
Baseline:
Result:
Difference:
Yeah, that kind of failure is something where I double-check the uploaded test image and then just set the threshold to something slightly above the failure RMS, which should let the test pass. If you're not up to doing that, we'll be happy to make those updates.
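For concreteness, here is a minimal sketch of what relaxing such a threshold looks like with pytest-mpl. The test name and tolerance value are illustrative assumptions; `tolerance` is pytest-mpl's RMS image-difference threshold keyword.

```python
import pytest

# Sketch of loosening a pytest-mpl threshold (name/value illustrative).
# `tolerance` is the RMS image-difference threshold; setting it slightly
# above the observed failure RMS lets near-identical images pass.

@pytest.mark.mpl_image_compare(remove_text=True, tolerance=0.05)
def test_stuve():
    ...  # would build and return the Stuve figure to compare
```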
I am not familiar with how to change pytest failure thresholds, so I would appreciate it if you could do that. Thanks!
@dcamron and @dopplershift, Is there anything that I can do to get this PR merged? Thanks!
Thanks for the ping! We would love to include this in an upcoming MetPy feature release, so look forward to a review in the coming weeks.