Ray object dl for SPH data: interpolate kernel table for (b/hsml)**2 instead of b/hsml
Fixes a (possible) bug in the calculation of path lengths dl for SPH/non-grid data in yt `Ray` objects. This affects, e.g., column density and absorption spectrum calculations, such as those done in the Trident package.
PR Summary
Replace

```python
dl = itab.interpolate_array(b / hsml) * mass / dens / hsml**2
```

with

```python
dl = itab.interpolate_array((b / hsml)**2) * mass / dens / hsml**2
```

in the `_generate_container_field_sph` method of the yt `Ray` object (https://github.com/yt-project/yt/blob/main/yt/data_objects/selection_objects/ray.py).
This is the proposed fix for the possible bug described in issue #4781.
In short, the `_generate_container_field_sph` method retrieves path lengths dl for normalized SPH kernels by linearly interpolating values precomputed for a smallish set of (impact parameter / smoothing length) values. However, if I'm not mistaken, the table actually stores those dl values as a function of (impact parameter / smoothing length)^2. That means we need to pass `(b / hsml)**2` into the interpolation function, not `b / hsml`.
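To make the distinction concrete, here is a minimal, self-contained sketch of the failure mode (this is not yt's actual table code; `interp`, `q2_nodes`, and `f_nodes` are illustrative names): if the table's sample points are spaced in q2 = (b/hsml)**2, then feeding the interpolant q = b/hsml evaluates it at the wrong abscissa.

```python
# Illustrative sketch, not yt code: a lookup table whose sample points
# are uniform in q2 = (b/hsml)**2, as argued above for yt's kernel table.

def interp(x, xs, ys):
    """Plain linear interpolation, clamped to the table ends."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    for i in range(1, len(xs)):
        if x <= xs[i]:
            frac = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + frac * (ys[i] - ys[i - 1])

# Table sampled uniformly in q2, storing a stand-in for the kernel
# line integral, here f(q2) = (1 - q2)**2:
n = 8
q2_nodes = [i / (n - 1) for i in range(n)]
f_nodes = [(1.0 - q2) ** 2 for q2 in q2_nodes]

b_over_h = 0.5
wrong = interp(b_over_h, q2_nodes, f_nodes)       # evaluated at q2 = 0.5
right = interp(b_over_h ** 2, q2_nodes, f_nodes)  # evaluated at q2 = 0.25
print(wrong, right)  # the two differ noticeably
```

The two calls land in entirely different table bins, which is why the fix changes the interpolated dl values everywhere except at the table endpoints.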
PR Checklist
- [ ] New features are documented, with docstrings and narrative docs
- [x] Adds a test for any bugs fixed. Adds tests for new features.
The only documentation issue I can see here is that, if this was indeed a bug, we should probably send out a message warning people who might have used, or be using, `Ray` objects for papers. As for tests, @chummels mentioned a possible test on the Slack channel involving SPH-ifying a grid dataset and comparing the surface/column densities retrieved from both.
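In the same spirit as that suggested comparison test, one can check numerically that the line-of-sight integral of a normalized kernel, summed over the kernel's cross-section, recovers unity, so that summing density times dl over sightlines recovers the particle mass. A hedged, stand-alone sketch using the standard M4 cubic-spline kernel (not yt's table code; function names are illustrative):

```python
import math

def w_cubic(q):
    """3D M4 cubic-spline kernel with h = 1, normalized so its volume integral is 1."""
    sigma = 8.0 / math.pi
    if q < 0.5:
        return sigma * (1.0 - 6.0 * q * q + 6.0 * q ** 3)
    if q < 1.0:
        return sigma * 2.0 * (1.0 - q) ** 3
    return 0.0

def column(b, nz=2000):
    """Dimensionless line integral of the kernel at impact parameter b (midpoint rule)."""
    zmax = math.sqrt(max(1.0 - b * b, 0.0))
    if zmax == 0.0:
        return 0.0
    dz = zmax / nz
    # factor 2 accounts for the symmetric z < 0 half of the sightline
    return 2.0 * sum(w_cubic(math.hypot(b, (i + 0.5) * dz)) * dz
                     for i in range(nz))

# Integrate the column over the kernel's cross-section:
# integral of column(b) * 2*pi*b db over b in [0, 1] should be 1.
nb = 400
db = 1.0 / nb
total = sum(column((i + 0.5) * db) * 2.0 * math.pi * ((i + 0.5) * db) * db
            for i in range(nb))
print(total)  # close to 1 for a correctly normalized kernel
```

A table built by evaluating `column` on a grid of q2 values, then queried at `(b / hsml)**2`, would pass this normalization check; querying it at `b / hsml` would not.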
Hi! Welcome, and thanks for opening this pull request. We have some guidelines for new pull requests, and soon you'll hear back about the results of our tests and continuous integration checks. Thank you for your contribution!
I believe that the analysis in #4781 is correct! Thank you for your detailed report, @nastasha-w -- this is a great find, and very appreciated. I am going to mark approve, but I think for a change of this magnitude (even if it is just one line!) we do need another set of eyes.
This is completely out of my expertise, so I'll leave the review to others; I just wanted to ask whoever ends up pushing the button to please triage this to 4.3.1 or 4.4.0, depending on the expected impact. Thanks!
@matthewturk Thanks! I agree it's best to be sure about this. I did add a test for `Ray` objects to this pull request. I don't think it's entirely complete; for one thing, the mass / density factor in the dl calculation is always 1 in the mock dataset I used. However, the test passes on this branch and fails on the main branch (as of two commits ago, anyway).
The new test is `test_ray_particle2()` in `test_rays.py`. It just uses a bunch of asserts and returns `None` if all checks pass. Is there anything else I need to do to incorporate this into automatic testing?
pre-commit.ci autofix
@matthewturk @jzuhone @langmm @chummels could you take a look at this pull request and approve it or give me some feedback? This slipped my mind for quite a while, but I do think there is an error in the code that this fixes. I merged in the current (well, very recent at least) main branch, and the tests are passing now.
@neutrinoceros can I ask what the 4.4.0 milestone means? The github docs didn't seem very helpful on this.
It means I think it's reasonable to expect this will be merged before we release yt 4.4.0, but it's more of a wishlist and less of a hard requirement.
@nastasha-w I am satisfied with this. @chummels, let us know what you think; @nastasha-w added some tests which I think are helpful.
@jzuhone thanks!