Debug optical boundary crossing
While working on implementing optical boundary crossing, I've encountered some failing tests for the optical PrimaryGenerator when comparing vecgeom + g4 builds against orange + geant4 builds. The vecgeom builds have slightly lower step counts, and sometimes different step iterations, than the orange + geant4 version.
This branch isolates some of the recent changes (e.g. vecgeom normals, optical surface physics) so that it's just moving photons across a surface with no other physics involved.
Closes #2041
Test summary
4 953 files, 7 878 suites, 16m 20s :stopwatch:
2 034 tests: 2 005 :white_check_mark:, 24 :zzz:, 5 :x:
27 877 runs: 27 745 :white_check_mark:, 99 :zzz:, 33 :x:
For more details on these failures, see this check.
Results for commit 4f08d538.
:recycle: This comment has been updated with latest results.
Were the differences you were seeing in the docker build? The primary generator test results here look consistent, though for orange I think that test was only run with the single-precision build, which doesn't check the number of steps or step iterations.
I'm seeing build errors in some of the build-spack tests: https://github.com/celeritas-project/celeritas/actions/runs/18468836404/job/52618665145?pr=2038#step:17:599
I've been comparing against some of the build-docker test failures as well, which build with orange and don't have these failures. I can move the PR out of draft to run those as well.
I think that test is failing in the same way at least in the spack build tests (we "continue on error" for vecgeom-g4 with Geant4 versions less than 11.0, so unfortunately the tests with earlier Geant4 versions were failing as well even though it looks like they pass).
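For context, the "continue on error" behavior described above can be expressed in a GitHub Actions workflow roughly like this. This is a minimal sketch, not the actual Celeritas workflow: the job name, matrix keys, and version condition are all illustrative assumptions.

```yaml
jobs:
  build-spack:
    # Hypothetical matrix entry; the real workflow's names and versions may differ.
    strategy:
      matrix:
        include:
          - geometry: vecgeom
            geant4: "10.7"
    # Marking the job continue-on-error means the checks page shows a passing
    # result even when a test step inside the job actually failed -- which is
    # how failures with pre-11.0 Geant4 can look like passes.
    continue-on-error: ${{ matrix.geometry == 'vecgeom' && matrix.geant4 < '11.0' }}
    runs-on: ubuntu-latest
    steps:
      - run: ctest --output-on-failure
```

The trade-off is visibility: `continue-on-error` keeps known-flaky configurations from blocking merges, but as noted above it also hides genuine regressions in those configurations.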
Gotcha, thanks for the heads up, that's helpful to know!
Doing some more testing, I found that the optical models were getting overridden in the LArSphereBase test, and after some searching it looks like only the Rayleigh model is causing the discrepancy. I know the inputs for that model got refactored a bit, so I'll check that code next.