[Feature] Hemispherical distant sensor plugin
Description
This PR adds a hemispherical distant sensor, which is, roughly speaking, the adjoint of the envmap emitter. It records radiance leaving the scene over a hemisphere. It can be useful to get a qualitative idea of the effective BRDF of a surface with complex geometry.
Code is mostly taken from the distant sensor plugin, and I don't know to what extent it would be desirable and/or feasible to move the common parts to a library header (in the end, only the constructor and the direction sampling code differ).
An example of RGB output (a simple rectangular surface with a roughconductor BSDF; the coloured dots each correspond to a directional emitter in the scene; note that the pixel index origin is taken at the bottom left of the image):
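For intuition, here is a minimal standalone sketch of one plausible film-to-hemisphere mapping (an assumption for illustration only; the plugin's actual warp function may differ): each film coordinate (u, v) in [0, 1]² is warped to an outgoing direction on the unit hemisphere.

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

// Map a film coordinate (u, v) in [0, 1]^2 to a direction on the unit
// hemisphere around +Z: u drives the elevation (z = cos(theta) uniform
// in [0, 1] yields a uniform density over the hemisphere), v drives the
// azimuth. Illustration only; the plugin's warp may differ.
Vec3 film_to_hemisphere(double u, double v) {
    const double pi  = 3.14159265358979323846;
    double z   = u;                                     // cos(theta)
    double r   = std::sqrt(std::fmax(0.0, 1.0 - z * z));
    double phi = 2.0 * pi * v;                          // azimuth
    return { r * std::cos(phi), r * std::sin(phi), z };
}
```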

To do
- [x] Improve docs with an example of output (also showing the up direction) and a schematic to explain the film-hemisphere mapping
- [x] Update docs with actual test scene
- [x] Add tests
Testing
A set of tests similar to those written for distant is included. We test for object construction, ray sampling and overall result correctness in a few scenarios.
Checklist
- [ ] My code follows the style guidelines of this project
- [ ] My changes generate no new warnings
- [ ] My code also compiles for `cuda_*` and `llvm_*` variants. If you can't test this, please leave a note below
- [ ] I have commented my code
- [ ] I have made corresponding changes to the documentation
- [ ] I have added tests that prove my fix is effective or that my feature works
- [ ] I cleaned the commit history and removed any "Merge" commits
- [ ] I give permission that the Mitsuba 2 project may redistribute my contributions under the terms of its license
Thanks for this PR @leroyvn,
Would it make sense for `hdistant` to have a `ref<DistantSensor> m_child` member that you could use to call the eval and pdf routines? Then all you would have to do is redefine the sampling methods.
On the other hand, this is not a lot of code, so I don't think it would be worth sharing some of it in a header file.
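For concreteness, here is a generic sketch of the delegation pattern suggested above (stand-in types and signatures; the actual Mitsuba plugin API differs):

```cpp
#include <memory>

// Stand-in types for illustration only.
struct Interaction {};
struct DirectionSample {};

class DistantSensor {
public:
    // Density/evaluation routines that both plugins could share.
    double pdf_direction(const Interaction &, const DirectionSample &) const {
        return 1.0; // placeholder
    }
};

class HDistantSensor {
public:
    // Forward the shared queries to the child distant sensor...
    double pdf_direction(const Interaction &it, const DirectionSample &ds) const {
        return m_child->pdf_direction(it, ds);
    }
    // ...and redefine only the sampling logic (hemispherical here).
    DirectionSample sample_direction() const {
        return DirectionSample{}; // hemisphere-specific sampling goes here
    }
private:
    // The proposed child member holding the shared routines.
    std::shared_ptr<DistantSensor> m_child = std::make_shared<DistantSensor>();
};
```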
> Would it make sense for `hdistant` to have a `ref<DistantSensor> m_child` member that you could use to call the eval and pdf routines? Then all you would have to do is redefine the sampling methods.
I hadn't thought of that: actually, I think these two methods will differ from the distant implementation, so it probably makes more sense to keep them separate. Side note: shall I implement these as well? They make a lot of sense in the context of using these distant sensors with a light tracing integrator.
Okay then, let's keep those separate 👍
It would be great to have them, yes. Otherwise we will need to revisit those plugins once again when the light tracer has landed.
I looked into it, and it seems that the `DirectionSample3f` struct is currently only fit for emitter sampling. I guess the `emitter` member should become `endpoint`, but this would be beyond the scope of this PR (and overlap with the light tracer work). However, leaving the `emitter` field unset doesn't seem very clean to me...
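To make the concern concrete, a heavily simplified sketch (the real structs carry many more fields; in Mitsuba, Endpoint is the common base class of Emitter and Sensor):

```cpp
// Simplified stand-ins: Endpoint is the common base of emitters and sensors.
struct Endpoint {};
struct Emitter : Endpoint {};

// Today: the sampling record is hard-wired to emitters...
struct DirectionSampleToday {
    const Emitter *emitter = nullptr;   // left unset when a sensor is sampled
};

// ...whereas a light tracer would want it to reference either endpoint type.
struct DirectionSampleGeneralized {
    const Endpoint *endpoint = nullptr; // emitter *or* sensor
};
```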
Makes sense. @merlinND is this something you encountered in your work on the light tracer?
Hi @Speierers, I just pushed an update with tests and docs. It now requires a commit on the data submodule for which I'll submit a PR. The documentation notably explains how to properly orient the film (see below, please tell me if it's not clear).

Hi @leroyvn,
Great illustration for the doc! I am just a little confused by the target, origin, up labels. IMO it is hard to tell what they correspond to on the illustration. Also, on the bottom row, does the orange arrow represent the up vector?
> I am just a little confused by the `target`, `origin`, `up` labels. IMO it is hard to tell what they correspond to on the illustration.
I'll make a new proposal, hopefully clearer.
> Also on the bottom row, does the orange arrow represent the up vector?
Yes, and the direction pointed to by `target - origin` is the "bullseye" at the centre (I used conventions close to those used in technical/industrial drawing).
Maybe something like this?

So this is what we have now, with some text explaining what the orange markers on the plots are about.

Hi there! I'll sneak into this discussion 😋 I like the illustrations, but I am a bit confused! Is the hemisphere located above the blue plane, in the target direction? Is it recording light that was emitted and then bounced off the blue glossy plane? Or is it radiance that left the emitter directly? And where do the light sources actually point, or are they point light sources?
Hi @tomasiser, it's indeed clearer with the full docs. The sample scene has three directional emitters which illuminate a rectangular patch with a diffuse, non-absorbing BRDF. The sensor records exitant radiance (in that case reflected by the surface, but it can be something more complex) over an (infinitely distant) hemisphere pointed to by the yellow vector on the schematic. So in this example, it is "above" the blue plane. This vector is set by the `target` and `origin` parameters of a look-at transform passed to the sensor's `to_world` parameter, and it is independent of the reflecting shape.
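As a standalone illustration of that last point (plain C++, not Mitsuba's actual API): the hemisphere's pole is just the normalized `target - origin` vector; the `up` vector only fixes the film's in-plane rotation.

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

Vec3 operator-(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }

Vec3 normalize(Vec3 v) {
    double n = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / n, v.y / n, v.z / n };
}

// The direction the hemisphere points along is fully determined by the
// look-at parameters and is independent of the scene geometry.
Vec3 hemisphere_pole(Vec3 origin, Vec3 target) {
    return normalize(target - origin); // the "bullseye" at the film centre
}
```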
`hdistant` is an extension of the distant sensor, which is itself the adjoint of the directional emitter. You can see `hdistant` as the adjoint of the envmap emitter (although they do not cover exactly the same angular region).
In practice, hdistant is useful to get a visual representation of the reflectance pattern on a complex surface. You must illuminate the scene appropriately, i.e. with a directional emitter, in order to retrieve meaningful reflectance values (see the reference paper by Nicodemus (1973) for a thorough introduction to surface reflectance measurement principles). Its weaknesses are:
- no importance sampling;
- no control over the positioning of sampled directions (the pixel grid won't align with a particular direction and the film will be sampled continuously).
Parametrising distant sensors is a good way to address these limitations.
@leroyvn I like very much the new illustration :)
Hi Vincent,
I just took another look at this PR and thought a bit more about the `shape` target feature common to this sensor and `distant.cpp`.
A few thoughts:
- Having this shape feature seems reasonable, and I can understand why it is needed. However, the documentation could be improved: you could explain some of the subtleties and mention that it only makes sense to use truly flat surfaces here.
- Expanding into a specialized template-based implementation of the emitter based on a property is warranted in some very rare use cases (like having a bitmap that could be monochromatic or spectral, i.e. the underlying representation changes completely). It makes the code quite a bit more complex than is warranted here, and I don't think you are really gaining any performance improvements. Could you convert this into a single class that uses `if` instead of `if constexpr` (on a template parameter) in the sampling routine? (See the sketch after this list.)
- I don't think I fully understand why `sample_ray_differential` is so complicated (using another method `sample_ray_dir_origin` that may sample the shapes multiple times altogether). Can't we use the single ray origin position for `o_x` and `o_y` and just shift the direction?
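A generic before/after sketch of the requested simplification (stand-in code, not the actual plugin source):

```cpp
// Before: behaviour selected at compile time via a template parameter,
// which instantiates two nearly identical classes.
template <bool HasTargetShape>
class DistantSensorTpl {
public:
    void sample() {
        if constexpr (HasTargetShape) {
            // sample a point on the target shape
        } else {
            // target a fixed point / the whole scene
        }
    }
};

// After: a single class; the runtime branch is predictable and cheap,
// and the code is considerably simpler.
class DistantSensorSimple {
public:
    void sample() {
        if (m_has_target_shape) {
            // sample a point on the target shape
        } else {
            // target a fixed point / the whole scene
        }
    }
private:
    bool m_has_target_shape = false;
};
```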
Thanks, Wenzel
Hi @wjakob, sorry it took me a while to get back to this. I tried to address your comments:
- I rolled back to a template-free implementation;
- I added a warning to the docstring so as to make it clear to users that they should basically use rectangles and disks to control ray targets, although other fancy flat surfaces would work;
- I rewrote the `sample_ray_differential()` method: shape sampling is now done only once (see the sketch below).
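For reference, a minimal sketch of the single-origin differential scheme (hypothetical standalone code; the plugin's actual implementation may differ in detail):

```cpp
struct Vec3 { double x, y, z; };

struct RayDifferential {
    Vec3 o,   d;    // primary ray: origin and direction
    Vec3 o_x, d_x;  // offset ray for a one-pixel shift in x
    Vec3 o_y, d_y;  // offset ray for a one-pixel shift in y
};

// The shape (ray origin) is sampled a single time; the offset rays reuse
// that same origin and only their directions are shifted.
RayDifferential make_ray_differential(Vec3 o, Vec3 d, Vec3 d_dx, Vec3 d_dy) {
    RayDifferential r;
    r.o = r.o_x = r.o_y = o;  // one sampled origin shared by all three rays
    r.d   = d;
    r.d_x = d_dx;
    r.d_y = d_dy;
    return r;
}
```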
If this is fine, I'll also propagate the documentation update to distant (alongside some cleanup and updates we discussed last time we talked).