[❔ other question] Sampling irradiancemeter correctly #445

Closed · MaximilianBader opened this issue Jun 1, 2021 · 6 comments

@MaximilianBader

Summary

The number of rays hitting the scene geometry when using an irradiancemeter attached to a rectangular shape is too low. How can I ensure that rays actually hit?

Description

I have a very simple example scene and use a path integrator to render a point emitter, a small scattering disc and a sensor of very small size. As I'm interested in the irradiance, I chose an irradiancemeter as the sensor type. I used the depth_integrator.py example to write my own Python code and count the number of hitting rays (see code below). Unfortunately, due to the setup, the number of hitting rays is 0 even though I massively increased the number of samples per pixel.

For my intended application of Mitsuba, I'm interested in using a sensor with no preset sensitivity pattern: I want to determine the overall irradiance on the sensor area. Do you have any suggestions for how I might render this scene in such a way that a sufficient number of rays hit? Should I change the type of sensor?

Code

ray_tracing_example.py

import os
import enoki as ek
import mitsuba

# Set the desired mitsuba variant
mitsuba.set_variant('scalar_mono')

from mitsuba.core import Float, UInt32, UInt64, Vector2f, Vector3f
from mitsuba.core import Bitmap, Struct, Thread
from mitsuba.core.xml import load_file
from mitsuba.render import ImageBlock

# Relative path to the scene XML file, appended to the FileResolver's search path & loaded into mitsuba
filename = r'mitsuba_scenes/example_us_scene_3.xml'
Thread.thread().file_resolver().append(os.path.dirname(filename))
scene = load_file(filename)

# Fetch the scene's path integrator, sensor, film and sampler
integrator = scene.integrator()
sensors = scene.sensors()
film = sensors[0].film()
sampler = sensors[0].sampler()
film_size = film.crop_size()
spp = 100000000000

# Compute the total number of samples (the sampler is seeded below)
total_sample_count = ek.hprod(film_size) * spp

print("total_sample_count: " + str(total_sample_count))
print("sampler.wavefront_size(): " + str(sampler.wavefront_size()))

diff_scale_factor = ek.rsqrt(sampler.sample_count())

if sampler.wavefront_size() != total_sample_count:
    sampler.seed(0, total_sample_count)

# The film is 1x1, so each sample position is just the sampler's 2D jitter,
# drawn inside the loop below
scale = Vector2f(1.0 / film_size[0], 1.0 / film_size[1])

num_hitting_rays = 0

for _ in range(total_sample_count):
    pos = sampler.next_2d()

    # Sample single ray starting from the camera sensor
    ray, weight = sensors[0].sample_ray_differential(
        time=0,
        sample1=sampler.next_1d(),
        sample2=pos * scale,
        sample3=sampler.next_2d()
    )

    ray.scale_differential(diff_scale_factor)

    (spectrum, active, aovs) = integrator.sample(scene, sampler, ray, active=True)

    if active:
        num_hitting_rays += 1

    sampler.advance()

print("num_hitting_rays: " + str(num_hitting_rays))

example_us_scene_3.xml

<scene version="2.0.0">
	<!-- Define basic path tracer modeling a maximum of 3 scattering events -->
    <integrator type="path">
        <integer name="max_depth" value="3"/>
    </integrator>

    <!-- Define emitter-->
    <emitter type="point">
		<spectrum name="intensity" value="1"/>
		<point name="position" x="-0.01" y="0" z="-0.04"/>
	</emitter>
    <!--emitter type="point">
		<spectrum name="intensity" value="1"/>
		<point name="position" x="0.01" y="0" z="-0.04"/>
	</emitter -->
	

	<!-- Define a sensor at the same position as the emitter to detect the incoming irradiance -->
	<shape type="rectangle">
		<!-- Transform sensor type to correct direction -->
		<transform name="to_world">
			<scale x="0.00001" y="0.00001" />
			<translate x="0" y="0" z="-0.04" />
			<!-- rotate y="1" angle="-180" /-->
		</transform>
		
		<sensor type="irradiancemeter">
			<!-- Write to a portable float map containing all luminance values -->
			<film type="hdrfilm">
				<integer name="width" value="1"/>
				<integer name="height" value="1"/>
				<string name="pixel_format" value="luminance"/>
				<string name="file_format" value="pfm" />
			</film>
			<!-- Define the simplest sampler for ray tracing -->
			<sampler type="independent">
				<integer name="sample_count" value="100000000000"/>
			</sampler>
		</sensor>
	</shape>

	<!-- Add a simple scattering disk at the origin, flipped towards emitter and detector -->
	<shape type="disk">
		<transform name="to_world">
			<scale x="0.001" y="0.001" />
			<translate x="0" y="0" z="0" />
		</transform>
        <boolean name="flip_normals" value="true"/>
		<bsdf type="roughdielectric">
			<float name="int_ior" value="1.5"/>
			<float name="ext_ior" value="1"/>
		</bsdf>
	</shape>
</scene>
@Speierers (Member)

Hi @MaximilianBader,

This seems to be a pretty simple setup. Could you take a look at the origins/directions of the generated rays and check that they make sense and will hit the scene's geometry (here a disk)?
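
(For reference, a minimal sketch of such a check, reusing the scene, sensors and sampler handles from the script above; scene.ray_intersect and si.is_valid() are Mitsuba 2's standard intersection calls, and the loop count of 10 is arbitrary.)

for _ in range(10):  # inspect a handful of test rays
    ray, weight = sensors[0].sample_ray_differential(
        time=0,
        sample1=sampler.next_1d(),
        sample2=sampler.next_2d(),
        sample3=sampler.next_2d()
    )
    # Intersect the sampled ray against the scene geometry
    si = scene.ray_intersect(ray)
    print("origin:", ray.o, "direction:", ray.d, "hit:", si.is_valid())
    sampler.advance()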

@MaximilianBader (Author)

Hi @Speierers,

Sorry for not mentioning this. I checked the directions before, and apparently the rays never hit the disk. As I tried to describe, I would like to avoid any sensitivity pattern on the sensor and thus selected the irradiancemeter on a very small rectangular shape. However, as the sampled ray directions do not hit the disk in this scenario, I see no option other than selecting a sensor with a directivity/sensitivity pattern instead. Do you see any alternative?

@Speierers (Member)

Hi @MaximilianBader,

I am not really sure what you mean by "pattern" in this experiment.

As I understand it, you have a point light emitter and a roughdielectric disk placed between the emitter and the sensor, and you are interested in the overall irradiance hitting the sensor from all directions. If you are instead interested in the radiance arriving at the sensor from a specific direction, you might consider using a different sensor, e.g. the radiancemeter.

@MaximilianBader (Author)

Hi @Speierers,

Sorry for not clarifying this.

When I'm talking about a directivity pattern, I'm thinking about the probability of a ray being detected from a certain direction. To the best of my knowledge, a perspective pinhole camera only detects rays from certain directions, correct?

So when I'm using a perspective sensor instead of an irradiancemeter, I assume that rays are sampled only in directions which are physically reasonable. While the irradiancemeter samples rays uniformly over all spatial directions, the perspective sensor samples only physically reasonable directions. As a consequence, if I were to replace the irradiancemeter with a perspective sensor, I could expect an increase in hitting rays for the same scene. Is this thinking correct or am I mixing something up?

@Speierers (Member)

Hence, if I understand correctly, "avoiding any sensitivity pattern" means that you would like a sensor that samples rays in any direction in the hemisphere above its main direction? That sounds like the irradiancemeter to me.

With the perspective sensor, you can specify the fov, which you could set to 90, although it might break; I have never tried.
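
(For illustration, a sketch of how such a perspective sensor could be loaded from Python; the lookat placement mirrors the rectangle's position in the scene above, and the 90-degree fov is the untested setting mentioned here.)

from mitsuba.core.xml import load_string

# Sketch of a perspective sensor with fov=90, placed where the rectangle
# sits in the scene above and aimed at the disk (untested, as noted)
sensor = load_string("""
    <sensor version="2.0.0" type="perspective">
        <float name="fov" value="90"/>
        <transform name="to_world">
            <lookat origin="0, 0, -0.04" target="0, 0, 0" up="0, 1, 0"/>
        </transform>
        <film type="hdrfilm">
            <integer name="width" value="1"/>
            <integer name="height" value="1"/>
        </film>
    </sensor>
""")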

With both sensors, it makes sense that a large portion of the rays will be "lost" in the void. You would need to implement a specific importance sampling procedure in the sensor to mitigate this issue.
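
(As a rough sanity check of how many rays get lost: a back-of-envelope estimate, assuming the irradiancemeter samples the hemisphere cosine-weighted and the disk sits on the sensor normal, with the numbers taken from the scene XML above.)

import math

disk_radius = 0.001  # disk scale in the scene XML
distance = 0.04      # sensor at z = -0.04, disk at z = 0

# Small-angle approximation of the solid angle subtended by the disk
solid_angle = math.pi * disk_radius ** 2 / distance ** 2

# A cosine-weighted hemisphere pdf is cos(theta)/pi; with cos(theta) ~ 1,
# the probability of a sampled ray landing inside that solid angle is:
hit_probability = solid_angle / math.pi
print(f"~{hit_probability:.2e}")  # ~6.25e-04, i.e. roughly 1 hit per 1600 rays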

You could also take a look at PR #143 which might implement the type of sensor you are looking for.

I will close this as it isn't an issue, but I'm happy to still answer further questions.

@MaximilianBader (Author)

Hi @Speierers,

Thanks for the clarification. Indeed, I was looking for a sensor that samples rays in any direction in the hemisphere, and thus chose an irradiancemeter. But I agree that it is reasonable that a large portion of the rays will be lost, and I consequently have to implement some form of importance sampling procedure.
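
(For illustration, a hypothetical sketch of such a procedure, independent of the Mitsuba API: sample a point uniformly on the disk, aim the ray at it, and record the corresponding solid-angle pdf so the estimator can be reweighted. All names below are made up for this sketch.)

import math
import random

def sample_ray_towards_disk(sensor_pos, disk_center, disk_radius):
    # Uniformly sample a point on the disk via polar coordinates
    r = disk_radius * math.sqrt(random.random())
    phi = 2.0 * math.pi * random.random()
    target = (disk_center[0] + r * math.cos(phi),
              disk_center[1] + r * math.sin(phi),
              disk_center[2])
    # Normalized direction from the sensor to the sampled point
    d = tuple(t - s for t, s in zip(target, sensor_pos))
    dist = math.sqrt(sum(c * c for c in d))
    d = tuple(c / dist for c in d)
    # Convert the area pdf (1/area) to a solid-angle pdf:
    # pdf = dist^2 / (area * |cos(theta_disk)|), where theta_disk is the
    # angle between the ray and the disk normal (the z-axis here)
    area = math.pi * disk_radius ** 2
    pdf = dist ** 2 / (area * abs(d[2]))
    return d, pdf

Each sample drawn this way would then be weighted by cos(theta_sensor) / pdf rather than by the usual cosine-hemisphere weight.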

Thanks for referencing PR #143 which might be very helpful for my applications!

I agree that this was not really an issue, but rather a question up for discussion. I think at this point you have given me all the advice I need to think about possible implementations of the importance sampling procedure. Thanks a bunch!
