Simulating three-dimensional scenes enables us to learn more about the impact of depth on image encoding. Further, it provides a way to create stereo image pairs for understanding binocular vision.

The sceneEye model uses the iset3d methods to trace rays through a model of the human optics, extending the basic scene and oi methods. The sceneEye class contains the parameters necessary for iset3d to render a 3D scene spectral radiance through a lens model of the human eye and produce a retinal spectral irradiance (optical image). That spectral irradiance image is in a format suitable for computing the cone mosaic absorptions.
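For orientation, here is a minimal sketch of that workflow. It assumes ISETBIO and iset3d are on the MATLAB path and the PBRT Docker container is configured; the scene name and parameter values are illustrative.

```matlab
% Minimal sceneEye sketch (scene name and parameter values are illustrative)
thisEye = sceneEye('numbersAtDepth');   % create a sceneEye from a packaged scene
thisEye.fov = 30;                       % field of view (deg)

oi = thisEye.render;                    % trace rays through the eye model
oiWindow(oi);                           % view the retinal spectral irradiance

% The optical image is in the format expected by the cone mosaic methods
cMosaic = coneMosaic;
cMosaic.compute(oi);
```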

A full description of these tools can be found in our paper: Ray tracing 3D spectral scenes through human optics models.

3D Scenes

The ISETBIO repository includes a small number of scenes that are useful for testing the sceneEye object. A couple of these scenes require the RemoteDataToolbox to be installed. Several scene parameters, such as the size and distance of the planes, the lighting, or the textures, can be changed by calling the scenes with different inputs.
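As a hedged illustration, these inputs are passed as key/value pairs when the scene is created; the scene name and parameter key below are examples rather than a complete interface.

```matlab
% Illustrative only: 'slantedBar' and 'planeDistance' are examples of the
% key/value inputs a packaged scene may accept.
thisEye = sceneEye('slantedBar', 'planeDistance', 0.5);  % plane 0.5 m away
```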

To see a full list of the currently available scenes as well as examples for parameters that can be modified, see t_renderAllScenes.

PBRT-V3-Spectral Docker

The critical rendering and optics calculations are within the source code pbrt-v3-spectral, our modification of PBRT. This code is compiled in a Docker container and contains the ray-tracing code that reads in the parameters of a human eye model and traces rays through it. Rays are refracted as they travel through each surface and medium of the eye model, such as the Navarro model of the human eye. The Docker commands that build the container for pbrt-v3-spectral are here.
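iset3d normally manages the container automatically, but you can fetch it by hand; a hedged example follows, assuming the image is published under the vistalab organization on Docker Hub.

```matlab
% Hedged example: pull the pbrt-v3-spectral container manually. iset3d
% usually does this on the first render; the image name assumes the
% vistalab Docker Hub repository.
status = system('docker pull vistalab/pbrt-v3-spectral');
if status ~= 0
    error('Docker pull failed; check that Docker is installed and running.');
end
```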

The implementation for the Navarro eye required both spherical and biconic surfaces.
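For reference, a biconic surface is commonly parameterized by its sag; the standard form is shown below (radii R_x, R_y and conic constants k_x, k_y), stated in general terms rather than as the exact PBRT implementation. A spherical surface is the special case R_x = R_y with k_x = k_y = 0.

```latex
% Standard biconic sag (general convention, not the exact PBRT code)
z(x,y) = \frac{x^2/R_x + y^2/R_y}
              {1 + \sqrt{1 - (1+k_x)\,x^2/R_x^2 - (1+k_y)\,y^2/R_y^2}}
```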

Rendering speed

Physically accurate ray-tracing speed depends on:

  1. The number and speed of the CPU cores in your machine
  2. The size (resolution) of the rendered image
  3. The render-quality parameters you set (numRays, numBounces)

Rendering is CPU-bound; it is not a GPU-limited system.

We recommend rendering small images with a low number of rays before rendering your final image. Rendering the examples shown here takes less than a minute; a high-quality image can take hours.
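A hedged sketch of a quick preview render follows; the property names (resolution, numRays) follow the sceneEye tutorials but may differ across ISETBIO versions.

```matlab
% Low-quality preview render: small image, few rays per pixel
thisEye = sceneEye('numbersAtDepth');
thisEye.resolution = 128;   % rendered image is 128 x 128 pixels
thisEye.numRays    = 64;    % rays per pixel; noisy but fast
oiPreview = thisEye.render;
oiWindow(oiPreview);
```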

We are always looking for ways to speed up rendering, but our primary focus is simulating realistic retinal images. As a result, we are not pursuing rendering engines that produce a nice-looking picture that is not physically accurate. Instead, we are working on tools that render many images in parallel on Google Cloud. This speeds up the rendering of short video sequences, stereo image pairs, or images from camera arrays.

Rendering with Google Cloud

To speed up rendering, we have set up our tools to work with Google Cloud. The toolbox we have written to handle this interface is called isetcloud. When rendering on the cloud, we set up the sceneEye object exactly as we do for local rendering. However, instead of calling the render() command, we call the function sendToCloud with the sceneEye object and a gcloud object from isetcloud. We then call the render command of the gcloud object. Once the rendering has finished, we call downloadFromCloud to pull the rendered images from the cloud. This process requires more steps than the normal rendering procedure; you can find example scripts to refer to here (xxx).
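A hedged sketch of that sequence follows. It assumes isetcloud is installed and a Google Cloud project is configured; the constructor arguments and return values are illustrative rather than exact signatures.

```matlab
% Hedged sketch of the cloud rendering workflow (arguments and return
% values are illustrative, not exact isetcloud signatures)
gcp = gCloud('configuration', 'gcloud-pbrt');  % isetcloud object; config name assumed

thisEye = sceneEye('numbersAtDepth');          % set up exactly as for local rendering
sendToCloud(gcp, thisEye);                     % upload the scene and queue the job
gcp.render;                                    % launch rendering on Google Cloud

% ... wait for the cloud jobs to finish ...
oiCloud = downloadFromCloud(gcp);              % pull the rendered optical image(s)
```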

History

Starting with the RenderToolbox project, the Brainard Lab began using computer graphics to create controlled yet realistic input for both psychophysical experiments and simulation. The Wandell Lab joined this effort, focusing on integrating Physically Based Rendering (PBRT) into the toolbox so that quantitative three-dimensional scenes could serve as input to what is now ISETCAM.

The RenderToolbox4 toolbox has additional capabilities that support 3D model generation; the pbrt2ISET toolbox is a restricted, and therefore simpler, effort in the same direction.
