Sunday, March 10, 2013

First Steps Towards Bidirectional Path Tracing

I began my implementation of bidirectional path tracing by making a streamlined version of my existing renderer, Photorealizer. I removed little-used features, removed the core recursive path tracing code, and cleaned up and optimized parts of the remaining code. Then I added the following features:

• Light and sensor geometry in the scene. Lights and sensors (i.e., the camera lens) are now associated with objects in the scene, just like other geometry. They can be stored in the same spatial data structures and intersected using the same methods as regular geometry. This is important for implementing bidirectional path tracing, and it is conceptually more similar to Veach's formal, mathematically rigorous definition of a scene.
• Unbiased image sampling and filtering. Now I can sample the image plane in any way I want (e.g., stratified sampling) regardless of the arrangement of pixels, and reconstruction filters can be any size and shape. Previously, I was using the common technique of reconstructing samples into pixels using a weighted average based on the filter weights at the samples (as described on page 308 in Veach's thesis, and in the book Physically Based Rendering).
• More robust BSDF system. Each BSDF now exactly follows the conventions of BSDFs as described by Veach (and others). Reflection and transmission are now handled in a unified way, so any BSDF can contain both reflection and transmission. Each BSDF returns the value of its PDF when sampled. And I no longer have to flip the surface normal to be on the same side as the incident direction.
• Improved scene traversal and intersection system. I made a more robust system for preventing self-intersections, along with some performance improvements and code structure changes.
• Different materials for different instances. Each instance of a model in the scene can have its own material. More generally, I gave containers in my scene hierarchy the ability to override the materials of the objects within them.
• Flexible texture mapping system. Any material property can now be texture-mapped easily. These properties can be set to either a 2D- or 3D-varying texture or a uniform color, all of which are descendants of the same abstract texture class.
• Custom linear algebra library. I incorporated a new templatized linear algebra library that I wrote from scratch. The code is much cleaner than what I was using before, because there is only one vector class template and one matrix class template, and most of the operations are written as loops over the components (which I could potentially parallelize in the future).
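The texture system described above can be illustrated with a minimal sketch. All names here (`Texture`, `evaluate`, `UniformColor`, `Checkerboard2D`) are hypothetical stand-ins, since the post doesn't show the actual interface; the point is that uniform colors and varying textures share one abstract base class, so any material property can hold either:

```cpp
#include <cmath>

// A simple RGB color (hypothetical; the real renderer's color type
// is not shown in the post).
struct Color {
    float r, g, b;
};

// Abstract base class: any material property can be looked up at a
// surface point. Taking both 2D (u, v) and 3D (x, y, z) coordinates
// lets 2D- and 3D-varying textures share one interface.
class Texture {
public:
    virtual ~Texture() = default;
    virtual Color evaluate(float u, float v,
                           float x, float y, float z) const = 0;
};

// A uniform color is just a texture that ignores the coordinates.
class UniformColor : public Texture {
public:
    explicit UniformColor(Color c) : color(c) {}
    Color evaluate(float, float, float, float, float) const override {
        return color;
    }
private:
    Color color;
};

// A 2D checkerboard as an example of a spatially varying texture.
class Checkerboard2D : public Texture {
public:
    Checkerboard2D(Color a, Color b) : a(a), b(b) {}
    Color evaluate(float u, float v, float, float, float) const override {
        int iu = static_cast<int>(std::floor(u));
        int iv = static_cast<int>(std::floor(v));
        return ((iu + iv) % 2 == 0) ? a : b;
    }
private:
    Color a, b;
};
```

A material property would then store a pointer to the abstract `Texture`, so swapping a constant albedo for a checkerboard (or a 3D noise texture) requires no change to the material code itself.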
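The idea behind the single vector class template can be sketched roughly as follows. This is an assumption about the design, not the library's actual code: one template parameterized on component type and dimension, with operations written as loops over the components:

```cpp
#include <cstddef>

// Minimal sketch of a loop-based vector class template (hypothetical
// names and layout; the post only describes the design).
template <typename T, std::size_t N>
struct Vector {
    T data[N];

    T& operator[](std::size_t i) { return data[i]; }
    const T& operator[](std::size_t i) const { return data[i]; }

    // Operations are simple loops over the components, which the
    // compiler can unroll, and which could later be parallelized.
    Vector operator+(const Vector& rhs) const {
        Vector result;
        for (std::size_t i = 0; i < N; ++i)
            result.data[i] = data[i] + rhs.data[i];
        return result;
    }

    Vector operator*(T s) const {
        Vector result;
        for (std::size_t i = 0; i < N; ++i)
            result.data[i] = data[i] * s;
        return result;
    }

    T dot(const Vector& rhs) const {
        T sum = T(0);
        for (std::size_t i = 0; i < N; ++i)
            sum += data[i] * rhs.data[i];
        return sum;
    }
};

// All the usual fixed-size types fall out of the one template.
using Vec3f = Vector<float, 3>;
using Vec2f = Vector<float, 2>;
```

Because every size and component type instantiates the same template, there is only one implementation of each operation to maintain, which is what keeps the code so much cleaner than separate hand-written 2D/3D/4D classes.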

An early test render showing the aperture geometry (the black hexagon).
