Monday, March 24, 2008

Python vs. Java vs. Python/Weave

In case you haven't noticed, I am a rather big proponent of using Python whenever possible. Between Python's flexibility and just the enjoyment I get out of writing in the language, I generally look to use Python before turning to a static language such as C/C++ or Java. The times I have had to turn to a compiled language are usually linked to when I need computational speed.

There are several ways to get Python moving faster - some uglier than others; code speedups in Python usually trade readability for performance, but that doesn't mean the implementation can't be straightforward. In the past, I have used the usual bag of Python tricks: list comprehensions, method/variable assignment outside loops, generators, and countless other pure-Python solutions.

Recently, I stumbled upon Weave, a module inside the popular SciPy library that allows C code to be mixed with Python code. Weave lets the coder stay inside the bounds of Python, yet use the power of C to compute the heavier algorithms. Not only does it provide a speed increase... it beats Java. Consider this (nonsensical) algorithm comparison between pure Python, Java, and Weave + Python:


Python Example:
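A minimal sketch of the kind of pure-Python loop being timed - a stand-in for the nonsensical benchmark algorithm, not the exact code:

```python
import time

def heavy_loop(n):
    # Nonsensical arithmetic loop, standing in for the benchmarked algorithm.
    total = 0.0
    for i in range(n):
        total += (i % 7) * 0.5
    return total

start = time.time()
result = heavy_loop(1000000)
elapsed = time.time() - start
```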

Time: 1.681 sec.


Java Example:

Time: 0.037 sec.


Python + Weave Example:
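A minimal sketch of how `weave.inline` mixes C into a Python function (again a stand-in for the benchmark, not the exact code; the import is guarded since weave shipped only with Python 2-era SciPy, so the sketch falls back to pure Python where it is absent):

```python
def heavy_loop_py(n):
    # Pure-Python fallback, identical logic to the C version below.
    total = 0.0
    for i in range(n):
        total += (i % 7) * 0.5
    return total

def heavy_loop_weave(n):
    try:
        from scipy import weave  # removed from modern SciPy releases
    except ImportError:
        return heavy_loop_py(n)
    # weave.inline compiles this C snippet once, then calls the cached
    # binary; 'n' is pulled from the calling frame's locals.
    code = """
    double total = 0.0;
    for (int i = 0; i < n; ++i)
        total += (i % 7) * 0.5;
    return_val = total;
    """
    return weave.inline(code, ['n'])
```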

Time: 0.017 sec.


Using identical algorithms in all three tests, Weave with Python was nearly 100 times faster (98.9x) than pure Python alone, and a shade over twice as fast (2.1x) as Java. Although the Python/Weave code itself is bigger than the pure Python and Java examples, the speed it provides is absolutely phenomenal - without ever leaving your .py file.

Hopefully, I'll be able to use the power of mixing C with Python for more elaborate purposes - but this test alone is enough to get me excited about future projects with Python.

Wednesday, March 12, 2008

DayBRIGHT: Gradient / Render Integration

I have now managed to get DayBRIGHT to render out a full animation using a gradient as the light source. While the animation scene itself is rather basic, with three spheres inside a box, it demonstrates some of the primary goals of the project.

Now that the gradient data structure integrates with the rendering process, it becomes a matter of writing translators for each type of media I'd like to use - as well as developing a more interesting scene to light.




Fig 1.1 + 1.2: Animation using displayed gradient as lighting source over time


Indirect surface shader code:

Monday, March 3, 2008

DayBRIGHT: Post-Processing

Changing gears a bit, I decided to begin writing some post-image processing functionality into DayBRIGHT to handle the final image output. The first post-process was a color noise filter that blends a 1:1 ratio of random colored pixels with each rendered image. This counters the fact that the illusion of computer graphics being reality is often blown by a discrepancy in noise artifacts across the image, most commonly noticed when graphics are composited with film or video footage. Even without footage, the human mind does not normally associate perfectly smooth color and shape with reality. Reality rarely has instances of complete visual fidelity, at least from an everyday human perspective.
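A minimal sketch of such a 1:1 noise blend (names like `blend_noise` are illustrative, not DayBRIGHT's actual code; pixels are modeled as a flat list of RGB tuples):

```python
import random

def blend_noise(pixels, amount=0.5, seed=None):
    # Blend each RGB pixel with a random color; amount=0.5 gives the
    # 1:1 ratio described above.
    rng = random.Random(seed)
    out = []
    for r, g, b in pixels:
        nr = rng.randint(0, 255)
        ng = rng.randint(0, 255)
        nb = rng.randint(0, 255)
        out.append((
            int(r * (1 - amount) + nr * amount),
            int(g * (1 - amount) + ng * amount),
            int(b * (1 - amount) + nb * amount),
        ))
    return out
```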


Fig 1.1: Simple color noise composite process

Another necessary post-process was gamma correction. The algorithm resulting from the equation is straightforward: a gamma-corrected set of values from 0-255 is created, then each RGB value in the given image is replaced with its gamma-corrected equivalent. Below is the simple math equation for gamma correction and the corresponding algorithm in (non-optimized) Python code:


Fig 1.2: Gamma correction equation
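A non-optimized sketch of the lookup-table approach described above (`gamma_lut` and `apply_gamma` are illustrative names, not the exact DayBRIGHT code), using the standard correction out = 255 * (v / 255) ^ (1 / gamma):

```python
def gamma_lut(gamma):
    # Precompute the gamma-corrected equivalent for every 0-255 value.
    return [int(round(255 * (v / 255.0) ** (1.0 / gamma))) for v in range(256)]

def apply_gamma(pixels, gamma=2.2):
    # Replace each RGB component with its gamma-corrected equivalent.
    lut = gamma_lut(gamma)
    return [(lut[r], lut[g], lut[b]) for r, g, b in pixels]
```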


Saturday, March 1, 2008

DayBRIGHT: Renderman Rendering

Below are some test renders coded through Renderman and woven into the DayBRIGHT graphics and data handling subsystems. Using the classic Cornell box, I was able to successfully simulate global illumination in combination with image based lighting. The photon calculation isn't as smooth (or as fast) as I'd like it to be - but optimization will be taken care of later once I can generate reliable image sequences.


Fig 1.1: Global illumination / color bleeding


Fig 1.2: Image-based lighting with accurate shadows. Shadows are no longer a gradient of grays, but are mixed with color from the casting objects.


Fig 1.3: Classic Cornell box setup with pronounced color bleed