Wednesday, November 28, 2007

Shiver Development: Displacement

As I continue to polish the overall look of the globe, I decided to explore how to display the topography of the earth, which will become more important once zooming in and out is incorporated. Another plus of having a working shader pipeline is the ability to write vertex shaders that alter the geometry they are attached to - a technique better known as displacement.


Fig 1.1: Altitude map provided by the NASA Blue Marble project

Having an altitude map which precisely matches the texture maps I am already using is quite convenient - there was no need to alter the image or the coordinate mappings in my shaders. The concept of displacing geometry per vertex is a rather simple one:


P1 = P0 + (N * df * uf)

Fig 1.2: Formula for displacement per vertex, where:

P0 = original vertex position
P1 = new vertex position
N = vertex normal
df = normalized displacement factor
uf = user-defined scaling value


Fig 1.3: Diagram of per vertex displacement (Image credit: oZone3d.net)



Fig 1.4: No displacement / displacement comparison in Shiver (wireframe)
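
In GLSL, the formula above translates almost directly into a vertex shader. Below is a minimal sketch of the idea rather than the actual Shiver shader - the uniform names (heightMap, uf) are my own, and it assumes hardware that supports texture lookups in the vertex stage:

# Hypothetical shader source implementing P1 = P0 + (N * df * uf); the
# altitude map from Fig 1.1 supplies df via a per-vertex texture lookup.
DISPLACE_VS = """
uniform sampler2D heightMap; // grayscale altitude map (same UVs as the color texture)
uniform float uf;            // user-defined scaling value

void main() {
    gl_TexCoord[0] = gl_MultiTexCoord0;
    float df = texture2D(heightMap, gl_MultiTexCoord0.st).r; // normalized displacement factor
    vec3 p1  = gl_Vertex.xyz + gl_Normal * df * uf;          // displaced position
    gl_Position = gl_ModelViewProjectionMatrix * vec4(p1, 1.0);
}
"""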

My next goal for Shiver is to implement a sunlight calculation system which will auto-generate a texture map accurately representing how the sun is lighting the earth based on the time of day. Additionally, I plan on (finally) starting the preliminary visualization of the actual seismic events being geomapped.

Friday, November 23, 2007

Woven

I developed a small piece of software for a group presentation in my Applications class, which visualizes collaboration among 1st-year students at ITP. During the Q+A someone asked if the software was open-source and, if so, where the source could be downloaded. The answers were "yes" and "soon, on my blog". Unfortunately, I don't really have the time right now to package the software up nicely for download - so I am just going to post the raw source code, with the hope that I can package it properly later on...



Source:
ProcessingMain.java
StudentDataParser.java
Student.java
Button.java
student_data.txt

Shiver Development: Event Mapping

A couple of new updates; one being that I have a new fog shader working on the globe, which adds to the effect of the atmospheric layering. The ability to program shaders and apply them to geometry is really what separates imagery that is noticeably fake from imagery that is nearing photorealism. Real-time photorealism is the next big step in computer graphics - although you could make a very valid claim that photorealism hasn't been reached in pre-rendered form either.


Fig 1.1: Comparison between shader-enhanced and non-shaded geometry

The next update in progress is a rather large one: I can now plot events on the earth sphere given only an event's latitude and longitude values. This was a larger challenge than originally anticipated, as I hadn't taken polar coordinates into consideration with regard to how OpenGL maps textures onto a sphere. I also ran into an issue pushing and popping the transformation matrix - but that turned out to be an instance of certain draw calls not happening between the correct combination of push/pop commands.
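
For reference, converting a latitude/longitude pair to a point on a sphere reduces to the standard spherical-to-Cartesian formulas. The sketch below is my own minimal version rather than the Shiver code; the axis order and any longitude offset depend on how the sphere's texture is oriented (a gluSphere, for instance, has its poles on the z-axis):

import math

def latlon_to_xyz(lat_deg, lon_deg, radius=1.0):
    # Convert degrees to radians, then project onto the sphere surface.
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    x = radius * math.cos(lat) * math.cos(lon)
    y = radius * math.cos(lat) * math.sin(lon)
    z = radius * math.sin(lat)
    return x, y, z

# Each event marker can then be drawn inside its own push/pop pair so its
# translation never leaks into the rest of the scene:
#   glPushMatrix()
#   glTranslatef(*latlon_to_xyz(event_lat, event_lon))
#   draw_marker()
#   glPopMatrix()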


Fig 1.2: Seismic events mapped on globe


Sunday, November 11, 2007

Shiver Development: Shaders in PyOpenGL

After a turbulent weekend trying to get shaders working with PyOpenGL, I finally got my first vertex and fragment files compiling and working inside Shiver. Without any documentation to bail me out of my usual jams with new code, I hacked away through the weekend until I stumbled upon a way of changing a function's expected C argument types, then converting my variables into C-compliant data to be fed into the altered function. In English: I needed to convert Python data so OpenGL could read and process it correctly.
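
For anyone attempting the same thing, the sketch below shows one way to compile and link a vertex/fragment pair in PyOpenGL. It uses the OpenGL.GL.shaders convenience wrappers (which handle the C-type conversion internally) rather than my hand-rolled workaround, so treat it as an approximation of the approach, not the exact Shiver code:

from OpenGL.GL import GL_VERTEX_SHADER, GL_FRAGMENT_SHADER, glUseProgram
from OpenGL.GL.shaders import compileShader, compileProgram

def load_program(vs_path, fs_path):
    # Read the GLSL source files and hand them to the driver for compilation;
    # compileShader raises an error containing the GLSL log if either fails.
    vs = compileShader(open(vs_path).read(), GL_VERTEX_SHADER)
    fs = compileShader(open(fs_path).read(), GL_FRAGMENT_SHADER)
    return compileProgram(vs, fs)

# Usage, once a GL context exists (the file names here are placeholders):
#   program = load_program("atmosphere.vert", "atmosphere.frag")
#   glUseProgram(program)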


Fig 1.1: Globe with atmospheric shader applied

So - in the lab I am often asked: "why are shaders important?" Shaders can vary a surface's appearance in regard to color, opacity, reflection, refraction, etc. - and can even physically alter the geometry itself. In relation to what I am attempting to achieve, it is vital to be able to replicate the light and physical phenomena that exist when viewing the earth from space.

My first goal was to get a representation of the atmospheric layer that wraps around the earth - a thin haze of a light blue hue. The trick is that the haze should not rotate along with the earth as the camera moves around the globe. To achieve this, I created a shader that calculates how light hits the surface each frame, finds where the light drop-off occurs on the surface (the edges), and colors those areas a varying hue of blue based on how much light exists on a given fragment of the geometry. Since the shader is applied to a sphere and the lighting both hits the sphere straight on and is static in all attributes over time, we can assume perfect symmetry in the light distribution across the sphere at any given moment, which allows for the Lambertian reflection calculation of:



I0 = Ld * Md * (N · L)

Where:
I0 = reflected intensity
Ld = light diffuse color
Md = material diffuse coefficient
N · L = cosine of the angle between the surface normal and the light direction

Once we have the reflected intensity of each fragment, simple logic can be added to the shader that only applies color within a certain range of intensities - and, since we are dealing with a sphere with light shooting straight at it (symmetrical light reflection!), it becomes simple to create a halo-like light effect.
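
A shader along these lines might look like the sketch below. This is my own reconstruction of the technique rather than Shiver's actual shader; the intensity threshold and haze color are placeholder values:

# Hypothetical GLSL pair for the halo effect. The vertex shader passes the
# eye-space normal down; the fragment shader computes the Lambertian term
# I0 = Ld * Md * (N · L) (with Ld and Md folded to 1.0 for clarity) and only
# colors fragments in the drop-off zone near the sphere's edge.
ATMO_VS = """
varying vec3 normal;

void main() {
    normal = gl_NormalMatrix * gl_Normal;
    gl_Position = ftransform();
}
"""

ATMO_FS = """
varying vec3 normal;

void main() {
    vec3 N = normalize(normal);
    vec3 L = normalize(gl_LightSource[0].position.xyz); // directional light
    float i0 = max(dot(N, L), 0.0);                     // reflected intensity

    // Only the low-intensity rim gets the haze, faded by the remaining light.
    if (i0 < 0.25)
        gl_FragColor = vec4(0.4, 0.6, 1.0, 1.0) * (i0 / 0.25);
    else
        discard;
}
"""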


Fig 1.2: Closer look of the glow-like effect of the atmospheric shader

The next steps with shaders are to create a fog shader for the globe, possibly add a very subtle bump map, and explore other effects as time permits.