Wednesday, October 31, 2007

Shiver - Final Proposal

A few posts back - I wrote about developing a seismic data visualizer for my ICM final. This is still the case - however, the details involving the execution of the project are now very different. The most notable change involves adding several physical components to the project - meaning this project will now be both my ICM and my Physical Computing project for this semester. Excited? Splendid - now on to the details...


Technical
On the ICM front - I made a decision a week ago that I would in fact use Python as my primary programming language, as opposed to my original choice of Java. Why? Well, as I mentioned in the past - Python is slow for intensive computation compared to compiled languages such as C/C++ or Java. However, where Python shines is in its fast development time - and in its flexibility to glue together various components effectively. Without a doubt, hitting the visual high-bar I am aiming for in regard to the graphics in this project will put an enormous strain on Python - however, much of that strain can ideally be handled through C code, if necessary. I will be using an OpenGL binding called pyOpenGL - a mature library with strong documentation, so I should have information/learning support when I need it. For the GUI components, I will be using the wxPython toolkit - which allows for native OS widget use, so the program will automatically reflect the standard appearance of whatever OS it is being run on. Very slick.



Fig 1.1: Shiver in early development.


Representation
As for how the data will be represented, it will be done on a 3D globe - much like applications such as Google Earth and NASA's World Wind. When the application gathers seismic activity data from the USGS - each event will be mapped to the globe according to its latitude and longitude values. Based on an event's Richter scale value - the event will be visualized along the surface normal at its lat/long mapping. Events can be selected through an easily organized directory structure tree, or on the globe itself. When selected - various attributes are displayed, as well as a simulation option - which leads us to the physical computing side of the project...
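A minimal sketch of that mapping in Python (the helper names and the extrusion scale here are my own illustration, not Shiver's actual code): latitude/longitude convert to Cartesian coordinates on the sphere, and since the outward normal of a sphere is just the unit position vector, an event can be extruded along that normal in proportion to its magnitude.

```python
import math

def latlon_to_xyz(lat_deg, lon_deg, radius=1.0):
    """Map a latitude/longitude pair onto a sphere of the given radius."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    x = radius * math.cos(lat) * math.cos(lon)
    y = radius * math.cos(lat) * math.sin(lon)
    z = radius * math.sin(lat)
    return (x, y, z)

def surface_normal(lat_deg, lon_deg):
    """On a sphere, the outward normal is simply the unit position vector."""
    return latlon_to_xyz(lat_deg, lon_deg, radius=1.0)

def event_endpoint(lat_deg, lon_deg, magnitude, radius=1.0, scale=0.05):
    """Extrude an event outward along its normal, scaled by Richter magnitude."""
    n = surface_normal(lat_deg, lon_deg)
    return tuple(radius * c + scale * magnitude * c for c in n)
```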


Physical
After sitting down with fellow ITP'er Sunghun Kim, we decided that we would like to team up and pursue a physical representation of seismic data that can be partnered with the software I am developing for my ICM final. This idea involves creating a 2D representation of the seismic events - dictated by the Richter scale value of each event. This will ideally result in a mesh being created by rods pushing and pulling at a rubber-like surface. Each rod will be driven by an average of pixel color values, with each rod's sample area determined by the ratio between the rod count and the image resolution. Once a 2D simulation is established, it will ideally branch out to a 3D grid - but that is likely for another semester.
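The rod-driving scheme above could look something like this - a sketch assuming a grayscale image whose dimensions divide evenly by the rod grid (names are illustrative, not our actual code):

```python
def rod_heights(image, rows, cols):
    """Average pixel blocks so each rod gets one drive value.

    image: 2D list of brightness values (0-255); the block size is the
    ratio between the image resolution and the rod count."""
    h, w = len(image), len(image[0])
    bh, bw = h // rows, w // cols
    heights = []
    for r in range(rows):
        row = []
        for c in range(cols):
            block = [image[y][x]
                     for y in range(r * bh, (r + 1) * bh)
                     for x in range(c * bw, (c + 1) * bw)]
            row.append(sum(block) / len(block))
        heights.append(row)
    return heights
```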

Thursday, October 25, 2007

Elevator Previsualizer

This is a midterm project I worked on with a talented group of people in Physical Computing. Follow the link for more:

Elevator Previsualizer Main Site

Wednesday, October 17, 2007

Phys. Comp Midterm: Implementation / User Observations

For our midterm in Phys. Comp - my group decided to take on the task of breaking up the boredom of waiting for an elevator. Petra, Sunghun, and myself set up what we had completed for the project thus far for the first time -- with mixed results. It is clear that we have a lot of implementation issues to resolve before getting the type of user interaction data we need for the later stages of polishing usability.


Fig 1.1: Will (me) interacting with camera.

Implementation Issues/Observations

1. The projector, if angled from the ceiling, will display a distorted shape due to the light being shot at an angle.

2. The image being shot through the projector does not immediately fit in the space of the door. The video can be clipped, but the light (although black) still shows up.

3. Even if the projector is raised up on the ceiling, the light from the projector will be blocked by people standing directly in front of the door.

4. There is no (reasonable) way of preventing significant light pollution at any given time because we are unable to strictly control the public environment, and we are bound by the collective lumen value the given projector can output.

5. We do not yet have a good solution for binding the light-blocking module to the projector – although this likely has a simple solution.

6. Due to the central and elevated nature of both the camera and the projector, light/motion recursion through the camera (the camera picking up its own projected output) is very likely to occur unless steps are taken to filter an area - and that filtering carries trade-offs likely too severe to accept.

7. The location of the crank has to be outside the field of the camera and projector. It is possible to have the crank located right in the middle, within the camera range but not obstructing the projector light - but this will mean that the user of the crank will be the focus of the image, which may create static results.

8. Due to the environment we are contained in, the projector will have to be turned sideways to establish the amount of vertical space needed to cover the door.
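As a rough illustration of issue 1, the vertical keystone stretch for a tilted projector can be estimated with simple trigonometry. This is a simplified model (my own, not part of the project): it assumes a flat projection surface and no lens shift, with the projector's optical axis tilted by some angle from perpendicular.

```python
import math

def keystone_stretch(tilt_deg, half_angle_deg):
    """Ratio of the projected image height at a given tilt to its height when
    the projector faces the surface straight on (1.0 = no distortion).

    tilt_deg: tilt of the optical axis from perpendicular to the surface.
    half_angle_deg: half of the projector's vertical throw angle."""
    t = math.radians(tilt_deg)
    h = math.radians(half_angle_deg)
    # Rays at angles t+h and t-h hit the surface at heights proportional
    # to tan(t+h) and tan(t-h); straight-on the extent is 2*tan(h).
    return (math.tan(t + h) - math.tan(t - h)) / (2.0 * math.tan(h))
```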


User Issues/Observations

1. Users are drawn to the computer if it is out in the open. They seem to glance at the projected image, then want to look “behind the curtain” at the computer screen. The computer should be hidden from view.

2. Users seem to want to view the projected image by being in front of it, just as people watch television.

3. People stepping out of the elevator (or stepping in) do not particularly enjoy being blasted in the face by the projector light. (this issue has been solved, however).

4. The moment of interaction seems fleeting: people interact, a result is presented, and it is then instantly replaced by a new result. This does promote a sense of real-time, but doesn't reward sustained engagement.


Fig 1.2: Software cutting out background, leaving only the person displayed.


Fig 1.3: Petra using a crank which drives the projected image up and down.

Friday, October 12, 2007

ICM Final Project Proposal: Global Data Visualizer

Proposal:
When developing software, I feel it is important to develop around a system that is flexible, yet powerful when focus is needed on a particular task. Many times, a generic solution to a problem requiring precision results in software that has the ability to handle many tasks, yet does so in a mediocre way. Software should be quiet in interface, elegant in result - and predictable in terms of being intuitive to use.

My commentary on this aspect of software leads me to my proposal -- that being: I want to develop a system which can visualize various types of data as they pertain to our planet, and I'd like to do so in a way which promotes ease of use and flexibility in how it handles different data. Ideally, the data will be presented in a way that will be understood, yet a good deal more 'artsy' than most data visualizers I have seen. Whether 'artsy' means abstraction or simply a clean/slick way of visualizing the data, I cannot really say - I very rarely know what I will create on the art side until I am up to my neck in code architecture. That isn't to say that the art is an afterthought - much of my past code negates that idea, since I practically destroy my nicely planned-out system for the sake of making the output look 'cool'. Of course, I'd prefer to maintain both... but the engineer inside me brings a knife to a gunfight if I have to choose only one due to time constraints.

Technical:
Initially, my task will be to visualize seismic activity happening around the world. This will ideally expand to harbor other data as time permits. I have actually already done this to an extent - but I'd like to have the events projected on a 3D globe with correct coordinates and so forth. I have made the decision that I will be using OpenGL through Java for this project. I toyed with the idea of using C++ or Python in combination with OpenGL - but C++ coding is still a very slow process for me, and Python simply isn't going to provide the optimal speed I need (which is too bad, coding in Python is really enjoyable). I was planning on using JOGL - but the ever helpful Daniel Shiffman pointed me in the direction of LWJGL (Lightweight Java Game Library), which looks really promising, so I will be venturing in that direction.

Here are a few links that are helping me think things through on this project:
Lightweight Java Game Library
NASA Blue Marble
USGS.gov

Tuesday, October 9, 2007

Elevator Project: Motion Tracking

For my group project in Physical Computing, we have decided to address the issue of people waiting outside of elevators and the boredom that inevitably occurs during the wait. Petra, Sunghun, and myself all agreed that a good way to break the static nature of waiting is to allow people to interact with something that has immediate output in correlation with their movements. This allows those waiting to interact quickly with limited effort, making the actual wait time inconsequential - which matters a great deal, since when one walks up, the elevator could be very close to arriving, or not close at all.

My part in the project has been developing the software - using Java/Processing for the graphics and motion tracking system, as well as C code for the Arduino microcontroller we are using. The motion tracking system does a combination of background subtraction and motion detection, which basically filters out all imagery that is not both moving and within a certain brightness range. The pixels that meet these prerequisites have their coordinates tracked and drawn using a combination of small rectangles and lines drawn between each tracked pixel in sequential order. This results in an abstraction of imagery that is vastly different from real life, yet familiar enough to be predictably interacted with - at least in regard to positioning oneself to make an intended impact on a defined space.
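The filtering step can be sketched in Python (the actual project code is Java/Processing; the function name and threshold values here are illustrative): a pixel is tracked only if it both differs from the stored background frame and falls inside a brightness window.

```python
def moving_pixels(frame, background, motion_thresh=30,
                  bright_lo=40, bright_hi=220):
    """Return (x, y) coordinates of pixels that are both moving
    (differ from the background) and inside a brightness window.

    frame, background: 2D lists of grayscale values (0-255)."""
    tracked = []
    for y, (frow, brow) in enumerate(zip(frame, background)):
        for x, (f, b) in enumerate(zip(frow, brow)):
            if abs(f - b) > motion_thresh and bright_lo <= f <= bright_hi:
                tracked.append((x, y))
    return tracked
```

The tracked coordinates would then feed the drawing pass (rectangles and sequential connecting lines).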


Fig 1.1: Although heavily abstracted, a face emerges through motion.


Fig 1.2: Rapid motion completely destroys all recognizable form, as intended.


Fig 1.3: Slight motion will in turn bring about only slight abstraction, to
the point of nearing a keyed cutout of the form.


Processing:
ProcessingMain
MotionCapture

Arduino:
cb_software