Image Analysis Kinect Visualization Experiments

A few visualization experiments using image analysis techniques and a Kinect camera. Pixelization algorithm: different vectors are applied to properties at each pixel. Painting algorithm: layers are extracted and coloured. Delaunay triangulation algorithm: a simple triangulation based on feature strength.

Geometric Patternwork Cellular Automaton

This was a quick experiment I did in openFrameworks, inspired by some Arabic geometric patterns. It uses a hex-based cellular automaton with some Conway-like rules, weighting two levels of neighbours. Cells are joined according to different rules to create geometry, which is then tiled. It repopulates or iterates in response to music.
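The core idea can be sketched roughly like this, assuming axial hex coordinates; the weights and survival thresholds here are illustrative, not the original project's values:

```python
# Hex cellular automaton with two weighted neighbour rings.
AXIAL_DIRS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def ring(cell, radius):
    """All axial coordinates exactly `radius` steps from `cell`."""
    q, r = cell
    results = []
    # Standard hex-ring walk: jump `radius` steps out, then trace six sides.
    q += AXIAL_DIRS[4][0] * radius
    r += AXIAL_DIRS[4][1] * radius
    for side in range(6):
        for _ in range(radius):
            results.append((q, r))
            q += AXIAL_DIRS[side][0]
            r += AXIAL_DIRS[side][1]
    return results

def step(alive, inner_w=1.0, outer_w=0.5, lo=2.0, hi=4.0):
    """One generation: a cell lives in the next generation when the
    weighted count of live neighbours in rings 1 and 2 lands in [lo, hi]."""
    candidates = set(alive)
    for c in alive:
        candidates.update(ring(c, 1))
    nxt = set()
    for c in candidates:
        score = (inner_w * sum(n in alive for n in ring(c, 1)) +
                 outer_w * sum(n in alive for n in ring(c, 2)))
        if lo <= score <= hi:
            nxt.add(c)
    return nxt
```

Joining live cells into geometry and tiling the result would then happen on top of the surviving set each generation.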

Pixeled Portraits

Using image analysis techniques, different kinds of data are extracted from an image (neighbour colour continuity, contour detection, saturation, brightness). The data at each pixel is represented by a rectangle (rotation, width, height, transparency). On every iteration a new way of representing the data is created, and the viewer watches as the data morphs the image. We see the patterns in the data, and it plays off our recognition of the famous images; portraits were chosen to also play on the way our brains recognize faces in anything.
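A minimal sketch of that pipeline, with invented function names and a crude continuity measure standing in for the real analysis:

```python
import colorsys
import random

def pixel_features(rgb, neighbours):
    """Per-pixel data: brightness, saturation, and a rough
    neighbour-continuity measure (mean channel difference)."""
    r, g, b = [c / 255.0 for c in rgb]
    _, s, v = colorsys.rgb_to_hsv(r, g, b)
    if neighbours:
        continuity = sum(
            sum(abs(a - b2) for a, b2 in zip(rgb, n)) / (3 * 255.0)
            for n in neighbours) / len(neighbours)
    else:
        continuity = 0.0
    return {"brightness": v, "saturation": s, "continuity": continuity}

def random_mapping(rng):
    """Each iteration shuffles which datum drives which rectangle
    property, so the same image keeps re-presenting itself."""
    keys = ["brightness", "saturation", "continuity"]
    rng.shuffle(keys)
    return dict(zip(["rotation", "width", "alpha"], keys))

def rectangle_for(features, mapping):
    return {
        "rotation": features[mapping["rotation"]] * 360.0,
        "width": 1 + features[mapping["width"]] * 10.0,
        "alpha": features[mapping["alpha"]],
    }
```

Rendering one such rectangle per pixel, then re-rolling the mapping, gives the morphing effect described above.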

AS3 Step Sequencer Plus

This is a sequencer I built in AS3, using the SiON synthesis library. Beyond your normal grid-based sequencer, it has some cool features. There are four instruments, each with four patterns; you can select a pattern to edit with the green buttons, and you can arrange how the patterns play by toggling the blue buttons. There are also four synced loops grouped into two cue-able tracks.
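The pattern-arrangement idea can be sketched like this, assuming a 16-step grid; the names and data layout are illustrative, not taken from the AS3 source:

```python
STEPS = 16

def make_instrument(n_patterns=4):
    """One instrument: four step patterns plus an arrangement that
    picks which pattern plays on each bar (the blue toggles)."""
    return {"patterns": [[0] * STEPS for _ in range(n_patterns)],
            "arrangement": [0, 0, 0, 0]}

def step_events(instruments, bar, step):
    """Which instruments trigger on this global step."""
    hits = []
    for name, inst in instruments.items():
        pattern = inst["patterns"][inst["arrangement"][bar % 4]]
        if pattern[step % STEPS]:
            hits.append(name)
    return hits
```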

Apart from the loops, the sound is created entirely with real-time synthesis in ActionScript, so it's quite CPU intensive. Click the thumbnail to play around.

Algorithmic Music Generators

Here is a collection of VSTs I've made that generate MIDI from modularized algorithms I've discovered or found over the years. The modules were written as C++ DLLs to be used with SynthEdit, for fast prototyping and easy (though ugly) GUI design. They're Frankenstein creations and I have tons of them for different effects, but the general idea was a trigger algorithm for length and timing (triggerizer), a pitch algorithm (pitchinator), and an expressive algorithm that handles velocity and expression (velocifier).

The drummer is based solely on the Morse-Thue algorithm and does even/odd tests, with interesting results.
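A minimal sketch of that idea: term n of the Thue-Morse ("Morse-Thue") sequence is the parity of set bits in n, and an even/odd test on a stretched index picks which drum fires. The multiplier value here is illustrative:

```python
def thue_morse(n):
    """0 or 1: parity of ones in the binary expansion of n."""
    return bin(n).count("1") % 2

def drum_step(step, multiplier=5):
    """Even/odd test on a stretched index: kick when the Thue-Morse
    term is 0, snare when it is 1."""
    return "kick" if thue_morse(step * multiplier) == 0 else "snare"

pattern = [drum_step(i) for i in range(16)]
```

Because the sequence is self-similar but never periodic, the resulting patterns groove without exactly repeating.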

You'll need a host that can route MIDI. I recommend energyXT: it's fully modular and can be run as a VST.

These are very early builds: they have some bugs, come with no documentation or support, only work on Windows, and I am not responsible in any way for what they may do, etc. If you're still keen, please download them, try them out, tell me what you think, save patches, and send them to me. I started work on an uber version that could be customized, but I eventually abandoned it.

Screen Shot

Drummer Screen Shot
Audio Example:
VST Package

3D MIDI Visualizations System

This started as an interactive MIDI demo I did as a graphics project. The idea was to build something to accompany a live performance, so that people have some way of seeing the MIDI inputs and their subsequent audio result; the performance element in laptop performances is definitely a low point.

It has evolved into more of a framework for 3D visualization. It aims to let programmers add new visual instruments easily with a plug-in system and reusable objects and templates, and to let artists create complex animated scenes with the XML-based modular animation system. It uses a modern shader-driven rendering engine (Ogre3D) and imports 3D scenes from most 3D modelling software.
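To give a flavour of what an XML-driven animation track might look like, here is a sketch with an invented schema (the project's actual format will differ):

```python
import xml.etree.ElementTree as ET

# Hypothetical track: rotate "box1" from 0 to 180 degrees over 2 seconds.
SCENE = """
<animation target="box1" property="rotation">
  <key time="0.0" value="0"/>
  <key time="2.0" value="180"/>
</animation>
"""

def load_track(xml_text):
    root = ET.fromstring(xml_text)
    keys = sorted((float(k.get("time")), float(k.get("value")))
                  for k in root.iter("key"))
    return root.get("target"), root.get("property"), keys

def sample(keys, t):
    """Linear interpolation between the surrounding keyframes."""
    if t <= keys[0][0]:
        return keys[0][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return keys[-1][1]
```

Composing many such tracks, each bound to a scene object and property, is the "modular" part: artists edit data, not code.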

It's open source under the LGPL; the code and more technical information can be found at the SourceForge project here. I've only really put time into code, not visuals, so I'm looking for people interested in the art side. If you would like to be involved in any capacity, please e-mail me; your involvement would be really appreciated.

I'll have a proper demo of its capabilities up as soon as I have a chunk of time.

Various animated boxes

and a sphere
[Release 0.1]
Please use the CVS to download the source

The original system was built on a 3D engine I wrote but never finished. The engine's main feature was a robust particle system, and it was written in OpenGL.

The older version features tunnel pulser, particle emitter (rays, shapes, sprites, etc.), and morphing polytope visualization instruments.

Sphere preset

Tunnel preset

Console Mode


Game of Life Midi Generator

The Game of Life isn't really a game, it's a cellular automaton, but it's still fun. Devised by John Horton Conway in 1970, it is the best-known example of artificial life; back in the '70s it was everywhere, and people conceived of entire computer systems built on it. Basically, it's a set of rules that determines whether a cell in a 2D grid lives or dies in the next generation. It shows emergent and self-organizing behaviours which are cool to watch.

I turned the Game of Life into a step sequencer: a cell is a note (pitch by time), and when a sequence completes, it runs the Game of Life rules.
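The mapping can be sketched like this: columns are steps, rows are pitches, and the grid advances one generation per pass. The base pitch (MIDI 60, middle C) is an assumption, not the original program's mapping:

```python
def life_step(grid):
    """One Game of Life generation on a toroidal 2D grid of 0/1."""
    rows, cols = len(grid), len(grid[0])
    def live_neighbours(r, c):
        return sum(grid[(r + dr) % rows][(c + dc) % cols]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))
    # Conway's rules: live cell survives with 2 or 3 neighbours,
    # dead cell is born with exactly 3.
    return [[1 if (grid[r][c] and live_neighbours(r, c) in (2, 3))
             or (not grid[r][c] and live_neighbours(r, c) == 3) else 0
             for c in range(cols)] for r in range(rows)]

def notes_for_column(grid, col, base_pitch=60):
    """Every live cell in this time step becomes a MIDI note number."""
    return [base_pitch + row for row, line in enumerate(grid)
            if line[col]]
```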

Organized patterns dissolve into randomness, and random patterns organize into stable states or die, which you can hear (but not see) in the example I made.

I wrote this in C#, using a MIDI wrapper written by Leslie Sanford.

Screen Shot
Audio Example:
Lead, Bass and Beats


This was a short-lived video game experiment I put together. The idea behind it is a multiplayer competitive shmup. The primary purpose was to evaluate a bunch of different pieces of game technology, so this is more of an assembled toolset than a game. It's data-driven, so people can modify character movement and make their own barrage patterns. The game parses BulletML files; there is a database of barrage patterns here.

This game is in need of an AI opponent; as it stands, it is only really playable with two players. Button layouts are found in the readme file, and it supports two joysticks.

Like most stuff here, this is simply a proof of concept; it has taken a back seat to a few other projects and work.

This game uses (not all of which are featured in this build):

  • Ogre3D - Powerful 3d rendering engine

  • BulletML - Bullet barrage markup language for 2D shmups. What's a shmup?

  • OIS - Input System that supports force feedback.

  • OpenAL - Sound API

  • Newton - Physics engine with collision detection.

  • Lua - Scripting

If you would like to help contact me.

Screen Shot

Simultaneous Localization and Mapping Algorithm for Mars Exploration

SLAM algorithms are mapping algorithms that make it possible for an autonomous vehicle to start at an unknown location in an unknown environment and incrementally build a map of that environment, while simultaneously using the map for exploration.

SLAM algorithms are primarily designed for indoor environments and run on robots with many sensors. Our Mars solution had a much simpler, and thus much less expensive, robot with only one sensor on a pivoting arm. We also used a fuzzier terrain-feature-searching algorithm, and the system was designed to be extended to coordinate parallel exploration.
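A toy sketch of the incremental, topological flavour of this (not the actual GVG implementation): each time the single sensor observes a feature signature that matches no known node within a tolerance, a new node is added to the map and linked to the previous place; a match instead localizes the rover on an existing node. Signatures here are plain numbers for illustration:

```python
def match(sig, node_sig, tol=0.5):
    """Fuzzy feature comparison; real signatures would be richer."""
    return abs(sig - node_sig) <= tol

class TopoMap:
    def __init__(self):
        self.nodes = []      # feature signatures, one per place
        self.edges = set()   # undirected links between visited places
        self.current = None

    def observe(self, signature):
        """Localize on a known node, or grow the map with a new one."""
        for i, s in enumerate(self.nodes):
            if match(signature, s):
                if self.current is not None and self.current != i:
                    self.edges.add((min(self.current, i),
                                    max(self.current, i)))
                self.current = i
                return i
        self.nodes.append(signature)
        i = len(self.nodes) - 1
        if self.current is not None:
            self.edges.add((self.current, i))
        self.current = i
        return i
```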

The algorithm used a generalized Voronoi graph, like the ones discussed in Topological Simultaneous Localization and Mapping: Toward Exact Localization Without Explicit Localization (Keiji Nagatani and Howie Choset, 2001). My professor was Zbigniew Stachniak, AI head at York University.

Full render

Debug Mode


Neural Nets to Control a Virtual Plane Through a 2D Tunnel

The problem, called the "english patient pilot problem", was to fly through a randomly generated cave without crashing. The neural net pilot could only steer up or down (hard or light). I used a backpropagation neural net and some cheap OpenGL graphics for the nifty visualization of the tunnel.
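The backprop core can be sketched like this: one hidden layer of sigmoids trained by gradient descent. The layer sizes and learning rate are illustrative; the pilot would feed cave distances in and read a steer-up/steer-down signal out:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class BackpropNet:
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = random.Random(seed)
        # Each row holds a unit's weights plus a trailing bias weight.
        self.w1 = [[rng.uniform(-1, 1) for _ in range(n_in + 1)]
                   for _ in range(n_hidden)]
        self.w2 = [[rng.uniform(-1, 1) for _ in range(n_hidden + 1)]
                   for _ in range(n_out)]

    def forward(self, xs):
        self.h = [sigmoid(sum(w * x for w, x in zip(row, xs + [1.0])))
                  for row in self.w1]
        self.o = [sigmoid(sum(w * h for w, h in zip(row, self.h + [1.0])))
                  for row in self.w2]
        return self.o

    def train(self, xs, targets, lr=0.5):
        self.forward(xs)
        # Output deltas: error times sigmoid derivative o * (1 - o).
        d_out = [(t - o) * o * (1 - o) for o, t in zip(self.o, targets)]
        # Hidden deltas: backpropagate through the (old) output weights.
        d_hid = [h * (1 - h) * sum(d * self.w2[k][j]
                                   for k, d in enumerate(d_out))
                 for j, h in enumerate(self.h)]
        for k, d in enumerate(d_out):
            for j, h in enumerate(self.h + [1.0]):
                self.w2[k][j] += lr * d * h
        for j, d in enumerate(d_hid):
            for i, x in enumerate(xs + [1.0]):
                self.w1[j][i] += lr * d * x
```

In the pilot, the output would be thresholded into the four actions (hard/light up, hard/light down).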

The download includes both a fairly robust backpropagation neural net and the Pilot program.

Watch out for those stalagmites