Slitscan Experiment

August 17, 2019

For a bit of fun, I thought I’d try to recreate the Stargate sequence from 2001: A Space Odyssey, using only code and algorithms, written in the Processing programming language.

The first task was understanding Douglas Trumbull’s complex slit-scan camera rig.

“Using a technique of image scanning as used in scientific and industrial photography, this device could produce two seemingly infinite planes of exposure while holding depth-of-field from a distance of fifteen feet to one and one-half inches from the lens at an aperture of F/1.8 with exposures of approximately one minute per frame using a standard 65mm Mitchell camera.”
Douglas Trumbull.

I tried to recreate this setup as faithfully as possible. Here’s how the slit actually appears on camera: the artwork (moving horizontally) is illuminated through this slit while the camera dollies forward with an open shutter, creating a single long-exposure frame. This process is repeated again and again to build up the sequence.
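To make the mechanics concrete, here’s a deliberately simplified, hypothetical 1-D model of one exposure in plain Java (the names like `exposeFrame` are mine, not part of any real rig): the artwork scrolls past a one-pixel slit, and each successive slit sample lands on the next row of the frame, standing in for the camera dolly during the open-shutter exposure.

```java
// Hypothetical, simplified 1-D model of a single slit-scan exposure:
// the artwork scrolls past a one-pixel slit while the "camera" dollies,
// so each slit sample lands at a different row, building one frame.
public class SlitScan {
    // artwork: brightness values; steps: slit samples per frame;
    // offset: artwork scroll position at the start of the exposure
    static int[] exposeFrame(int[] artwork, int steps, int offset) {
        int[] frame = new int[steps];
        for (int s = 0; s < steps; s++) {
            // the artwork moves horizontally under the slit each step
            int src = (offset + s) % artwork.length;
            // the camera dolly maps successive slit samples to successive rows
            frame[s] = artwork[src];
        }
        return frame;
    }

    public static void main(String[] args) {
        int[] art = {10, 20, 30, 40, 50};
        int[] frame = exposeFrame(art, 3, 4);
        System.out.println(java.util.Arrays.toString(frame)); // [50, 10, 20]
    }
}
```

Repeating this with a new `offset` per frame is what turns one long exposure into a moving sequence.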

The next task was to create the artwork projected through the slit.
What’s unique about my approach is that I can do this with generative art. Trumbull used collages of coloured gels and transparencies made up of blueprints, photographs of random geometry and magazine cutouts.
What caught my attention was the very opening shot, where it clearly looks like blueprint-style artwork was used.
This seemed a perfect opportunity to use something like recursive subdivision to generate that style.
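As a rough sketch of how recursive subdivision works (plain Java, with my own hypothetical helper names, not my actual Processing sketch): a rectangle is split at a random point along its longer side, the split recurses until the pieces fall below a minimum size, and the leaf rectangles are what gets drawn.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Hypothetical sketch of recursive subdivision: split a rectangle
// randomly, stop at a minimum size, and collect the leaves to draw.
public class Subdivide {
    record Rect(double x, double y, double w, double h) {}

    static void split(Rect r, double minSize, Random rng, List<Rect> out) {
        if (r.w() < minSize || r.h() < minSize) { out.add(r); return; }
        double t = 0.3 + 0.4 * rng.nextDouble();  // split point at 30-70%
        if (r.w() > r.h()) {                      // always cut the longer side
            split(new Rect(r.x(), r.y(), r.w() * t, r.h()), minSize, rng, out);
            split(new Rect(r.x() + r.w() * t, r.y(), r.w() * (1 - t), r.h()), minSize, rng, out);
        } else {
            split(new Rect(r.x(), r.y(), r.w(), r.h() * t), minSize, rng, out);
            split(new Rect(r.x(), r.y() + r.h() * t, r.w(), r.h() * (1 - t)), minSize, rng, out);
        }
    }

    public static void main(String[] args) {
        List<Rect> leaves = new ArrayList<>();
        split(new Rect(0, 0, 1024, 768), 64, new Random(1), leaves);
        System.out.println(leaves.size() + " rectangles");
    }
}
```

Drawing the leaves as outlined boxes on a blue background already reads as “blueprint”; the layering and random effects come on top of that.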

Recursive subdivision.

With a little tweaking, layering, and some other random effects, here’s the artwork my code generated, which resembles the artwork used for two of the shots from the first part of the Stargate sequence.

And here are the final results from my video…

For comparison, here are the original two shots from the actual movie…

Making an Algorithm

June 15, 2019

I recently decided to go back to what I enjoy doing most, and probably what I’m best at – generative / algorithmic animation.

So I started by looking at this – a stock photo I found on the internet…

So I thought – how could I generate this kind of image purely from code?
Well, about a month later – coding in Processing – I ended up with this…

How did I get there? Well here’s a very quick photo journal…

I started by creating basic Bézier lines. You’ll see I’m always working with the original stock photo on the left on my iMac as a reference.
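For reference, the cubic Bézier basis that Processing’s bezierPoint() evaluates can be written as a plain function – a minimal sketch in Java:

```java
// A cubic Bézier coordinate at parameter t (the same curve that
// Processing's bezier() draws and bezierPoint() samples).
public class Bezier {
    // a and d are the endpoints, b and c the control points; t runs 0..1
    static double bezierPoint(double a, double b, double c, double d, double t) {
        double u = 1 - t;
        return u*u*u*a + 3*u*u*t*b + 3*u*t*t*c + t*t*t*d;
    }

    public static void main(String[] args) {
        System.out.println(bezierPoint(0, 10, 20, 30, 0.0)); // 0.0  (start point)
        System.out.println(bezierPoint(0, 10, 20, 30, 1.0)); // 30.0 (end point)
        System.out.println(bezierPoint(0, 10, 20, 30, 0.5)); // 15.0 (midway)
    }
}
```

Calling this once for x and once for y, while stepping t from 0 to 1, traces out one line of the drawing.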

Then simple random shading.


Spatial distribution.

Color gradients.
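A colour gradient in RGB is just per-channel linear interpolation – the same idea as Processing’s lerpColor() in its default RGB mode. A minimal Java sketch (the array-based helper is my own illustrative stand-in):

```java
// Per-channel linear interpolation between two RGB colors,
// the basic operation behind a color gradient.
public class Gradient {
    // c1, c2: {r, g, b} in 0..255; t: 0..1 position along the gradient
    static int[] lerpColor(int[] c1, int[] c2, double t) {
        int[] out = new int[3];
        for (int i = 0; i < 3; i++)
            out[i] = (int) Math.round(c1[i] + (c2[i] - c1[i]) * t);
        return out;
    }

    public static void main(String[] args) {
        int[] blue = {20, 40, 200}, gold = {240, 200, 40};
        // sample the gradient halfway between the two colors
        System.out.println(java.util.Arrays.toString(lerpColor(blue, gold, 0.5))); // [130, 120, 120]
    }
}
```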

Fine tuning.

Having fun by tweaking parameters.

Shadow effects.

With the system finished, and a bit more tuning, I ended up with my final result seen above.

Here’s some of the parameters I can play around with.

Which allowed me to create these.

But this is just the beginning.
My next phase is to create new works in 4K. I’ll be starting a new YouTube channel dedicated to this.
You’ll see from the pics that I’m coding on my 55″ 4K HDR screen. I use my iMac just as a reference / second monitor now.
Coding on a big screen is as experiential and cinematic as watching back the results 🙂

The animation possibilities of my new algorithm, combined with the visual impact of 4K, are now the focus of my next works.

Here are some teasers in this resolution.

Make Art with Data

June 12, 2018

more from me on this project soon…

Another World 360

April 12, 2016

Here’s the link to my Facebook post on this…



January 31, 2016

Continuing on with my work based around shaders and panoramic projection (read the making of ‘Temples’). In this video, rather than animate the camera / spherical projection manually in After Effects, I made the camera purely random and generative by adding noise() expressions to latitude, longitude, rotation and zoom.
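To sketch the idea: each camera parameter gets its own independent smooth noise channel, sampled over time. Processing’s (and After Effects’) noise() is Perlin noise; since this sketch is self-contained Java it substitutes a simple smoothed value noise, and the scalings and seeds are my own illustrative choices, not the values used in the video.

```java
// Noise-driven generative camera: one independent smooth noise channel
// per parameter (latitude, longitude, rotation, zoom), sampled over time.
// The noise here is a simple smoothed value noise standing in for Perlin.
public class NoiseCamera {
    // deterministic pseudo-random value in [0,1) for lattice point i
    static double hash(int i, int seed) {
        long h = i * 374761393L + seed * 668265263L;
        h = (h ^ (h >>> 13)) * 1274126177L;
        return ((h ^ (h >>> 16)) & 0xffffffL) / (double) (1 << 24);
    }

    static double smooth(double t) { return t * t * (3 - 2 * t); } // smoothstep ease

    // 1-D value noise in [0,1): smooth interpolation between lattice values
    static double noise(double x, int seed) {
        int i = (int) Math.floor(x);
        double f = smooth(x - i);
        return hash(i, seed) * (1 - f) + hash(i + 1, seed) * f;
    }

    public static void main(String[] args) {
        double t = 2.5; // time in seconds
        // one independent channel per camera parameter, scaled to its range
        double latitude  = -90 + 180 * noise(t * 0.1, 1);
        double longitude = 360 * noise(t * 0.1, 2);
        double rotation  = 360 * noise(t * 0.05, 3);
        double zoom      = 1 + 3 * noise(t * 0.2, 4);
        System.out.printf("lat %.1f lon %.1f rot %.1f zoom %.2f%n",
                          latitude, longitude, rotation, zoom);
    }
}
```

Because each channel is smooth and independent, the camera drifts rather than jumps, which is what makes the motion feel continuous despite being entirely random.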

Here’s a still of a panoramic frame..



Also, here’s a final frame from the video – it’s here mainly to show the detail unfortunately lost in Vimeo’s compression. There’s a lot of fine, fast-moving detail which is sacrificed.
