A Sorting Algorithm done in the style of 2001: A Space Odyssey
Built in Processing.

Slitscan Experiment

August 17, 2019

For a bit of fun, I thought I’d try to recreate the Stargate sequence from 2001: A Space Odyssey, using only code and algorithms (written in the Processing programming language).

The first task was understanding Douglas Trumbull’s complex Slitscan camera rig.

“Using a technique of image scanning as used in scientific and industrial photography, this device could produce two seemingly infinite planes of exposure while holding depth-of-field from a distance of fifteen feet to one and one-half inches from the lens at an aperture of F/1.8 with exposures of approximately one minute per frame using a standard 65mm Mitchell camera.”
Douglas Trumbull.

I tried to recreate this setup as faithfully as possible – here’s how the slit actually appears on camera. The artwork (moving horizontally) is illuminated through this slit while the camera dollies forwards with an open shutter, creating a single frame / long exposure. This process is repeated again and again to create a sequence.
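As a rough digital sketch of the same idea (my own simplification of the optics, not Trumbull’s exact geometry, and ‘artwork.png’ is just a placeholder file name), a thin strip of the scrolling artwork stands in for the slit, and each dolly position is drawn additively into the frame to fake the open-shutter exposure:

// Rough digital slitscan: the artwork scrolls horizontally under a thin
// "slit" strip while the virtual camera dollies towards it; each
// sub-exposure is accumulated additively, like an open shutter.
PImage artwork;   // placeholder source image, e.g. the generated blueprints
int scroll = 0;   // horizontal artwork offset for the current frame

void setup() {
  size(720, 480, P2D);                 // P2D so additive blending works
  artwork = loadImage("artwork.png");  // assumes the image is wider than the window
}

void draw() {
  background(0);
  blendMode(ADD);                      // open shutter: sub-exposures add up
  int steps = 300;                     // sub-exposures per dolly move
  for (int i = 0; i < steps; i++) {
    float t = i / float(steps - 1);    // 0 = far from the slit, 1 = closest
    // the artwork glides horizontally during the exposure,
    // so each sub-exposure sees a slightly different strip
    int srcX = (scroll + i) % (artwork.width - width);
    PImage slit = artwork.get(srcX, artwork.height / 2, width, 1);
    // perspective: the closer the camera, the lower and taller the
    // slit's projection, smearing it into a receding plane
    float y = lerp(height * 0.5, height, t * t);
    float h = lerp(1, 40, t * t);
    tint(255, 20);                     // keep each sub-exposure faint
    image(slit, 0, y, width, h);
  }
  noTint();
  blendMode(BLEND);
  scroll += 5;                         // advance the artwork for the next frame
  // saveFrame("stargate-####.png");   // uncomment to write out the sequence
}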

The next task was to create the artwork projected through the slit.
What’s unique about my version is that I can use generative art. Trumbull used collages of coloured gels and transparencies made up of blueprints, photographs of random geometry and magazine cutouts.
What caught my attention was the very opening shot, where it clearly looks like blueprint-style artwork was used.
I thought this was a perfect opportunity to use something like recursive subdivision to generate that style.

Recursive subdivision.
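The generator itself is only a handful of lines. Here’s a minimal sketch of the idea (the colours, depth and split ratios are placeholder values, not the exact ones I used): each rectangle is randomly cut in two, vertically or horizontally, and the halves are cut again until they’re small enough to draw.

// Recursive subdivision: split a rectangle in two at a random position,
// then recurse on each half until the depth (or size) runs out.
void setup() {
  size(800, 600);
  noLoop();                            // draw one composition and stop
}

void draw() {
  background(15, 35, 90);              // blueprint blue
  stroke(210, 225, 255);               // pale line work
  noFill();
  subdivide(0, 0, width, height, 7);
}

void subdivide(float x, float y, float w, float h, int depth) {
  if (depth == 0 || w < 12 || h < 12) {
    rect(x, y, w, h);                  // a finished cell of the blueprint
    return;
  }
  if (random(1) < 0.5) {
    float cut = random(0.3, 0.7) * w;  // vertical cut
    subdivide(x, y, cut, h, depth - 1);
    subdivide(x + cut, y, w - cut, h, depth - 1);
  } else {
    float cut = random(0.3, 0.7) * h;  // horizontal cut
    subdivide(x, y, w, cut, depth - 1);
    subdivide(x, y + cut, w, h - cut, depth - 1);
  }
}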

With a little tweaking, layering, and some other random effects, here’s the artwork my code generated, which resembles the artwork used for two of the shots from the first part of the Stargate sequence.




And here are the final results from my video…




For comparison, here are the two original shots from the actual movie…

Neural Fractal

September 15, 2018

Neural artistic style transfer applied to a computer animated fractal.

Neuralism

September 4, 2018

Animation created using artificial intelligence and deep learning in a technique called ‘artistic style transfer’.

Neural models of human visual perception are used to transfer the visual style of a painting or photograph onto another image.

In this video I’m taking visual styles such as computer fractals, abstract photography, sci-fi art and HD wallpapers and transferring them onto repeating GIF loops – which are originally just simple 3D animations with no texture or color.

Everything was rendered from open-source code in Google Colab using their GPU runtime support. GPU processing power is an essential requirement for deep learning projects like this.

Artistic style transfer is part of an exciting new branch of AI-based art, in what could tentatively be called ‘neuralism’.

Animation & Music by Glenn Marshall

My 360 VR film ‘Another World’ is now available on Vimeo 360 – As usual it works with every browser except Safari. Come on Apple!

The landscapes in the film were all procedurally generated from algorithms, using OpenGL GLSL shaders – all computed on the graphics card GPU.
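As a very rough illustration of what ‘procedurally generated from algorithms’ means, here’s a tiny Processing sketch that builds a drifting wireframe terrain from Perlin noise on the CPU – only an analogue of the idea, since the film itself used GLSL fragment shaders running on the GPU.

// A CPU-side analogue of procedural landscape generation: a grid of
// vertices whose heights come from Perlin noise, drifting forward.
float zoff = 0;   // scroll offset so the terrain appears to move

void setup() {
  size(800, 480, P3D);
  noFill();
  stroke(120, 200, 255);
}

void draw() {
  background(0);
  translate(width / 2, height / 2);
  rotateX(PI / 3);                     // tilt the grid towards the camera
  int cols = 60, rows = 40, cell = 20;
  for (int y = 0; y < rows - 1; y++) {
    beginShape(TRIANGLE_STRIP);
    for (int x = 0; x < cols; x++) {
      float h1 = noise(x * 0.1, (y + zoff) * 0.1) * 150;
      float h2 = noise(x * 0.1, (y + 1 + zoff) * 0.1) * 150;
      vertex((x - cols / 2) * cell, (y - rows / 2) * cell, -h1);
      vertex((x - cols / 2) * cell, (y + 1 - rows / 2) * cell, -h2);
    }
    endShape();
  }
  zoff += 0.05;
}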