iPhone Development #3 – ‘Zio’

June 29, 2009

Well here I am about 6 weeks into developing my first iPhone app based on my generative animation system.

It’s been an enormously steep learning curve, even though I come from a Java/Processing background, in which I’d consider myself fairly proficient. This has been tough.

I would say I’m at the halfway stage now, and after doubting the project and my ability to realise it almost daily, for the first time I feel confident that I can actually pull this off and come up with something half decent on the iPhone.

The direction I’m taking for my first app will be based on the ‘Music is Math’ animation style: semi-abstract, in black and white, roots and vines growing endlessly in real time. It will be interactive, responding to touch, orientation, etc., and will have randomising features and maybe a few other tricks.

The last few weeks have been spent mostly on creating an optimised real-time graphics engine suited to my style of algorithmic animation. My main concern was reducing the amount of maths computation required. The number of polygons hasn’t been an issue, which I thought it would be, but the algorithms were killing the CPU and FPS. My main solution so far has been to replace the smallest stems with bitmaps, which doesn’t affect things visually and gave me an enormous speed increase.
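The bitmap swap described above boils down to a per-stem decision: below some on-screen size, a pre-rendered sprite is visually indistinguishable from the mathematically generated strip, so the per-frame maths can be skipped. A minimal sketch of that decision, with an illustrative threshold and function names that are my own assumptions (not the actual Zio source):

```c
#include <assert.h>

/* Hypothetical sketch: once a stem's projected width falls below a
 * threshold, draw it as a pre-rendered bitmap sprite instead of
 * regenerating its polygon strip every frame. The threshold value
 * and all names here are illustrative assumptions. */

#define BITMAP_WIDTH_THRESHOLD 4.0f  /* pixels; assumed cutoff */

typedef enum { RENDER_PROCEDURAL, RENDER_BITMAP } RenderPath;

static RenderPath choose_render_path(float screen_width_px) {
    /* Tiny stems look the same either way, so skip the maths for them. */
    return (screen_width_px < BITMAP_WIDTH_THRESHOLD)
         ? RENDER_BITMAP
         : RENDER_PROCEDURAL;
}
```

The key property is that the cost of a stem stops scaling with the recursion depth of the plant: the deepest, most numerous stems become cheap sprite draws.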

But then having too many bitmaps became an issue, so I’ve implemented the fastest texturing routines I can muster, using PVRTC compressed images (streamlined image formats best suited to the iPhone), and also mipmapping them, which reduces the work the graphics chip has to do to rescale textures at different distances.
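To put rough numbers on that trade-off: PVRTC at 4 bits per texel is an eighth of the size of uncompressed 32-bit RGBA, and a full mipmap chain only adds about a third on top of the base level. A back-of-the-envelope sketch (my own arithmetic, assuming power-of-two textures and stopping the chain at 8×8, PVRTC’s smallest practical level):

```c
#include <assert.h>
#include <stddef.h>

/* Rough memory comparison: uncompressed RGBA8888 vs PVRTC 4bpp,
 * with and without a mipmap chain. Assumes power-of-two textures
 * and stops the chain at 8x8 to sidestep PVRTC's minimum-block
 * edge cases. Illustrative, not a production allocator. */

static size_t rgba_bytes(size_t w, size_t h)   { return w * h * 4; } /* 32 bits/texel */
static size_t pvrtc4_bytes(size_t w, size_t h) { return w * h / 2; } /*  4 bits/texel */

/* Sum the sizes of all mip levels from (w, h) down to 8x8.
 * The geometric series 1 + 1/4 + 1/16 + ... converges to 4/3,
 * so mipmapping costs roughly +33% over the base level. */
static size_t with_mipmaps(size_t w, size_t h,
                           size_t (*level_bytes)(size_t, size_t)) {
    size_t total = 0;
    for (;;) {
        total += level_bytes(w, h);
        if (w <= 8 || h <= 8) break;
        w /= 2;
        h /= 2;
    }
    return total;
}
```

For a 512×512 texture this works out to 1,048,576 bytes uncompressed versus 131,072 bytes in PVRTC 4bpp; even with the whole mip chain, the compressed version stays well under a fifth of the single uncompressed base level.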

I’ve also been using glDrawElements calls to draw all the quads with their bitmaps. The main body of the ‘snake’ and the first ‘shoots’ that come out are mathematically generated polygon strips drawn with glDrawArrays. Other little coding hacks I’ve been using, which differ from my original Processing/Java implementation, include disabling depth testing altogether and just drawing my layers in order from back to front. I’ve also enabled backface culling, disabled all GL lighting, and a few other things.
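The point of the glDrawElements route is batching: one shared index buffer lets a whole pile of bitmap quads go to the GPU in a single draw call instead of one call per sprite. A hedged sketch of the index-buffer construction (two triangles per quad, four vertices each; the layout and names are my assumptions, not the actual Zio code):

```c
#include <assert.h>

/* Illustrative sketch: build a triangle index buffer so a batch of
 * textured quads can be drawn with one
 * glDrawElements(GL_TRIANGLES, n, GL_UNSIGNED_SHORT, indices) call.
 * Each quad contributes 4 vertices, split into triangles
 * (0,1,2) and (2,1,3). Names and winding are assumptions. */

static int build_quad_indices(unsigned short *indices, int quad_count) {
    int n = 0;
    for (int q = 0; q < quad_count; q++) {
        unsigned short base = (unsigned short)(q * 4);
        indices[n++] = base + 0;
        indices[n++] = base + 1;
        indices[n++] = base + 2;
        indices[n++] = base + 2;
        indices[n++] = base + 1;
        indices[n++] = base + 3;
    }
    return n;  /* 6 indices written per quad */
}
```

The back-to-front layer ordering mentioned above is what makes dropping the depth test safe: the painter’s algorithm handles occlusion for free, and the GPU skips a per-fragment depth read/write.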

The demo runs smoothly at 24 fps. It can run a lot faster, but I need headroom to implement the rest of the visuals still to be added: next will be the shading/shadow algorithms (more bitmaps), the sprites/particles flying around through a Perlin noise field, and hopefully a duplicate layer of the growing vines in the background, as is common in all my videos to date using this system.

So on we go; I feel inspired again after a tough few weeks. I want to create something enigmatic, sublime. As my mum said when I showed it to her, it’s peaceful to watch, like an aquarium… which is a very good comparison.

But it also has to work, and run smoothly on the technology. This has been the challenge: not a creative one, but a technical one.

With Zio I hope to create the first of a series of generative animation apps for the iPhone, with Metamorphosis following soon after, culminating with a pyrotechnic audio reactive app.


16 Responses to “iPhone Development #3 – ‘Zio’”

  1. Glenn, which version of the iPhone are you developing on? I am curious if the new OpenGL ES 2.0 and faster CPU on the 3Gs would minimize the amount of hacks/optimization you are being forced to do in your app.

    Super excited to see you release on the app store, and thanks for the posts demonstrating your commitment to get through an arduous learning curve to realize your vision, very impressive.

  2. Thanks Tom
I’m not getting worried about the 3GS yet, as there are about 50 million iPhone/iPod Touch users with OpenGL ES 1.1 to cater for. And I just read that only 1% of Touch users have upgraded to OS 3.0!
    But apparently everything will run faster on the 3GS anyway, even ES 1.1 apps, so at least that’s something. If things take off with this I’ll start harnessing OpenGL ES 2.0 for sure to get the max out of my graphic concepts.

  3. This is awesome. I am trying to imagine how you could add more user & multiuser interaction. I’m picturing an experience where two people can customize the “breeding” of their Zio. Perhaps the user can have some agency over their Zio color, add their own background texture, and perhaps choose a few “genes”.

    Now, imagine two users approach each other with their Zios growing. Perhaps they can cross-pollinate. Some of the Zio genes are transferred over Bluetooth to nearby Zios, affecting their genetic makeup.

    I know it’s a goofy idea, but I think it’d be interesting to implement.

  4. Pito said

    Very cool, your mom couldn’t have described it better. I’m very curious about the exploration of this area on the iPhone. Thumbs up!

  5. Michael, my problem is not coming up with ideas, it’s to stop coming up with ideas! There’s only one of me and there are so many possibilities (and a lot of work involved). But I like your thoughts. I’m hoping Zio will evolve organically as a piece of software art once I have the initial ‘seed’ program developed.

  6. madhouse6 said

    very cool/ / very beautiful. i’d get this one.

  7. KGT said

    Very, very cool. The music? Will it be the same for the app?

  8. Dijanara said

    I’m really looking forward to this app. Great art, fascinating effects. Thank you!

    Especially looking forward to it because of the big gap in music visualisation in the App Store. (If anyone has other information, please tell me ;-) )
    Will it be able to interact with the iPod function on the phone?

  9. Regarding music, it will come with a pre-made looping ambient track I’ve already composed from my back catalog (I’ve done a lot of electronic music compositions for film & TV). However this can be disabled, and you can play whatever music you wish from your iPod/stored music.

    It won’t react to audio specifically at this stage, however I’ve spent a lot of time refining my animation algorithm so that it ‘looks’ like it is always reacting to music of all kinds, at a cerebral, internal level. You can adjust the speed to fit the tempo of music as well.

  10. Glenn, if you were starting from scratch, would you still start in Processing to do your first “sketches” and then port to iPhone after you finalized your concept, or would you start directly in Objective C and XCode?

  11. Dijanara said

    First: I’m really a noob.
    Just a stupid idea: perhaps the algorithms from “Spawn Illuminati” can help you?
    It reacts to the microphone.
    Hope there is a way to connect with these guys =) (if it helps)

  12. Tom – I would probably still use Processing to prototype code quickly and freely – as Xcoding is a pain.

  13. Reacting to the microphone won’t be too hard to figure out; I just need to get the main program built before adding features.

  14. musicsite said

    I’m far too sick with you
    I’m in no hurry to get home
    The sky sheds stars like tears

  15. [...] is a teaser trailer for Glenn Marshall’s iPhone music visualization [...]
