A new academic year and a new module – the last one before the final master's piece, which makes it all the more important to choose wisely. So far I have been thinking a lot about consolidating the installation experience I gained from the last module (self-negotiated unit). I have lots of ideas – for example, large-scale projection screens suspended above the viewer to create an illusion of cathedral-like space using geometric form. I also want to introduce interactivity, and to return to videography (which I have not visited since the first module – research into practice) in order to decide whether the final project will use videography, computer-generated imagery, or a combination of both. Finally, I want to evaluate the effectiveness of Quartz Composer against Processing as a compositional/performance tool.
It’s been an intense decision-making process that has come to a head over the last few days, as I have discovered some really impressive user-interaction patches that run in Quartz Composer and lend themselves straight away to the interactive installation space. For example, the Rutt-Etra patch available from v002, which applies the video synthesis logic developed by Rutt and Etra back in the 1970s. Here’s a grab of myself simply looking into the webcam, the resulting image mapped to 3D lines.
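The v002 patch does its work inside Quartz Composer, but the core Rutt-Etra idea – sample the image as a set of horizontal scan lines and displace each point by its brightness – can be sketched independently. This is a minimal illustration, not the v002 implementation; the function name, the synthetic test frame, and the 0–1 brightness range are my own assumptions:

```python
import numpy as np

def rutt_etra_lines(frame, n_lines=64, gain=20.0):
    """Approximate the Rutt-Etra idea: sample a grayscale frame as a set of
    horizontal scan lines and displace each point vertically by brightness.
    Returns one (x, y) polyline per scan line, ready to hand to a renderer."""
    h, w = frame.shape
    rows = np.linspace(0, h - 1, n_lines).astype(int)
    lines = []
    for r in rows:
        x = np.arange(w)
        # brighter pixels push the line upward, as in the original synthesiser
        y = r - frame[r] * gain
        lines.append(np.stack([x, y], axis=1))
    return lines

# synthetic stand-in for a webcam frame: a bright diagonal on black, 0..1
frame = np.zeros((240, 320))
np.fill_diagonal(frame[:240, :240], 1.0)
lines = rutt_etra_lines(frame, n_lines=48)
```

A live version would feed webcam frames in and draw each polyline in 3D, with the displacement axis tilted toward the camera to get the characteristic relief effect.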
There’s another Quartz Composer patch, ‘Pulse Multitouch’, which doesn’t look like much but is beautiful in its simplicity – a multi-touch interface that ‘shines a light through a fog’ corresponding to touch position. It doesn’t sound particularly inspiring, but it’s a delight to play with, as it invites you to place all five digits of one hand upon a trackpad and make pretty patterns.
However, despite the attraction of the interaction-rich Quartz Composer patches mentioned above, I have decided to turn the other way. This new module is the principal taught moving-image element of the course, so I thought I would take the videography development as a priority and work some of the other elements into private study and possibly a separate installation event late this year or early next. I can, of course, return to them for the final project in any case.
I am genuinely interested in videography, and although I am not likely to make a film as most people think of one, I do enjoy thinking up and setting up shots – generally with some ulterior purpose in mind, such as realtime manipulation in a platform like Quartz Composer.
My current thinking, which I’ve also used to develop the learning contract that goes with this module, is to look at time-based phenomena such as ‘slit scan’ which I intend to manipulate in realtime using MIDI/audio waveform data.
The following two screenshots are taken from a (not very well-shot) video of a flower waving around in a strong breeze. The ‘slices’ are actually segments of frames from different time positions in the video, so that the current frame is a mix of present and past. In the prototype composition used, the segment sizes and positions are also linked to audio input.
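The frame-mixing described above can be sketched outside Quartz Composer with a ring buffer of recent frames, each output frame composed of horizontal slices pulled from different time positions in that buffer. A minimal sketch, assuming grayscale frames and a fixed offset per slice standing in for the audio/MIDI data that would drive the slices live (the class and parameter names are mine, not part of any existing plugin):

```python
import numpy as np
from collections import deque

class SlitScanMixer:
    """Keep a ring buffer of recent frames and compose each output frame
    from horizontal slices taken at different time positions in the buffer,
    so the result mixes present and past."""
    def __init__(self, depth=30, n_slices=10):
        self.buffer = deque(maxlen=depth)
        self.n_slices = n_slices

    def push(self, frame):
        self.buffer.append(frame.copy())

    def compose(self, offsets=None):
        """offsets: one time offset per slice (0 = newest frame); by default
        a linear ramp from present to oldest, top to bottom."""
        h = self.buffer[-1].shape[0]
        if offsets is None:
            offsets = np.linspace(0, len(self.buffer) - 1,
                                  self.n_slices).astype(int)
        out = np.empty_like(self.buffer[-1])
        edges = np.linspace(0, h, self.n_slices + 1).astype(int)
        for (top, bottom), off in zip(zip(edges[:-1], edges[1:]), offsets):
            off = min(off, len(self.buffer) - 1)  # clamp until buffer fills
            out[top:bottom] = self.buffer[-1 - off][top:bottom]
        return out

# feed 30 synthetic frames, each filled with its own frame index,
# so each slice of the output reveals which moment it was taken from
mixer = SlitScanMixer(depth=30, n_slices=6)
for t in range(30):
    mixer.push(np.full((60, 80), t, dtype=np.uint8))
result = mixer.compose()
```

In a live composition the `offsets` array is where an audio waveform or MIDI controller would plug in, warping how far back in time each band of the image reaches.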
In fact, the wild movement of the flower and the slightly over-dramatic use of zoom – both factors that would otherwise limit the usefulness of the clip – in this case provide exactly the kind of dynamic change that makes the slit-scan-type effect work. At present I have only a beginner’s knowledge of Quartz Composer, but I’m going to have to dig quite deep to recreate this kind of effect myself without a plugin, so that I can use the resulting image segments exactly as I please rather than being limited by someone else’s plugin.