I finally edited the video footage I shot of this past year’s Vivarium installation at the Southern California Institute of Architecture. The Vivarium is an installation by Matter Management exploring biology, technology and mythology in a complex architectural and multi-media environment. I developed the sound design for the project, which involved inventing a system to monitor the biology in the black pyramid and to use that data to both generate and modify audio signals. The sound included audio amplified from the biology itself, sounds I had previously recorded, and sounds generated by software using the data captured. I developed the data system using Max/MSP.
We went over to SCI-Arc last week to check out the current project up in their gallery space, Lenticularis by Hitoshi Abe. The project is a large-scale mock-up of a proposal for the Japanese American Cultural and Community Center plaza in Little Tokyo, originally designed by Isamu Noguchi.
Based on a particular type of cloud formation that sometimes appears over mountains (I have seen some great examples out in Anza-Borrego State Park), the quasi-functional sculptural object will span the plaza, providing some shade, offering views of a curious object from the street, and reflecting both plaza and sky in the middle. Checking out the 1:7 scale version at SCI-Arc, we couldn't help but see strong similarities to the work of sculptor Anish Kapoor, and wonder how it will mitigate the summer heat of the plaza or shelter it from winter rains (part of the brochure from SCI-Arc describes it as a 'roof' which responds to the plaza being "too exposed to the climate of Los Angeles"). I am pretty curious how it would come off at full scale, crouching over the plaza, and a bit skeptical of it all as anything but a beautiful sculptural object. This unveiling also served as a presentation to the clients, so it remains to be seen whether they are sold on it in this form – if so, I might be able to check it out there myself in a little over a year's time.
This Friday, April 9th, at 7pm is the Vivarium reception and talk. Come check out the beast in person. It's at SCI-Arc's gallery in downtown Los Angeles.
Juan Azulay of Matter Management will discuss the project with Eric Owen Moss, director of SCI-Arc. That should definitely prove to be interesting.
Also got the live feed running again, so check in on that while it's still up! The Live Broadcast link is over in the right-hand menu bar >>>>
See posts below for more information than you probably need about the whole thing.
Oh, and to whet your appetite, here is a recent interview with Juan from ReVista magazine. It's part of their larger story covering the Argentinean New Wave.
I set up a live broadcast from inside the Vivarium, with sound coming straight out of the system and low-res video of the biology scaffolding and interior structure. So check in on it; at times it will be live, other times it will be playing video and sound recorded from another day. Use the blue Live Broadcast link on the right-hand panel, or just click here: Vivarium Live Broadcast.
For a long time now, I have had a habit, or method, of laying out my audio/installation/performance projects by drawing simple, yet detailed, diagrams of all the parts.
These diagrams would help me to organize the signal flow, parts list, and layout for the projects. I would actually use them to figure out how many cables to buy, with what connectors, and so on. Often I would start with a looser version that lays out the conceptual parts of the work and how they are related, then refine it a few times until I am ready to draw every little cord, element, and plug.
For the performances of Public/Private and Local Music I scanned and cleaned up my diagrams and included them in booklets I made for each show. Here is one from Public/Private:
This little diagramming habit got me curious about other sound diagrams, and I dug up some interesting ones out there, including this gem from Brian Eno, showing how his analogue infinite tape loop system for Discreet Music worked.
There are also tons of people around the internet either posting their own sound rigs or diagramming bands' setups, so you can finally find out what kind of hardware they are using, and in what configuration, to get that specific sound.
Here is a simple diagram of a guitar rig (found here):
Here is an example of a pedal-board layout, used as a guide for Ronnie Cramer here to build a flight-case-mounted effects board.
While these diagrams are interesting, they are merely graphic representations of the arrangement and connections of the tools of some musicians. They borrow the logic of the circuit diagram, long used to draw out and conceptually test circuits prior to actually constructing them, but keep none of the symbols. It is actually in the symbols that the circuit diagram gets really useful – the drawing represents the functions of physical objects so precisely that you can troubleshoot your circuit from the drawing alone. I am interested in these properties of the diagram, and in the possibilities of using the diagram to structure sound in a more direct way.
The Voice of Saturn synthesizer (used in video below) schematic
Several months ago I got into a conversation with Juan Azulay of Matter Management about Moog synthesizer wiring. I think he had just posted looking for someone who knew how to wire a Moog (I do not), and I responded with some video of a (much simpler) kit synthesizer that I had built.
The legendary Moog
My synth and delay setup
This short exchange led to my joining his Vivarium team as sound designer. For this project, I was vaguely tasked with creating a hybrid bio-electronic synthesizer which would take sensor input from a collection of living organisms and their support systems (light, heat, water), merge it with a set of software-based systems, and output sound responsive to changes in the living organisms, the support systems, and the software.
The sound system was to work in parallel with a video system of even greater complexity, created by a media team headed up by Doug Wiganowske. It takes input from cameras, feeds that to a series of virtual organisms (built as an evolving software construct by Nicholas Pisca), and merges all of that with video shot during the whole process of making the Vivarium. All of this is finally output to a set of monitors in the media field of the final installation.
I began the sound design process by diagramming inputs, processes, and relationships that could be set up within this system, based on an assumed list of organisms and support systems. At the same time, I began searching for sensors that could capture the data we wanted and translate it to MIDI, so I could use it to work with the audio and data signals within the software. At this point I was worried I would have to build these sensors and processors by hand, and was looking to side-step that long process. I also started researching which software would be best for the setup.
The media and sound teams then collaboratively worked out a diagram of all the media for the installation, as a framework from which to develop our systems.
I found a couple of patch-based software packages that would be appropriate for the project, and began working with one of them, Audio Mulch, to develop test patches.
From the Audio Mulch website:
AudioMulch is an interactive musician’s environment for PC and Mac. It is used for live electronic music performance, composition and sound design.
AudioMulch allows you to make music by patching together a range of sound producing and processing modules.
I also found a source for the sensors I needed, and ordered a few so that I could test my input devices with my software patches. Here is the result of this first test (using some of my sounds and a short sample from the band Double).
In Audio Mulch, patches are created by dragging objects onto a "patcher" area and connecting them with patch cords. The objects themselves are chosen from a list of various types of audio handlers, generators, and processors. Once a patch has been assembled, in flow-chart fashion where you can actually follow the path the signal takes through the patch cords, adjustments can be made to each element in the editor panel beside the patcher. The power of this program for me was that any parameter of any object could be assigned a MIDI input, enabling the sensors to control almost any element of the sound. Also, the complexity and fidelity of the available objects was quite impressive.
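That MIDI-to-parameter idea is simple to sketch in code. Here is a minimal Python illustration of scaling a 0–127 controller value into an arbitrary parameter range, which is essentially what happens when a sensor's MIDI output is assigned to, say, a filter cutoff. The function name, ranges, and curve option are my own, not anything from Audio Mulch:

```python
def cc_to_param(cc_value: int, lo: float, hi: float, curve: float = 1.0) -> float:
    """Scale a 0-127 MIDI CC value into a parameter range.

    curve=1.0 is a linear map; curve > 1.0 gives finer control at the
    low end of the range (useful for frequency-like parameters).
    """
    cc_value = max(0, min(127, cc_value))        # clamp to the MIDI CC range
    frac = (cc_value / 127.0) ** curve           # normalized 0.0-1.0 position
    return lo + frac * (hi - lo)

# e.g. a sensor's CC stream driving a filter cutoff between 100 Hz and 5 kHz:
cutoff = cc_to_param(64, 100.0, 5000.0)
```

Every knob in the patch can then follow a sensor simply by routing its CC number through a mapping like this.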
Here, the diagram has become the instrument and sound generator itself, and as I constructed patch diagrams, I was building the software synthesizer that would generate sound from my array of sensors.
After working within this system for many weeks, Juan and the team suggested I look into using MAX/MSP to build my patches. MAX/MSP is also a patch-based, flow-chart-like software tool, but in contrast to Audio Mulch's small set of fixed audio objects, MAX/MSP is a visual programming environment of nearly limitless application. From the MAX website:
An interactive graphical programming environment for music, audio, and media. Max is the graphical programming environment that provides user interface, timing, communications, and MIDI support. MSP adds on real-time audio synthesis and DSP (digital signal processing), and Jitter extends Max with video and matrix data processing.
While this would open up the sound system to many new possibilities, it would require that I learn a whole programming syntax – quite a bit more complex than just using new software with a simple user interface. I set about going through tutorials, taking apart demonstration patches, and building simple sound elements to test what I could and couldn't learn to do within the time frame of the project.
Unlike using Audio Mulch, configuring the sensors to work with MAX/MSP was a challenge at first since it required unraveling the syntax of the sensor manufacturer’s proprietary MAX objects. Once I had figured out all the tricks to get MAX and the sensors to talk, I made a simple light driven MIDI piano patch. You can see in this short video how casting shadows on the sensor will affect the simple MIDI piano sounds being generated randomly through the software.
With the sensors now talking to the software, I compiled an array of individual MAX patches, one for each type of sound or effect I wanted to include in the final sound system. Here I was limited a bit by how new MAX still is to me, and I will continue to refine and add to these patches throughout the duration of the installation. The complexity of the MAX system, compared to my previous Audio Mulch system, is more additive – building many simple elements into one large patch rather than building each element to create more complex sounds.
The modules in the above patch are color-coded by type, and each is separated into a box for clarity. Below them, all of the individual audio channels run into a mixer made up of individual faders and volume displays, then are mixed down to the two speaker channels. In the final patch I added a filter at the end of each channel to guard against damaging low frequencies. I also had some help here from Michael Feldman in getting the patches to do what I wanted.
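That guard filter is conceptually just a high-pass stage on each channel before the mix. As a rough illustration of what it does, here is a one-pole high-pass filter in Python; the real patch uses MAX/MSP filter objects, and the 40 Hz cutoff here is my own illustrative choice:

```python
import math

def one_pole_highpass(samples, cutoff_hz=40.0, sample_rate=44100):
    """Attenuate sub-bass content below cutoff_hz (simple RC-style high-pass)."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    a = rc / (rc + dt)                     # filter coefficient, just under 1.0
    out, prev_x, prev_y = [], 0.0, 0.0
    for x in samples:
        y = a * (prev_y + x - prev_x)      # y[n] = a * (y[n-1] + x[n] - x[n-1])
        out.append(y)
        prev_x, prev_y = x, y
    return out

# A constant (DC / near-DC) signal decays away instead of reaching the speakers.
```

With a stage like this on every channel, a sensor mapping gone wrong can't push sustained sub-bass energy into the amplification.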
The patch at this point consisted of the following modules:
Stereo file player with sensor-controlled pitch (on each stereo channel) and speed
Stereo file player with sensor-controlled phasing and delay effects
Microphone input 1 with sensor-controlled filter, phasing, and delay effects
Microphone input 2 with sensor-controlled filter, phasing, and delay effects
A chorus of cricket sounds, each with sensor-controlled speed (to replicate the actual crickets that will be in the Vivarium)
A minor-chord synthesizer whose root note is set by sensor data, with a sensor-controlled octave switch and filters
A frequency modulation synthesizer driven by sensor data
And two simple tone generators driven by sensor data
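As an example of how one of these modules turns data into sound, the minor-chord module boils down to choosing a root note from a sensor value and stacking a minor triad on it. A hypothetical Python sketch of that logic (the sensor and note ranges are mine; in MAX this is done with number and frequency objects):

```python
def sensor_to_root(reading, sensor_max=1023, low_note=36, high_note=60):
    """Pick a MIDI root note from a raw sensor reading (ranges assumed)."""
    frac = max(0, min(sensor_max, reading)) / sensor_max
    return low_note + round(frac * (high_note - low_note))

def minor_triad(root, octave_shift=0):
    """Root, minor third, and perfect fifth as MIDI note numbers."""
    base = root + 12 * octave_shift
    return [base, base + 3, base + 7]

def midi_to_hz(note):
    """Equal-temperament conversion, A4 (MIDI 69) = 440 Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

# Chord frequencies for a mid-range sensor reading, shifted up one octave:
freqs = [midi_to_hz(n) for n in minor_triad(sensor_to_root(512), octave_shift=1)]
```

The octave switch and filters in the actual module then operate on the resulting oscillator frequencies.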
I ran a studio test, using the sensors I had available and the ambient conditions of my loft to control the patch. In the real installation, there is an array of eight sensors placed among the biology inside the Vivarium to control the patch.
Last week the final sensors arrived, I made the necessary tweaks to the patch, and spent several days installing the whole system while the Vivarium was being completed around me.
The Vivarium officially opened on March 26th with a small SCI-Arc reception, but over the next two weeks we will continue to refine the systems on site, getting everything optimized for a public reception and talk (between Matter Management’s Juan Azulay and SCI-Arc’s director Eric Owen Moss) on April 9th. During this time, I will also be working on getting the whole sound system to broadcast live over the web.
Matter Management’s Vivarium Installation is currently on display at SCI-Arc‘s Gallery.
There is a great critique of the recent work of local Los Angeles architectural institution SCI-Arc (the Southern California Institute of Architecture, my alma mater) over at Drowning in Culture. It's a critique I didn't bother to write, though I had considered it Sunday night, as I felt that my heaping more negativity on the place wasn't going to do anyone any good. Luckily, over at DIC they are a bit more thoughtful and well-spoken than I am, and they make some excellent points about the direction of work at SCI-Arc, as well as contextualizing the situation the school has gotten itself into. They don't really pull any punches or shy away from placing blame, either.
Over the past few years the work has been exceptionally questionable in both its mission and educational scope, leaving behind any kind of critical discourse in favor of the droll world of affect and computational representation…
When architecture reaches this level of mindless digital twiddling it is no longer playing any productive role in the development of modern society and is leaving itself to be exploited purely as a slave to capital…
There isn't much hope for the school as long as its director, Eric Owen Moss, is still at the helm. Moss has made a mockery of the long-touted revolutionary philosophy once present at the school by first alienating its more competent instructors to the point where all but the most dedicated faculty left for other institutions, and then by appointing a succession of supplicants who appear unwilling to make provocative decisions with regard to its curriculum.
While I was again disappointed by the graduate-level thesis presentations, I also attended the undergraduate thesis reviews earlier this year, and at the time was actually impressed that those students were proposing viable buildings and delivering plans, sections, and models genuinely descriptive of the architectural features of the project at hand. The work was well crafted to boot. The trend toward digital/computational form and embellishment was still there, but it was starting to be applied toward the creation of spaces with a semblance of use, structure, and service reminiscent of the requirements of building. I am not sure what is generating this schism between the two programs (their end products used to be indistinguishable from each other), but perhaps those mentoring the graduate students should take note of their peers' ability to develop more holistic proposals, moving beyond the merely digital and back toward the physical realities of building.