Over winter break I delved into several peripheral parts of my tap shoe project by acting as a consultant on a separate, but similar, project with a friend and colleague of mine, Treva Wurmfeld. Treva is enrolled in the Media Arts graduate program at Hunter and heads off to Holland tomorrow to study the MAX graphical programming language. She wanted a prototype of a device to take with her, one that will eventually lead to a drum installation project she had been wanting to do but didn't feel she had enough knowledge of software and hardware to implement without help. After talking with her, and on Prof. Teller's recommendation, it was clear that if I worked with her to put together a gadget, I would be learning many of the things I need to know in order to complete my own project. So, we agreed to meet twice a week to try to put it together.
Treva's project is called "Beat Down, Hand Around." It is a video art project, of sorts. She wanted two drums to be played. However, while the drums were being played, she wanted a video of a dancer to be controlled and updated in real time by the location of each drum hit. Her intention was to show the relationship between the drum and the dancer, and to explore the idea that the drum is both a music generator and a physical object we interact with. She wanted to map the physicality of the drum onto the dancing of a dancer on film.
We decided to use MAX to control the video files.
But, how to determine the location of the drum hit was a bit of a problem.
We thought about using capacitive touch sensors and dividing up the relevant zones of the drum with insulation. But this would have made the drum sound horrible.
We considered trying to incorporate a pre-made touch screen or touch pad, but there would be no way to attach it to the drum and have both objects function: either the drum wouldn't sound like a drum, or the device wouldn't work.
There are pre-made MIDI drums which will report the location of a hit, but they were too expensive and not authentic enough for the art piece.
Finally, after consulting a few friends and professors at Hunter, we decided to try to use the sound of the drum hit itself to determine the location of the strike.
Here is an outdated write-up of what we planned to do:
"Beat Down will be constructed with two drums, a microcontroller, eight microphones and a laptop. Each drums will be clearly divided into section or zones which the user can identify. Four microphones will be fastened around the inside rim of the drum. There will be one at the top of the drum, at noon, one to the right, at three o’clock, one to the bottom at six o’clock, and one to the left at nine o’clock. All of these microphones will output sound data into the microcontrollers analog inputs. The chip contained within the MidiTron system will determine the location of a drum beat by monitoring the time it takes for the sound of the beat to reach each of the microphones. The difference in time from when the first microphone on the y-axis received sound as compared to when the second microphone on the y-axis received sound will indicate the y-coordinate of the strike. An identical process with the microphones on the x-axis will indicate this coordinate. The mathematical solution is modeled after the process of triangulation used in the Global Navigation System.
Once the location of the beat has been identified, it will be transmitted via a MIDI signal to the laptop. An application built in the MAX/MSP graphical programming language will then use the signals from each of the two drums to control the appropriate video clip. Each section or zone of the drum will contain a range of coordinates. A particular video clip will be triggered based on the combination of the two zones, one from each of the two drums. If only one drum is being played, the second drum's zone will be given a default value. The video will then be projected onto a larger area. The effect will be that, in real time, the video of the dancer will depend on where and when these two drums are played."
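For anyone curious about the arithmetic behind that plan, here is a minimal sketch of the time-difference idea in C. It assumes the simplest possible model: the hit lands on (or near) the axis between two opposing microphones, so the far mic's path is longer than the near mic's path by exactly twice the coordinate. The function name and the example arrival times are mine, for illustration only.

#include <stdio.h>

#define SPEED_OF_SOUND 340.29 /* meters per second, in air */

/* Estimate one coordinate of the strike from the arrival-time
   difference between the two opposing microphones on that axis.
   For a hit at coordinate u on the axis, with mics at +R and -R,
   the distances are (R - u) and (R + u), so the difference in
   arrival times is 2u / c, and u = c * dt / 2.  Off-axis hits make
   this an approximation, which is fine for coarse zone detection. */
double coord_from_tdoa(double t_positive_mic, double t_negative_mic)
{
    /* t_positive_mic: arrival time (seconds) at the mic on the
       positive end of the axis, e.g. three o'clock for x, noon for y */
    return SPEED_OF_SOUND * (t_negative_mic - t_positive_mic) / 2.0;
}

int main(void)
{
    /* Hypothetical arrival times: the nine o'clock mic hears the hit
       200 us before the three o'clock mic (so x is negative, toward
       nine), and the noon mic hears it 100 us before the six o'clock
       mic (so y is positive, toward noon). */
    double x = coord_from_tdoa(0.000200, 0.0); /* three vs nine  */
    double y = coord_from_tdoa(0.0, 0.000100); /* noon vs six    */
    printf("strike at roughly (%.3f m, %.3f m)\n", x, y);
    return 0;
}

With those numbers the hit comes out about 3.4 cm toward nine o'clock and 1.7 cm toward noon, which gives a feel for how tiny the time differences across a one-foot drum really are.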
However, we began to run into problems.
First of all, what kind of interface should be used between the microphones and the laptop? Anything we used would need a minimum of 8 audio inputs, and each line would need to be labeled and identifiable from the others (so a standard mixer was out).
We looked at audio cards, which were too expensive. (Treva's budget for everything was $500.)
We looked at professional audio equipment which could translate audio-in lines into MIDI and identify each channel, but this was WAY too expensive.
Finally, we thought maybe we should just try and build the darn thing ourselves, but realized that there would be no time for that before she left.
Finally, we found the MidiTron!
It looked perfect.
-up to 10 analog inputs
-up to 20 digital inputs
-MIDI translation built in
-CHEAP!
-fast enough sample rate! (or so we thought...)
The speed of sound is about 340.29 meters per second.
In order to distinguish distance within an error margin of 1 centimeter, we would need one reading for every centimeter the sound travels: about 34,000 samples per second, which, sent as one byte per sample over a serial line (ten bits per byte, counting the start and stop bits), works out to 340.29 K baud.
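Here is that back-of-the-envelope calculation as a tiny C program. The ten-bits-per-byte serial framing is my assumption about how a one-byte sample would travel over the wire; it's what makes the kbaud figure land on the same digits as the speed of sound.

#include <stdio.h>

int main(void)
{
    double c_cm_per_s = 340.29 * 100.0; /* speed of sound in cm/s */
    double resolution_cm = 1.0;         /* desired error margin   */

    /* One sample per centimeter of sound travel: */
    double samples_per_s = c_cm_per_s / resolution_cm; /* ~34,029 Hz */

    /* One byte per sample at 10 bits per byte
       (start bit + 8 data bits + stop bit):    */
    double baud = samples_per_s * 10.0;  /* ~340,290 baud */

    printf("%.0f samples/s -> %.0f baud\n", samples_per_s, baud);
    return 0;
}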
But the MidiTron is built around a standard chip from Microchip, a line that supports clock rates of at least 4 MHz, usually more, so we should have been well in the clear for the sample rates we needed.
We forgot one very important fact.
MIDI can only *transmit* data through a cable at 31,250 baud. This, in turn, affects how much data can accumulate in the buffer before sampling must stop and wait. Ironically, though the chip itself could easily sample at the rate we needed, it effectively runs much slower, at the MIDI rate. (Oh, and by the way, that's with only 1 input! It gets worse with 8!)
Anyhow, this rate would give us an error margin of about a foot, which was totally impractical, since the drum itself is just over a foot long.
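Here is the same arithmetic for the MIDI side, again as a small C sketch. The three-bytes-per-message figure assumes a typical channel message like NoteOn; the ten-bit framing (start bit, eight data bits, stop bit) is the standard MIDI byte format.

#include <stdio.h>

int main(void)
{
    double baud = 31250.0;                 /* MIDI wire rate, bits/s     */
    double bytes_per_s = baud / 10.0;      /* 1 start + 8 data + 1 stop  */
    double msgs_per_s = bytes_per_s / 3.0; /* a channel message (e.g.
                                              NoteOn) is 3 bytes         */

    double c = 340.29;                     /* speed of sound, m/s        */
    double m_per_reading = c / msgs_per_s; /* distance sound travels
                                              between readings           */

    printf("%.0f messages/s -> %.2f m (about %.1f ft) per reading\n",
           msgs_per_s, m_per_reading, m_per_reading / 0.3048);
    return 0;
}

That comes out to roughly 1,042 messages per second, during which sound travels about 0.33 meters: just over a foot, exactly the error margin that sank the plan.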
So, what to do? We had the MidiTron already.
One professor at Hunter, Peter Kirn, recommended video tracking, which Treva may follow up on later. I don't know much about that process and felt overwhelmed by its complexity.
Finally, we decided to build a model of the drum using 20 normally-open momentary pushbutton switches as inputs to the MidiTron. That would at least let Treva leave with some kind of interactive device, which she could then use as a dummy transducer while building her MAX application.
We put the circuit together, and after some debugging, and just in time, the MAX application now receives the appropriate NoteOn and NoteOff MIDI messages on channels that correlate to each port's number on the MidiTron.
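For the record, here is roughly what those messages look like on the wire, sketched in C. I don't know exactly how the MidiTron packs its output, so the port-to-channel mapping and the fixed note number below are my own stand-ins, just to show the three-byte NoteOn/NoteOff format the MAX patch is reacting to.

#include <stdio.h>
#include <stdint.h>

/* Hypothetical encoding, for illustration: a pressed button on
   MidiTron port p becomes a NoteOn, and its release a NoteOff.
   A NoteOn is three bytes: (0x90 | channel), note number, velocity;
   a NoteOff is (0x80 | channel), note number, velocity. */
void button_event(int port, int pressed, uint8_t msg[3])
{
    uint8_t channel = (uint8_t)(port % 16); /* MIDI has only 16
                                               channels, so 20 ports
                                               would have to wrap     */
    msg[0] = (uint8_t)((pressed ? 0x90 : 0x80) | channel);
    msg[1] = 60;                            /* note number: arbitrary */
    msg[2] = pressed ? 127 : 0;             /* full velocity on press */
}

int main(void)
{
    uint8_t msg[3];
    button_event(4, 1, msg); /* press the button wired to port 4 */
    printf("NoteOn:  %02X %02X %02X\n", msg[0], msg[1], msg[2]);
    button_event(4, 0, msg); /* release it */
    printf("NoteOff: %02X %02X %02X\n", msg[0], msg[1], msg[2]);
    return 0;
}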
Treva now leaves for Holland and I get back to my assembly code.
I will never forget the sample limitations of MIDI, ever, ever again.
I'm also now familiar with the MAX IDE and its basic objects, which will come in handy later.