When combining different robo-mechanic instruments and letting them jam together, the most crucial point (aside from making awesome music) is to get the latencies right. The setup is mostly the same: we have a robotic music instrument (e.g. a snare drum with a solenoid actuator), some kind of electric power unit (relay, motor driver / FET driver and an Arduino) and a unit that generates the note information, e.g. a keyboard or a MIDI program like Ableton.
Each instrument has its own unique latency: the time between the “note on” signal (e.g. Ableton triggering a note in the sequencer) and the moment the tone finally leaves the physical instrument. These latencies can add up to several hundred milliseconds, a delay that a human ear will definitely recognize. The latency consists of the time the actuator needs to travel the physical path to the instrument, plus the electrical / digital delay inside the electronics and software. In fact it does not matter where exactly the delay comes from; we will just measure the whole beast and compensate for it.
Controlling the latency
Looking at our mechanic drumbot MR-808, we have eleven mechanic drum instruments that we have to equalize appropriately. The instruments are controlled via MIDI, and the drum instruments shall be controlled from one channel in Ableton so that they can be programmed like a standard drum machine. This means we have one MIDI instrument on one MIDI channel, and the different drum sounds lie on different notes.
The main idea of latency control is to shift all instruments together by one negative value, so that they are all played before the main clock. Then we go ahead and adjust every instrument with an individual positive delay value, so that finally all instruments are exactly in time.
a) In the graphic you can see how different instruments (in this case bass drum, snare and hi-hat of the MR-808) behave. In the first graphic the MIDI signal is in time with the main sync clock, and nothing is delayed. The snare will be 50 ms late, the bass drum has the biggest latency of 90 ms, and the hi-hat is quite tight, being audible only 10 ms after the MIDI note is triggered.
b) Next, we shift the whole track to the left by setting a negative value for the whole track. In Ableton there is this cute “channel delay” input box underneath each track (in pattern view). We choose the maximum latency value of our instruments (here: the bass drum with 90 ms). Hi-hat and snare are not in time yet, but the bass drum is.
c) In the last step we apply an individual delay of (max_latency) minus (instr_latency) to every instrument, except of course the one with the maximum latency. This pushes every instrument to play back exactly in time.
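The three steps above boil down to one subtraction per instrument. Here is a minimal Python sketch using the latency values from the graphic (the instrument names are just labels for illustration):

```python
# Measured latencies in ms between MIDI note-on and audible sound.
latencies = {"bass_drum": 90, "snare": 50, "hi_hat": 10}

# Step b): shift the whole track earlier by the largest latency.
track_delay = -max(latencies.values())  # goes into the channel delay box

# Step c): push each instrument back by an individual positive delay.
note_delays = {name: max(latencies.values()) - lat
               for name, lat in latencies.items()}

print(track_delay)   # -90
print(note_delays)   # {'bass_drum': 0, 'snare': 40, 'hi_hat': 80}
```

The instrument with the maximum latency naturally ends up with a delay of 0, so no special-casing is needed.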
Of course this technique can be applied to other DAWs or systems as well. You could also implement it on an FPGA or an Arduino, or even do the latency control with discrete circuits (monoflops)!
The measuring can be done quite easily using any DAW of your choice that can play back MIDI and record audio. Here we use Ableton Live again. As mentioned above, we do not care where the latency comes from – we just measure it and fix it. So all we have to do is measure the time between the first (MIDI) signal and the final audio signal.
To do so, we hook up a microphone to our DAW, set up a channel to record audio and play a MIDI signal (preferably a single programmed MIDI note that is exactly in time with the clock). After recording one beat, we zoom deep into the recorded material with the MIDI note next to it, so that we can measure the deviation of the audio signal from the digital trigger. This is our unique instrument latency. And then: <loop>Do this for every instrument</loop>
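If you would rather not zoom and eyeball eleven times, the onset can also be found programmatically. A minimal sketch, assuming the recording starts exactly at the MIDI trigger (so the latency is simply the position of the first loud sample); the threshold value is an assumption you would tune per microphone:

```python
def instrument_latency_ms(samples, sample_rate, threshold=0.1):
    """Delay between recording start (= MIDI trigger) and the
    first sample whose level exceeds the threshold, in ms."""
    for i, s in enumerate(samples):
        if abs(s) > threshold:
            return 1000.0 * i / sample_rate
    return None  # no onset found

# Toy recording at 1000 Hz: 90 samples of silence, then the hit.
recording = [0.0] * 90 + [0.8, 0.5, 0.2]
print(instrument_latency_ms(recording, 1000))  # 90.0
```

A real recording would of course come in at 44.1 kHz or more; only the sample rate argument changes.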
Note: recording audio introduces a measurement error of its own that the method above does not take into account. A more accurate way is to set up a system where the MIDI signal triggers both the mechanic instrument and an audio test signal, route both signals to a second computer, and record and compare the two audio streams there.
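Comparing the two streams then boils down to finding both onsets and taking the difference. A brute-force sketch under the same assumptions as above (both streams recorded at the same sample rate, onset = first sample over a threshold):

```python
def onset_index(samples, threshold=0.1):
    """Index of the first sample above the threshold, or None."""
    for i, s in enumerate(samples):
        if abs(s) > threshold:
            return i
    return None

def latency_between_ms(reference, instrument, sample_rate):
    """Latency of the instrument stream relative to the reference stream."""
    return 1000.0 * (onset_index(instrument) - onset_index(reference)) / sample_rate

# Reference click at sample 10, drum hit at sample 100, 1000 Hz:
ref = [0.0] * 10 + [1.0] + [0.0] * 200
drum = [0.0] * 100 + [0.9] + [0.0] * 110
print(latency_between_ms(ref, drum, 1000))  # 90.0
```

Because both streams pass through the same recording chain, the recording latency cancels out of the difference, which is exactly why this method is more accurate.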
I implemented the whole latency control in Max/MSP: demultiplexing the MIDI stream from Ableton into single notes, stripping out the note-on messages, delaying the single notes appropriately, and finally applying a fixed duration to the notes. It's a quick and dirty hacked-together Max patch that we might present once it's a little more tidied up.
I cleaned up the code, so here you can see a screenshot of the main MAX/MSP controlling unit.
Sciencefictionchildren: Max/MSP latency control. A single chain view to control the delay, lighting, etc. of each drum instrument.