MR-808 Interactive Technique
The MR-808 interactive is a robot installation featuring an oversized robotic drum machine that is programmed live by the audience. Visitors program the drum robot in a collaborative process and can listen to and watch the sound emerge. At the exhibition site, the audience stands in front of the robot and accesses the musical structures via interfaces.
There is a separate article describing the acoustic, electronic and mechanical construction of the robot itself. This article focuses on the interactive part.
- 1 Concept for the interactive drum robot
- 2 Content
- 3 Lessons learned & Outlook
Concept for the interactive drum robot
After we built the MR-808 drum robot in 2012, we used it as a musical instrument at performances for a while. It quickly became clear that people were thrilled to see and use it on their own, so we started to think about a user interface for the robot. The idea was to set up the installation in public and let everyone control it on their own. A social experiment in music, crowd-composing, collaborative robot drumming!
So this was our plan after a few nights of thinking:
- Multiple user inputs, beats composed together and played in near-realtime
- Single MIDI stream output (to the robot, which speaks MIDI)
- Some kind of physical interface needed
- “Step sequencer” as input structure
First we took care of the hardware. The general setup was settled quite early: a Raspberry Pi as server, some kind of mobile device as clients, a MIDI interface as output and a router for the Wi-Fi.
After some experimentation the server setup was finished: we set up a Raspberry Pi B, dismantled a Linksys WRT54GL router, and connected a simple M-Audio USB MIDI interface. (It was obvious that we needed MIDI out, because the robot is controlled via MIDI – every instrument like snare, bass drum or hi-hat is mapped to a MIDI note. When the note is sent, the instrument is triggered, e.g. the corresponding solenoid slaps onto the drum.)
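As an illustration of this instrument-to-note mapping, here is a minimal JavaScript sketch. The note numbers and names are assumptions for illustration, not the MR-808's actual mapping; only the 3-byte note-on layout is standard MIDI:

```javascript
// Hypothetical instrument-to-MIDI-note mapping; the real MR-808 mapping
// lives in the robot's firmware and may use different note numbers.
const INSTRUMENT_NOTES = { bassDrum: 36, snare: 38, hihat: 42 };

// Build the raw 3-byte MIDI note-on message for an instrument.
// Status byte 0x90 = note-on on channel 1; velocity ranges 0-127.
function noteOn(instrument, velocity = 100, channel = 0) {
  const note = INSTRUMENT_NOTES[instrument];
  if (note === undefined) throw new Error(`unknown instrument: ${instrument}`);
  return [0x90 | channel, note, velocity];
}
```

Sending such a message over the USB MIDI interface makes the corresponding solenoid fire once.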
As the whole installation would be assembled and disassembled many times for exhibitions and festivals, we tried to fit everything that belongs to the server into one case. We removed the housings of all devices (router, MIDI interface) ..
.. and we hard-wired everything to one power supply and put it into a case.
So that was it – a Raspberry Pi server running a MIDI sequencer, outputting MIDI over a USB MIDI card, and a Wi-Fi hotspot so the clients can connect. Time to take care of the tablets!
At first we thought we would provide a website, let the audience connect with their own phones, and have everything controlled by everyone. But that bears some problems! When testing the input with many devices (diverse phones & tablets) we realized that input from more than two or three devices quickly results in a cacophony of random beats. Some people tend to test public interactive art to the limit: “Let's press ALL the buttons – at once.”
So we dropped the idea of letting the public contribute with an unlimited number of their own input devices and decided to put only two interfaces (tablets) in front of the installation, which everyone could use.
As client interfaces we had several options, each with pros and cons:
- iPad (high quality, but expensive)
- Samsung Galaxy
- Google Nexus 7, 1st gen (cheap, but bad charging behaviour)
- Google Nexus 7, 2nd gen
Google Nexus 7 tablet as an interface
After testing all the tablets, we chose the Nexus 7 2nd gen because it's so cheap (just 100–150 €). There are several issues with installations that have a constantly available touchscreen interface and are exposed in public. (We also wrote a detailed article about how to set up a tablet in kiosk mode for installations.)
Firstly, one has to restrict the public user from accessing the settings, WLAN and other things that are not crucial but just confusing. Basically you want just one web page to be displayed, with no way to break out of it. On the other hand, there are times when you want to do maintenance: shutting down or booting, reconnecting to a lost Wi-Fi and so on. We decided to go the hard way and cover the “home” button of the tablet with the case it is installed in. So the audience physically can't push the home button at the bottom of the screen, which would bring you to the main menu.
Power and heat management
The second big issue is keeping the tablet on for a long time (several days with the screen constantly working). We did some tests where we measured the total power consumption while the tablet was connected to a power supply. We saw that the standard cheap 1 A / 5 V phone charger doesn't work in the long run. We also checked which part of the tablet uses the most power: it's by far the screen. Shutting down Wi-Fi, Bluetooth, GPS, etc. saves comparatively little. In the end we dimmed the screen to a low level and got a setup working where the tablet draws less power than the charger can supply.
A short note: the Nexus 7 first generation has worse charging behaviour: we didn't find any configuration in which the device – with a constantly powered screen – draws less power than the power supply can deliver. So when the screen is constantly on and the tablet is being charged, it will still lose charge over the hours and eventually run empty. The second-generation tablets seem to have better power management; there it works.
We also cared about heat management and did a temperature measurement to check where the case heats up the most. The picture below shows the Nexus 7 2nd gen after 5 h of use with the screen set to 10 % brightness, Wi-Fi on, a browser running, and the tablet in a case (no airflow). The warmest part is around 40 °C – we decided that we don't need ventilation or anything, but we liked the photo, so we put it here.
Software

The software for the tablets, the server and the sequencer was programmed by my beloved super-nerd Karsten. It runs on two devices: the server, which provides an HTML layout and runs a small SuperCollider sequencer, and the tablets, which render the website (wrapped in an app for convenience) and constantly communicate with the server.
There are generally two methods of programming a beat: either you record the beat live – this is what happens with a pad-triggered MIDI drum kit, for example. The other option is to program the beat in a time-independent manner. This is normally done with a “piano roll” in music programs like Ableton Live, but it can also be done step-sequencer style (see below). We chose the latter because it restricts the input to a fixed time grid – even if a user is not musically trained, it will still sound “OK” if he or she just presses “random” buttons (which certainly happens when the piece is exposed to the public).
The step sequencer is – as explained above – a way to program beats in a time-independent way. It doesn't react in real time to the pushing of a button. Instead you set steps on a (mostly) 16th-note grid. Selecting the bass drum and activating every 4th button (4 hits in a 16-step bar) results in the standard four-to-the-floor bass drum. This is how the original 808 drum machine worked in 1981 and it is still a popular way to create beats. For our installation it has the advantage that even a non-musically-trained audience can make “something”, because pushing random buttons will still result in some kind of beat tied to the 16th-note grid.
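As a rough sketch of that data structure (our actual sequencer lives in a SuperCollider patch, so the names below are made up for illustration):

```javascript
// One bar = 16 steps; each instrument holds an array of booleans.
// A "checked" step means the instrument fires on that 16th note.
const STEPS = 16;

function emptyPattern() {
  return {
    bassDrum: new Array(STEPS).fill(false),
    snare:    new Array(STEPS).fill(false),
  };
}

// Program the classic four-to-the-floor: bass drum on steps 0, 4, 8, 12.
const pattern = emptyPattern();
for (let step = 0; step < STEPS; step += 4) {
  pattern.bassDrum[step] = true;
}
```

Whatever a visitor toggles, the result is still snapped to this grid, which is why random button-mashing still sounds like a beat.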
The server consists of a node.js server which provides the website displayed by the clients. The musical structure is stored on the server in a SuperCollider patch. This is what the SuperCollider homepage says about itself:
> SuperCollider is an audio server, programming language, and IDE for sound synthesis and algorithmic composition.

SuperCollider stores an array of notes for the different instruments (BD, SN, Tom). The step sequencer crawls the array at a fixed tempo (the classic disco tempo of 128 BPM!) and outputs a MIDI note to the robot every time it sees a checked note.
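The crawl boils down to a timer walking the step arrays. A minimal JavaScript sketch of the idea (the real sequencer is a SuperCollider patch with a proper clock; `setInterval` drifts and is only good enough for illustration):

```javascript
// At 128 BPM one beat lasts 60/128 s; a 16th-note step is a quarter of that.
const BPM = 128;
const STEP_MS = (60 / BPM / 4) * 1000; // 117.1875 ms per step

// Walk the 16-step arrays forever; fire a callback for every checked step.
// In the installation the callback would send the MIDI note to the robot.
function startSequencer(pattern, onTrigger) {
  let step = 0;
  return setInterval(() => {
    for (const [instrument, steps] of Object.entries(pattern)) {
      if (steps[step]) onTrigger(instrument, step);
    }
    step = (step + 1) % 16;
  }, STEP_MS);
}
```

SuperCollider's own clocks are sample-accurate, which is one reason the musical timing stays on the server rather than in the browser.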
The client, on the other hand, has a WebSocket connection to the node.js server, which manipulates the SuperCollider array. You can check out the GitHub code here.
The graphical display on the Nexus 7 tablets is an HTML representation of the MR-808 installation. One can choose an instrument by clicking on the picture and program a rhythm by checking and un-checking the 16 buttons at the bottom.
On startup the app fetches the HTML website from the server. It opens a WebSocket connection to the server, sending the checked and unchecked notes and synchronizing the running light to the server's master clock every 16 steps.
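One can imagine the client messages as small JSON objects toggling a single step; the actual protocol is in the GitHub code, so the field names here are assumptions:

```javascript
// Hypothetical client-to-server message: one step toggled on the tablet.
// The server would apply it to the SuperCollider note array.
function applyToggle(pattern, msg) {
  // msg example: { instrument: "snare", step: 3, checked: true }
  pattern[msg.instrument][msg.step] = msg.checked;
  return pattern;
}

// On the wire the same object would travel over the WebSocket as JSON:
const wireFormat = JSON.stringify({ instrument: "snare", step: 3, checked: true });
```

Because every message only flips one cell of the shared array, two tablets can edit the same bar concurrently without stepping on each other's changes.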
When building the system we had plans for multi-user graphics – displaying different colours for different users, so you could identify which beat was programmed by you. But this was too confusing and, in the end, not necessary.
Lessons learned & Outlook
We certainly learned a lot in the six months of building this complex interactive setup. The biggest issue was taking the step from a barely working experiment to a stable prototype that behaves the same way in most situations. This goes for software (e.g. we didn't consider absurd user interactions) as well as for hardware (an SD card corrupted four times in a row after shutting the Raspberry Pi down by cutting the power? Check!).
- A next iteration of the system could involve streaming, where users program the robot via “the Internet” and get a live stream back.
- Much requested was also a “social media” module for sharing audio or even video of the material a person has created.
- And last but not least: software and tablets are cool – but nothing beats a hardware interface. Two small 3D-printed step sequencers standing in front of the installation would be at the top of our wish list.
If you wonder what that sounds like: below you can find more than an hour of the audience at the Krake Festival Berlin playing the interactive drum robot MR-808. All the sounds and patterns were generated by the audience – with sometimes beautiful, sometimes ugly results.
Moritz Simon Geist is a musician and robot scientist. With his collective SONIC ROBOT he is constantly building musical robots, pushing the boundaries of electronic music and mechanic sound creation.
You can follow him on twitter here
- Check out our other projects
- Check out our next shows
- The code on GitHub here
- MR-808 – the acoustic, electronic and mechanic construction