Building a prototype

GOALS

My first goal is to find a way for musicians and artists from different disciplines to collaborate in an exhibition space.

Lisson Gallery listening room by Devon Turnbull

The Lisson Gallery listening room by Devon Turnbull treats custom-made hi-fi audio equipment as sculpture.

A live performance by a musician or a DJ is more engaging. I’ve performed a live DJ set for the opening night of an exhibition at Bife Ventil.

Realistically, a live and dynamic performance is difficult to organize for the entire duration of an exhibition that lasts more than a couple of days, unless visitors perform the music themselves. Which brings me to my second goal: audience participation.

Reimagining the Ecology of Art in u10

I’ve been lucky to participate in the Reimagining the Ecology of Art exhibition, which encouraged visitors to bring their instruments and record pieces inspired by the exhibited objects. Visitors who can’t bring an instrument are given a custom-made five-in-one instrument to play with other visitors. Recorded pieces can be listened to on a tablet with headphones next to the object that inspired them.

My third goal is to encourage participation by making it easy and intuitive for visitors to create a sound that is pleasant and harmonious.

Play music with your touch, TouchMe device demonstration

I’ve experimented with a device called TouchMe that can transform human touch into sound. Participants hold the device in one hand and, with their other hand, touch a person who is doing the same thing. This creates a closed electric circuit that triggers MIDI signals sent to a computer. The computer interprets the signal and produces a sound with a virtual instrument. The pitch of the MIDI note corresponds to the amount of electrical current that passes through the device, and it can be raised by increasing the surface area of the touch.
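As an illustration of that mapping (not the device’s actual firmware), the idea can be sketched in a few lines of JavaScript: a reading of how much current passes through is scaled onto a musical scale, so a bigger touch produces a higher note. The scale choice and ranges here are my own assumptions.

```js
// Illustrative sketch only: map a current reading (0..maxCurrent) to a MIDI note.
// Using a pentatonic scale is an assumption; it keeps any combination of notes pleasant.
function currentToMidiNote(current, maxCurrent) {
  const scale = [0, 2, 4, 7, 9];      // C major pentatonic intervals
  const steps = scale.length * 3;     // three octaves of range
  const step = Math.min(steps - 1, Math.floor((current / maxCurrent) * steps));
  const octave = Math.floor(step / scale.length);
  return 48 + octave * 12 + scale[step % scale.length]; // 48 = MIDI C3
}
```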

My experiments sounded better when I had a backing track to play along to. I would compose something in my DAW and play it on a loop. While the loop was running, I would mute some elements, like percussion and bass, for a couple of sections and then bring them back again. Controlling the mix of the song was as easy as pushing a button or turning a knob, and that is something even non-musicians can find enjoyable, impactful and pleasant sounding.

Stem player by Kanye West

The Stem Player by Kanye West is a device that customizes songs by isolating parts like vocals, drums and bass. Customization is done by changing the volume, speed and effects of these isolated parts in real time. It was released as a commercial product and the exclusive platform for listening to Kanye West’s Donda 2 album.

With clear goals in mind and an idea of what’s possible, I came up with a solution.

SOLUTION

An audio player and mixer on the web that many people can access at the same time and control in real time.

Audio mixer of a virtual bar

I Miss My Bar is a virtual bar that functions as an audio mixer. It has eight channels: seven are field recordings of ambient sounds from a bar, and one is a Spotify playlist. Changing the volume of an audio channel affects only my own page.

I built a replica of this site using only HTML and JavaScript as a starting point. For more control, I loaded the audio files as ArrayBuffers and played them with the Web Audio API instead of using audio tags. When the page starts, all the files are loaded, then played at the same time, muted and looping.
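Here is a minimal sketch of that loading step. The track list shape and the `gains` map are my own naming; the working prototype may structure this differently.

```js
const audioCtx = new AudioContext();
const gains = {}; // one gain node per track, keyed by track name

async function loadTrack(url) {
  // Fetch the file as an ArrayBuffer and decode it into an AudioBuffer.
  const response = await fetch(url);
  return audioCtx.decodeAudioData(await response.arrayBuffer());
}

// tracks: e.g. [{ name: 'trackName', url: '/audio/trackName.mp3' }, ...]
async function startPlayer(tracks) {
  // Load everything first so all tracks can start together and stay in sync.
  const buffers = await Promise.all(tracks.map((t) => loadTrack(t.url)));

  tracks.forEach((track, i) => {
    const source = audioCtx.createBufferSource();
    source.buffer = buffers[i];
    source.loop = true;

    // Each track starts muted; volume changes arrive later in real time.
    const gain = audioCtx.createGain();
    gain.gain.value = 0;

    source.connect(gain).connect(audioCtx.destination);
    source.start(0); // browsers may require a user gesture before audio is audible
    gains[track.name] = gain;
  });
}
```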

To make this simple player collaborative, I needed a way to change the volume of one of the audio channels from an auxiliary page and have it affect the main page in real time. That auxiliary page could be opened on a different device, by a different person, in a different physical location. To communicate between pages in real time, I decided to use WebSockets on an Express server instead of serverless peer-to-peer communication with WebRTC. The main page holds an open connection to the server and listens for messages from auxiliary pages. Auxiliary page controls send messages to an API endpoint on the server, and the server emits these messages over the open connection it has with the main page.
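A minimal sketch of that relay, assuming the `express` and `ws` packages; the endpoint path and message shape are illustrative rather than the prototype’s exact API.

```js
const express = require('express');
const { WebSocketServer, WebSocket } = require('ws');

const app = express();
app.use(express.json());

const server = app.listen(3000);
// Player pages connect here and keep the socket open.
const wss = new WebSocketServer({ server });

// Control pages POST changes here, e.g. { "name": "trackName", "volume": 0.5 }.
app.post('/api/control', (req, res) => {
  const message = JSON.stringify(req.body);
  // Forward the change to every open player connection.
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) client.send(message);
  });
  res.sendStatus(204);
});
```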

Main page – Player

An audio player that outputs audio and listens for changes
Example URL: coalescence.com/player
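On the player page, the open connection might look like this; the message shape and the `gains` map from the loading sketch above are assumptions.

```js
// Keep one open connection to the server and apply incoming control messages.
const socket = new WebSocket('wss://coalescence.com');

socket.addEventListener('message', (event) => {
  const { name, volume } = JSON.parse(event.data); // e.g. { name, volume }
  if (gains[name]) gains[name].gain.value = volume;
});
```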

Auxiliary Pages – Controls

Audio channel controls for volume, speed and effects
Example URL: coalescence.com/control?name=trackName
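A control page might read the track name from the query string and POST each slider change; the `#volume` slider and the endpoint path are assumptions, not the prototype’s exact markup.

```js
// Which track this page controls, taken from ?name=trackName in the URL.
const name = new URLSearchParams(window.location.search).get('name');

// POST a volume change whenever the slider moves.
document.querySelector('#volume').addEventListener('input', (event) => {
  fetch('/api/control', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ name, volume: Number(event.target.value) }),
  });
});
```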

This setup allows multiple people to control the audio mixer in real time. You can find the source code for a working prototype on GitHub.

USE CASE

The audio player should be opened on a device connected to the main audio system of the space.

Musicians prepare audio tracks that can be looped and are in harmony with each other when played at the same time.

Tracks must form a coherent whole and have enough variation to be played for the entirety of the exhibition. This is a musical challenge. Some of the more interesting approaches to overcoming it I’ve found in generative music, which I’ll discuss in one of my upcoming articles.

For the music to fit the themes of an exhibition, I recommend creating an audio track for each object in the exhibition. Access to that track’s control page should be placed near the object that inspired it. The easiest way for people to reach the controls is to scan a QR code with their phone, as sketched below.
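Generating those QR codes can be as simple as the following sketch, assuming the `qrcode` npm package; the track names here are made up.

```js
const QRCode = require('qrcode');

// One control URL per exhibited object; names are illustrative.
const tracks = ['sculpture-one', 'painting-two'];

for (const name of tracks) {
  const url = `https://coalescence.com/control?name=${encodeURIComponent(name)}`;
  QRCode.toFile(`${name}.png`, url); // writes a printable PNG for each track
}
```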

This creates a dynamic environment in which sound is broadcast to all the visitors and controlled by them in reaction to the objects it comes from; visitors can ponder why the musician attributed a particular sound to a given object. Most importantly, they will always be reacting to the ever-changing sound environment that every visitor is participating in, by adjusting tracks that are meant to sound harmonious together.

To join my private Discord community and access exclusive content, buy me a coffee.
