Overlapping loops of varying duration to represent natural cycles.
In October I collaborated with Wade Kavanaugh and Stephen P. Nguyen to compose and perform the sounds of a glacier for their installation at the Gem theatre in Bethel, Maine. The glacier was made from paper.
Wade and Stephen:
A time-lapse video of the project:
A time-lapse video of a similar project they did in Minnesota in 2005:
The approach was to take a series of ambient loops and organize them by duration. The longer loops would represent the slow movement of time. Shorter loops would represent events like avalanches. One-shot samples would represent quick events, like the cracking of ice.
It took several iterations to produce something slow and boring enough to be convincing. I used samples from Ron MacLeod's Cyclic Waves library from Cycling '74 (https://www.ableton.com/en/packs/cyclic-waves/). Samples were pitched down to imply largeness.
Each vertical column in the Ableton Live set represents a time scale. The far left column contains quick events and the far right column contains long-cycle events; left to right, the columns have gradually increasing cycle durations. I used a Push controller to trigger samples in real time as people walked through the theatre to see the glacier.
The theatre speakers were arranged in stereo, but along the front-to-back axis rather than left to right. Since the glacier was laid out along the same axis, a slow auto-panning effect sent sounds drifting off into the distance, or approaching from it. Visually and sonically there was a sense that the space extended beyond the walls of the theatre.
In the “control room” above the theatre… using Push to trigger samples and a Korg NanoKontrol to set panning positions of each track:
The performance lasted about 45 minutes. Occasionally the cracking of ice would startle people in the room. There were kids crawling around underneath the paper glacier. Afterwards we just let the sounds play on their own. A short excerpt:
Around the year 1700, several startup ventures developed prototypes of machines with thousands of moving parts. After 30 years of engineering, competition, and refinement, the result was a device remarkably similar to the modern piano.
What are the musical instruments of the future being designed right now?
Are there patterns in the ways that artists adapt technology?
For example, the Hammond organ borrowed ideas developed for radios. Recorded music is produced with computers that were originally designed as business machines.
Instead of looking forward to predict future music, let's look backwards and ask, "What technology needed to happen to make these musical instruments possible?" The piano relies on the single escapement (1710) and later the double escapement (1821). Real-time pitch shifting depends on Fourier transforms (1822) and fast computers (~1980).
Artists often find new (unintended) uses for tools, like the printing press.
The piano is still in development. In December 2014, Eren Başbuğ composed and performed music on the ROLI Seaboard, a piano-style keyboard made of three-dimensional sensing foam:
Here is Keith McMillen’s QuNexus keyboard (with Polyphonic aftertouch):
Harmonic Model Plus Residual (HPR) – Build a spectrogram using STFT, then identify where there is strong correlation to a tonal harmonic structure (music). This is the harmonic model of the sound. Subtract it from the original spectrogram to get the residual (noise).
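As a rough illustration of the subtraction step, here is a minimal sketch in plain JavaScript. It assumes you already have magnitude spectrograms (arrays of STFT frames) for the original sound and for the fitted harmonic model; the variable names and array shapes are assumptions for illustration, not part of any particular HPR implementation.

function residual(spectrogram, harmonicModel) {
    // spectrogram and harmonicModel are arrays of frames of magnitude values
    return spectrogram.map(function (frame, i) {
        return frame.map(function (mag, bin) {
            // remove the tonal (harmonic) energy from each bin, clamping at zero
            return Math.max(0, mag - harmonicModel[i][bin]);
        });
    });
}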
What is the speed of electricity? About 70-80 ms is the best round-trip latency (via fiber) from the U.S. east coast to the west coast. If you were jamming over the internet with someone on the opposite coast, it might feel like being 100 feet away from them in a field: sound travels about 1,100 feet per second in air, so 80 ms of delay corresponds to roughly 90 feet of distance.
Global communal experiences – Bill McKibben – 1990 “The Age of Missing Information”
There is a quickening of discovery: internet collaboration, open source, Linux, GitHub, Raspberry Pi, Pd, SDR.
“Robots and AI will help us create more jobs for humans — if we want them. And one of those jobs for us will be to keep inventing new jobs for the AIs and robots to take from us. We think of a new job we want, we do it for a while, then we teach robots how to do it. Then we make up something else.”
“…We invented machines to take x-rays, then we invented x-ray diagnostic technicians which farmers 200 years ago would have not believed could be a job, and now we are giving those jobs to robot AIs.”
Sonification of Mass Ave buses, from Harvard to Dudley.
This patch sends requests to the MBTA developer portal to get the current locations of buses, using the Max js object. Latitude and longitude data is mapped to oscillator pitch. Data is polled every 10 seconds, but the results might be more interesting at a slower rate, because the location updates don't seem to be that frequent, and the buses tend to stop a lot.
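As a rough sketch of the mapping step, the function inside the js object might look something like this. The latitude bounds and frequency range are guesses for illustration, not the values used in the actual patch:

// hypothetical pitch mapping inside the Max js object
var latLow = 42.33, latHigh = 42.38;    // rough latitude extent of the Harvard-to-Dudley route (assumed)
var freqLow = 220, freqHigh = 880;      // oscillator pitch range in Hz (assumed)

function latitude(lat) {
    var normalized = (lat - latLow) / (latHigh - latLow);     // 0..1 position along the route
    var freq = freqLow + normalized * (freqHigh - freqLow);   // linear map to frequency
    outlet(0, freq);                                          // send the pitch out the left outlet
}

Sending a message like "latitude 42.35" to the js object would then output the corresponding frequency from its left outlet.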
You will not need authentication to run this patch. It uses the default developer API key for testing. Please read the terms of service at the MBTA developer portal. Data should not be polled more often than every 10 seconds. You can also request your own developer API key from the MBTA.
Toggle the metro (at the top of the patch) to start polling
Turn on the audio (at the bottom of the patch) and turn up the gain
Note: there will be more buses running during rush hours in Boston. Try experimenting with the polling rate and ramp length in the poly-oscillator patch. Also, you can experiment with the pitch range.
The Max patch is based on a tutorial by dude837 called “Automatic Silly Video Generator”
APIs (application programming interfaces) provide methods for programs other than web browsers to access Internet data. Any app that accesses data from the web uses an API.
An HTTP request transfers data to or from a server. A web browser handles HTTP requests in the background, but you can also write programs that make HTTP requests directly. A program called "curl" runs HTTP requests from the terminal command line. Here are some examples: http://reactivemusic.net/?p=5916
Data is usually returned in one of a few standard formats. JSON is the preferred format because it's easy to access the data structure.
Max HTTP requests
There are several ways to make HTTP requests in Max, but the best method is the js object. Here is the code that runs the GET request for the Vine API:
var ajaxreq = new XMLHttpRequest();                // Max js HTTP request object
ajaxreq.onreadystatechange = readystatechange;     // callback that fires when the response arrives
ajaxreq.open("GET", url); ajaxreq.send();          // send the GET request (url comes from the get message)

// inside readystatechange():
var rawtext = this._getResponseKey("body");        // raw text of the response body
var body = JSON.parse(rawtext);                    // parse the JSON into a JavaScript object
The get() function formats and sends an HTTP request using the URL passed in with the get message from Max. When the data comes back, the readystatechange() function parses it and sends the URL of the most popular Vine video out the left outlet of the js object.
Playing Internet audio/video files in Max
The qt.movie object will play videos, with the URL passed in by the read message.
Unfortunately, qt.movie sends its audio to the system output, not to Max. You can use Soundflower, or another virtual audio-routing app, to get the audio back into Max.