MBTA API in Max

Sonification of Mass Ave buses, from Nubian to Harvard.

Updated for Max 8 and macOS Catalina

This patch requests data from the MBTA API to get the current locations of buses – using the Max js object. Latitude and longitude data are mapped to oscillator pitch. Data is polled every 10 seconds, but a slower polling rate might produce more interesting results, since the updates don’t seem that frequent. And buses tend to stop a lot.
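The latitude-to-pitch mapping works like Max’s [scale] object. As a minimal sketch in plain JavaScript (the route bounds and pitch range here are illustrative values, not the ones hard-coded in mbta.js):

```javascript
// Map a value from one range to another, like Max's [scale] object.
function scale(x, inMin, inMax, outMin, outMax) {
  return outMin + ((x - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// Approximate latitude bounds of the Route 1 bus (Nubian ~42.33,
// Harvard ~42.37) mapped to an arbitrary pitch range in Hz.
function latitudeToPitch(lat) {
  return scale(lat, 42.33, 42.37, 220, 880);
}
```

A bus halfway along the route would land halfway through the pitch range; widening the output range is one way to exaggerate small movements.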

Original project link from 2014: https://reactivemusic.net/?p=17524

MBTA developer website: https://www.mbta.com/developers

This project uses version 3 of the API. There are quality issues with the realtime data. For example, some bus stops are not associated with the route, and the direction_id and stop_sequence data from the buses is often wrong. Also, buses that are not in service are not removed from the vehicle list or marked as such.

The patch uses a [multislider] object to graph the position of the buses along the route – but due to the data problems described above, the positions don’t always reflect the current latitude/longitude coordinates or the bus stop name.

download

https://github.com/tkzic/internet-sensors

folder: mbta

patches:

  • mbta.maxpat
  • mbta.js
  • poly-oscillator.maxpat
authentication

You will need to replace the API key in the message object at the top of the patch with your own key – or you can probably just remove it, since the API allows limited anonymous access. The key distributed with the patch is fake. You can request your own developer API key from MBTA. It’s free.
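For reference, a request to the v3 API for vehicles on route 1 can be built like this – a hedged sketch, where the filter[route] and api_key query parameters follow the v3 conventions and "YOUR_KEY_HERE" is a placeholder for your own key:

```javascript
// Build a v3 API request URL for vehicles on a given route.
// Omitting the key falls back to (rate-limited) anonymous access.
function vehiclesUrl(route, apiKey) {
  let url = "https://api-v3.mbta.com/vehicles?filter[route]=" +
    encodeURIComponent(route);
  if (apiKey) {
    url += "&api_key=" + encodeURIComponent(apiKey);
  }
  return url;
}

// e.g. vehiclesUrl("1", "YOUR_KEY_HERE")
```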

instructions
  • Open mbta.maxpat
  • Open the Max console window so you can see what’s happening with the data
  • Click on the yellow [getstops] message to get the current bus stop data
  • Toggle the metro (at the top of the patch) to start polling
  • Turn on the audio (click speaker icon) and turn up the gain

Note: there will be more buses running during rush hours in Boston. Try experimenting with the polling rate and ramp length in the poly-oscillator patch. You can also experiment with the pitch range.

Analog video synthesis

Generative art in motion.


https://vimeo.com/cskonopka

By Christopher Konopka

Background

In the past year, Chris has published nearly 2500 improvised video pieces.


You may be familiar with analog modular audio synthesis. The hardware to produce video looks nearly identical – a maze of patch cords and dials.

Television


Analog video is television. A CRT (cathode ray tube) resynthesizes video information by demodulating signals from a camera. Vintage televisions have dials to adjust color and vertical sync. When you turn the dials you are synthesizing analog video. Distortion, filtering, and feedback – either at the source (camera) or the destination (tv screen) – offer up an infinite variety of images.

Analog vs. Digital

Today all media is digital. Like the screen you are looking at. The difference with analog is in how it’s produced. Boundaries are less definite. Lines curve. Colors waver. Feedback looks like flames. Every frame is a painting.

https://vimeo.com/172035463

Patterns

Images can be generated electronically using modules – without a camera.

Filters

As with audio sampling, anything is a source: movies, YouTube, live television, even Felix the Cat.

Feedback

When you aim a guitar at an amplifier it screams. Tilt it away slightly and the screaming subsides. In between there’s a sweet spot. The same is true with cameras and screens: feedback results when output is mixed with input.

Radio

Analog shortwave radio signals are distorted by the atmosphere in a manner similar to video filtering.

A studio in Bethel, Maine.


An improvised collaboration between Chris and Tom Zicarelli using shortwave radio processed with audio effects.

Live Performance

Gem

https://www.instagram.com/p/BImQwOGBveV/?taken-by=cskonopka

A recent screen test at the Gem Theatre in Bethel, Maine. Source material is a time lapse film of a glacier installation – produced at the same theatre – by Wade Kavanaugh and Steven Nguyen. https://www.youtube.com/watch?v=6c36Y-Dcj30  The film was re-synthesized using analog video and feedback. Soundtrack by Tom Zicarelli.

https://www.instagram.com/p/BImRSzHBOLL/?taken-by=cskonopka

Big screen equals mind-bending experience.

Note: previous clip excerpted from this 15 minute jam: https://vimeo.com/177843310

TAL

The patterns in this clip appear to be three dimensional. They are not.

From a show that happened somewhere in the known universe:

Alto

Improvised analog video with the band “Alto”. Patterns reminiscent of magical textiles.

More about analog video synthesis

 

Pianophase

Based on the first section from Steve Reich’s 1967 piece Piano Phase.

“Two pianists repeat the same twelve note sequence, but one gradually speeds up. Here, the musical patterns are visualized by drawing two lines, one following each pianist.”
by chenalexander.com

http://www.pianophase.com/
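The phasing process itself is simple enough to sketch in a few lines of JavaScript. This assumes the commonly cited reading of Reich’s twelve-note pattern as MIDI note numbers; the second voice is shifted ahead by a whole number of notes:

```javascript
// The twelve-note pattern from Piano Phase (E4 F#4 B4 C#5 D5 F#4
// E4 C#5 B4 F#4 D5 C#5), written as MIDI note numbers.
const pattern = [64, 66, 71, 73, 74, 66, 64, 73, 71, 66, 74, 73];

// At step n, pianist 1 plays pattern[n % 12]; pianist 2, shifted
// ahead by `offset` notes, plays pattern[(n + offset) % 12].
function notesAtStep(n, offset) {
  return [pattern[n % 12], pattern[(n + offset) % 12]];
}
```

With offset 0 the two voices are in unison; each time the faster pianist gains one full note, the offset increments, producing a new pair of interlocking lines – the same shifting relationship the drawing traces visually.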

Soundjack

Low-latency open-source VoIP system for music

By Alexander Carot

http://www.soundjack.eu/


I have run Soundjack as a one-way system for sending audio from a shortwave radio over the internet. It has worked well on Windows 7 and Mac OS 10.x. The Windows version requires an ASIO audio interface.