APIs (application programming interfaces) provide methods for programs (other than web browsers) to access Internet data. Any app that accesses data from the web uses an API.
For example, if you copy this URL into a web browser address bar, it will return a block of data in JSON format about the most popular videos on Vine: https://api.vineapp.com/timelines/popular
HTTP requests
An HTTP request transfers data to or from a server. A web browser handles HTTP requests in the background. You can also write programs that make HTTP requests. A program called “curl” runs HTTP requests from the terminal command line. Here are examples: https://reactivemusic.net/?p=5916
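For example, this command (assuming curl is installed on your machine) fetches the Vine data from the URL above and prints the raw JSON response in the terminal:

curl https://api.vineapp.com/timelines/popular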
Response data
Data is usually returned in one of 3 formats:
JSON
XML
HTML
JSON is the preferred format because it’s easy to access the data structure.
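For example, the Vine response used below has roughly this shape (a trimmed sketch – everything except the fields the Max code actually reads has been omitted). data.records is an array of video entries:

{
  "data": {
    "records": [
      { "videoUrl": "…" }
    ]
  }
}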
Max HTTP requests
There are several ways to make HTTP requests in Max, but the best method is the js object. Here is the code that runs the GET request for the Vine API:
// Send an HTTP GET request for the URL passed in with the "get" message
function get(url)
{
    var ajaxreq = new XMLHttpRequest();
    ajaxreq.open("GET", url);
    ajaxreq.onreadystatechange = readystatechange; // called when the response arrives
    ajaxreq.send();
}

// Parse the response and send the first video URL out the left outlet
function readystatechange()
{
    var rawtext = this._getResponseKey("body"); // raw response text, via the Max js XMLHttpRequest
    var body = JSON.parse(rawtext);
    outlet(0, body.data.records[0].videoUrl);
}
The get() function formats and sends an HTTP request using the URL passed in with the get message from Max. When the data is returned, the readystatechange() function parses it and sends the URL of the most popular Vine video out the left outlet of the js object. To trigger the request, send the js object a message like: get https://api.vineapp.com/timelines/popular
Playing Internet audio/video files in Max
The qt.movie object will play videos, with the URL passed in by the read message.
Unfortunately, qt.movie sends its audio to the system output, not to Max. You can use Soundflower, or another virtual audio routing app, to get the audio back into Max.
A scene from a sea otter video processed using ‘Premium Cable’ by BPMC
Not modular
Two weeks ago Christopher Konopka demonstrated modular analog video. This week we will explore analog video made from found objects and circuit bending. https://reactivemusic.net/?p=18982
Kinect
Kinect is still your friend, especially if you use Windows.
“… A blog of short lessons on the topic of algorithmic composition — the use of formal systems to generate music (and, by extension, other types of time-based art) with computer programs. The examples in these lessons are provided in the form of Max programs.”
Several ways of working with Ableton Live parameters in an M4L patch. (This is an improved version of the patch we built in class.) https://reactivemusic.net/?p=18401
leap motion
radio?
vine
mbta
midi osc thing / chat
frame subtraction
vizzie
voice cancellation thing
max for live – granulator or convolution reverb
show basics of max
make a spectrum analyzer
make a pitch detector tweet thing
Assignment
Read Max tutorials 1-4
Hello
Bang!
Numbers and Lists
Metro and Toggle
Build a control panel that does nothing in Max. It should look amazing. It should be the coolest control panel you can imagine. Use any objects, colors, and shapes that you can find. But… it shouldn’t actually control anything.
Office hours: Tuesday 1-2 PM or Tuesday 4-5 PM, at the EPD office #401 at 161 Mass Ave. Please email or call ahead.
Assignments and class notes will be posted to this blog: https://reactivemusic.net before or after each class. Search for: ep-426 to find the notes.
Examples, software, links, and references demonstrated in class are available for you to use. If there is something missing from the notes, please ask about it. This is your textbook.
Syllabus:
Everybody calls this course “The Jitter class” – referring to Max/MSP/Jitter from Cycling ’74. You will learn to use Jitter, but the goal is to create interactive visual art. Jitter is one of many available tools.
The field of interactive visual art is constantly evolving.
After you take the course, you will have designed projects. You might design a new tool for other artists. You will have opportunities to solve problems. You will become familiar with how others make interactive art. You will explore the connection between sound, video, graphics, sensors, and data. You will be exposed to a world of possibilities – which you may embrace or reject.
We will explore a range of methods and have opportunities to use them in projects. We’ll look at examples by artists – asking the question: How does that work?
Topics: (subject to change)
Jitter
Matrices
Reverse engineering
Visualization of audio
Visualization of live data, APIs
Video analysis (realtime)
Video hardware and controllers
Prototyping
Video signal processing
OpenGL
Other tools: Processing, WebGL, Canvas, 2d graphics
Portfolios
Live performance
Grading and projects:
Grades are based on two projects that you will design – and on class participation. Please see Neil Leonard’s EP-426 syllabus for details. I encourage, and will give credit for, collaboration with other students, outside projects, performances, independent projects, and anything else that will foster your growth and success.
I am open to alternative projects – for example, if you want to use this course as an opportunity to develop a larger project or to continue a work in progress.