Max/MSP folder: applications/max6.1/patches/m4l-patches/Pluggo for Live Resources/patches
Here you will find things like pluggo.randomPreset, which gets called with sends/receives in the Pluggo patches. A challenge: how do you randomize 'any' parameter in any patch, without knowing the ranges?
Build a control panel in Max. It should look amazing. It should be the coolest control panel you can imagine. Use any objects, colors, shapes that you can find. But… it shouldn’t actually control anything.
Due: Have a first draft on your computer for next week. Final version due before class on 9/23. Email me the Max patch (maxpat) file or a link.
The zapier.com trigger method of sending tweets from Max is limited by number of tweets and sync rate. So it would be nice to set up another intermediary server program in Ruby or PHP which eliminates the middleman and just sends tweets directly.
Or you could use the mxj searchTweet program, which has been updated to do this on the search side.
require 'twitter'

MAX_ATTEMPTS = 3
num_attempts = 0
begin
  num_attempts += 1
  retweets = Twitter.retweeted_by_user("sferik")
rescue Twitter::Error::TooManyRequests => error
  if num_attempts <= MAX_ATTEMPTS
    # NOTE: Your process could go to sleep for up to 15 minutes, but if you
    # retry any sooner, it will almost certainly fail with the same exception.
    sleep error.rate_limit.reset_in
    retry
  else
    raise
  end
end
This project uses Max/MSP to control and track a Parrot AR-drone quadcopter, using an intermediary server which runs the (open source) node-ar-drone in node.js. https://github.com/felixge/node-ar-drone
tz-dronestream-server/index.html (video client – called automatically by the video server)
installing node.js and dependencies:
Install node.js on your computer. Instructions here: http://nodejs.org
The following node packages are required. Install using npm. For example:
npm install request
request
xml2js
socket.io
(util and http are part of the Node.js core, so they should not need a separate npm install.)
Also, install the following packages for the ar-drone and video streaming:
ar-drone
dronestream
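If it helps, the installs above can be combined into a single command (same package names as listed):

```shell
# one-time setup: install the npm packages listed above
npm install request xml2js socket.io ar-drone dronestream
```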
how to run
For this experiment, we will be running everything on the same computer.
1. Connect your computer to the AR drone WiFi network. For example, mine is: ardrone2_260592 – Note: after you do that, you will not be able to read this post on the internet.
2. Run both of the node programs from separate terminal windows:
Since you are running the Max control dashboard on the same computer as the server, you can call it without arguments, like this:
node drone5.js
Then from another terminal window start the video server:
node tz-dronestream-server/app-tz.js
3. In a Chrome web browser, go to the following URL. (You can make multiple simultaneous connections to the video server.) You should see the video from the AR-drone in the browser.
127.0.0.1:5555
4. Load the following Max patch (control dashboard)
drone4.maxpat
5. In the Max patch, try clicking the /takeoff and /land messages.
Max programming tips
To control the drone from Max, use [udpsend] and [udpreceive] with ports 4000 and 4001 respectively. You can't make multiple connections with OSC (and it would probably not be so cool while flying), but you can specify a target IP for telemetry when running the OSC server.
We will eventually publish a complete list of commands. They follow the API from the ar-drone docs readme file, converted into OSC style. For example:
/takeoff
/up .5
/animate flipAhead 2000
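To see how OSC-style messages like these could turn into node-ar-drone calls inside the server, here is a hypothetical sketch. parseCommand is my own illustration, not the actual drone5.js code:

```javascript
// Hypothetical helper: turn a textual OSC-style command like "/up .5"
// into a method name plus argument list for the node-ar-drone client.
function parseCommand(line) {
  var parts = line.trim().split(/\s+/);
  var method = parts[0].replace(/^\//, '');       // "/up" -> "up"
  var args = parts.slice(1).map(function (tok) {  // ".5" -> 0.5, "flipAhead" stays a string
    var n = Number(tok);
    return isNaN(n) ? tok : n;
  });
  return { method: method, args: args };
}

console.log(parseCommand('/up .5')); // { method: 'up', args: [ 0.5 ] }
```

The server could then dispatch onto a real client from the node-ar-drone readme, e.g. `var client = require('ar-drone').createClient(); var cmd = parseCommand('/animate flipAhead 2000'); client[cmd.method].apply(client, cmd.args);`.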
More notes on video…
You can capture the video stream into Max by either capturing the Chrome window using Jitter, or by using Syphon – but for demo purposes I have just run the Chrome window side by side with the Max control patch.
You may find it more practical to run the node.js server on a separate computer. If you do that, you will need to modify the dronestream script app-tz.js to insert the proper IP address in server.listen() – which should be the last line of the program. You will also need to use that address as the URL in Chrome, for example: 192.168.1.140:5555. And include the controller IP address on the command line as shown below.
When testing this I set up a dual IP address on my Macbook with a static ip: 192.168.1.140 – so this would always be the server. I ended up getting rid of it because it caused problems with other software.
Here is the command you would use to specify a separate IP address when launching the server. For example, if your Max control program is on 192.168.1.104 and you want to run in outdoor mode, use this:
node drone5.js 192.168.1.104 TRUE
program notes
These students are just about to send the quadcopter into the air using control panels developed in Max. Ali's control panel uses speech via the Google API. Her computer is connected to the Internet via WiFi and also connected to Chase's computer via a MIDI/USB link. Her voice commands get translated into MIDI. Chase's control panel reads the commands. Chase's computer is on the same WiFi network as the quadcopter. Chase's control panel sends commands to my computer, which is running Max and the ar-drone software in node.js. Occasionally this all works. But there is nobody to hold a camera.
We’re now running two node servers, one for Max and one for web video streaming – which can be accessed by other computers connected to the same LAN as the AR-drone.
We did have a mishap where Chase's control panel sent an "/up" command to the quadcopter. Then his MacBook battery died as the quadcopter was rising into the sky. I managed to rewrite the server program, giving it a /land command – then restarted it. It was able to re-establish communication with the quadcopter and make it land.
Unfortunately we did not get video of this experiment but here are a few seconds of video showing the quadcopter taking off and landing under control of Max – while indoors.
What if you used that data to reconstruct music by driving a sequencer in Max? The analysis is a series of time-based quanta called segments. Each segment provides information about timing, timbre, and pitch – roughly corresponding to rhythm, harmony, and melody.
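As a sketch of what driving a sequencer from this data could look like: each segment in the analysis includes start, duration, a 12-bin pitches (chroma) vector, and loudness values. The helper below is my own illustration, not the patch's actual logic – the loudness-to-velocity curve and the octave convention are assumptions:

```javascript
// Turn one Echo Nest segment into a note event for a sequencer.
function segmentToNote(segment, octave) {
  // pitches is a 12-element chroma vector (C=0 .. B=11); take the
  // strongest bin as the segment's pitch class
  var pc = segment.pitches.indexOf(Math.max.apply(null, segment.pitches));
  return {
    midi: 12 * (octave + 1) + pc,  // e.g. octave 4, pc 0 -> 60 (middle C)
    // map loudness_max (dB, roughly -60..0) onto MIDI velocity 0..127
    velocity: Math.round(127 * Math.min(1, (segment.loudness_max + 60) / 60)),
    durationMs: Math.round(segment.duration * 1000)
  };
}

var seg = { start: 0.0, duration: 0.25, loudness_max: -12,
            pitches: [0.1, 0.2, 1.0, 0.3, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1] };
console.log(segmentToNote(seg, 4)); // { midi: 62, velocity: 102, durationMs: 250 }
```

Scheduling the resulting notes at each segment's start time is what gives back the rhythm.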
Edit the ruby server file echonest-synth2.rb, replacing the API key with your new API key from Echo Nest.
installing ruby gems
Install the following ruby gems (from the terminal):
gem install patron
gem install osc-ruby
gem install json
gem install uri (note: uri is part of the Ruby standard library, so this one may be unnecessary)
instructions
1. In Terminal run the ruby server:
./echonest-synth2.rb
2. Open the Max patch: echonest-synth4.maxpat and turn on the audio.
3. Enter an artist and song title for analysis in the text boxes. Then press the green buttons for title and artist. Then press the /analyze button. If it works, you will get prompts in the terminal window and the Max window, and you should see the time in seconds in the upper right corner of the patch.
If there are problems with the analysis, it's most likely due to one of the following:
artist or title spelled incorrectly
song is not available
song is too long
API is busy
If the ruby server hangs or crashes, just restart it and try again.
4. Press one of the preset buttons to turn on the tracks.
5. Now you can play the track by pressing the /play button.
The mixer channels, from left to right, are:
bass
synth (left)
synth (right)
random octave synth
timbre synth
master volume
gain trim
HPF cutoff frequency
You can also adjust the reverb decay time and the playback rate. Normal playback rate is 1.
programming notes
Best results happen with slow, abstract material, like the Miles (Wayne Shorter) piece above. The bass is not really happening – lines all sound pretty much the same. I'm thinking it might be possible to derive a bass line from the pitch data by doing a chordal analysis of the analysis.
Here are screenshots of the Max sub-patches (the main screen is in the video above)
Timbre (percussion synth) – plays filtered noise:
Random octave synth:
Here’s a Coltrane piece, using roughly the same configuration but with sine oscillators for everything:
There are issues with clicks on the envelopes, and the patch is kind of a mess – but it plays!
Several modules respond to the API data:
tone synthesizer (pitch data)
harmonic (random octave) synthesizer (pitch data)
filtered noise (timbre data)
bass synthesizer (key and mode data)
envelope generator (loudness data)
Since the key/mode data is global for the track, bass notes are probable guesses. This method doesn't work for material with strong root motion or a variety of harmonic content. It's essentially the same approach I use when asked to play bass at an open mic night.
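A minimal sketch of that probable-guess approach, assuming the analysis's key (0 = C .. 11 = B) and mode (1 = major, 0 = minor) fields. The scale tables and the low octave are my own choices, not the patch's:

```javascript
// scale degrees, in semitones above the root
var MAJOR = [0, 2, 4, 5, 7, 9, 11];
var MINOR = [0, 2, 3, 5, 7, 8, 10];

// Guess a bass note: pick a scale degree from the track's global key/mode
// and pin it to a low octave.
function bassNote(key, mode, degree) {
  var scale = mode === 1 ? MAJOR : MINOR;
  var pc = (key + scale[degree % scale.length]) % 12;
  return 36 + pc; // keep everything in a low octave around C2
}

console.log(bassNote(9, 0, 0)); // A minor root -> 45
```

Walking between degrees 0, 2, and 4 (root, third, fifth) is about as adventurous as this guessing can safely get.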
The envelopes click at times – it may be due to the relaxed method of timing, i.e., none at all. If the clicks don't go away when timing is corrected, this might get cleaned up by adding a few milliseconds to the release time – or by looking ahead to make sure the edges of segments are lining up.
[update] Using the Max [poly~] object cleared up the clicking and distortion issues.
Timbre data drives a random noise filter machine. I just patched something together and it sounded responsive – but it's kind of hissy – an LPF might make it less interesting.
Haven’t used any of the beat, tatum, or section data yet. The section data should be useful for quashing monotony.
another update – 4/2013
Tried to write this into a Max4Live device, so that the pitch data would be played by a MIDI (software) instrument. No go. The velocity data gets interpreted in mysterious ways – plus each instrument has its own envelope, which interferes with the segment envelopes. Need to think this through. One idea would be to write a device which uses EN analysis data for beats to set warp markers in Live. It would be an amazing auto-warp function for any song. Analysis wars: Berlin vs. Somerville.