ep-341 Max/MSP – Spring 2015 week 13

Algorithmic composition and generative music – part 2

images

Reactive music

With reactive music, audio is the input and music is the output. Music can also be the input.

from Wikipedia: http://en.wikipedia.org/wiki/RjDj

“Reactive music, a non-linear form of music that is able to react to the listener and his environment in real-time.[2] Reactive music is closely connected to generative music, interactive music, and augmented reality. Similar to music in video-games, that is changed by specific events happening in the game, reactive music is affected by events occurring in the real life of the listener. Reactive music adapts to a listener and his environment by using built in sensors (e.g. camera, microphone, accelerometer, touch-screen and GPS) in mobile media players. The main difference to generative music is that listeners are part of the creative process, co-creating the music with the composer. Reactive music is also able to augment and manipulate the listeners real-world auditory environment.[3]

What is distributed in reactive music is not the music itself, but software that generates the music…”

Ableton Live field recorder

Uses dummy clips to apply rhythmic envelopes and effects to ambient sound: https://reactivemusic.net/?p=2658

InstantDecomposer and Slice//Jockey

Making music from sounds that are not music.

by Katja Vetter

InstantDecomposer is an update of Slice//Jockey. It has not been released publicly. Slice//Jockey runs on Mac OS, Windows, and Linux – including Raspberry Pi.

http://www.katjaas.nl/slicejockey/slicejockey.html

Slice//Jockey help: https://reactivemusic.net/?p=19065

Slice//Jockey is written in Pd (PureData) – open source – the original Max.

By Miller Puckette

http://puredata.info

Local file reference

  • local: InstantDecomposer version: tkzic/pdweekend2014/IDecTouch/IDecTouch.pd
  • local: slicejockey2test2/slicejockey2test2.pd

Lyrebirds

A music factory.

By Christopher Lopez

RJDJ
Inception and The Dark Knight iOS apps:

As of iOS 8.2, Dark Knight crashes on load. Inception only works with “Reverie Dream” (lower left corner).

Running RJDJ scenes in Pd in Mac OS X

Though RJDJ is a lost relic in 2015, it still works in Pd. The example scenes used here are meant to run under libpd on iOS or Android, but they also work in Mac OS X.

Screen Shot 2015-04-18 at 10.13.01 PM

First, use Pd Extended. Ok maybe you don’t need to.

1. Read the article from Makezine by Mike Dixon http://makezine.com/2008/11/03/howto-hacking-rjdj-with-p/

2. Download sample scenes from here: http://puredata.info/docs/workshops/MobileArtAndCodePdOnTheIPhone The link is under the heading “RJDJ Sources”

3. Download RJLIB from here: https://github.com/rjdj/rjlib

4. Add these folders in RJLIB to your Pd path (in preferences)

  • pd
  • rj
  • deprecated

5. Now, try running the scene called “echelon” from the sample scenes you downloaded. It should be in the folder rjdj_scenes/Echelon.rj/_main.pd

  • turn on audio
  • turn up the sliders
  • you should hear a bunch of crazy feedback delay effects

Note: with Pd-extended 0.43-4 the error message: “base: no method for ‘float'” fills the console as soon as audio is turned on.

Scenes that I have got to work:

The ones marked with a * seem to work well without modification or an external GUI. They all produce error messages – and they are really meant to run on mobile devices, so a lot of the sensor input isn’t there.

  • Amenshake (you will need to provide accelerometer data somehow)
  • Atsuke (not sure how to control)
  • CanOfBeats (requires accelerometer)
  • ChordSphere (sounds cool even without accelerometer)
  • Diving*
  • DubSchlep* (interesting)
  • Ehchelon*
  • Eargasm*
  • Echochamber*
  • Echolon*
  • Flosecond (requires accelerometer)
  • FSCKYOU* (Warning, massive earsplitting feedback)
  • Ghostwave* (Warning, massive earsplitting feedback)
  • HeliumDemon (requires accelerometer)
  • JingleMe*
  • LoopRinger*
  • Moogarina
  • NobleChoir* (press the purple button and talk)
  • Noia*
  • RingMod*
  • SpaceBox*
  • SpaceStation (LoFi algorithmic synth)
  • WorldQuantizer*

to be continued…

Random RJDJ links

Including stuff explained above.

Miscellaneous

 

I’m thinking of something: http://imthinkingofsomething.com

Analog video synthesis

Vintage consumer electronics, adapter cables, converters, and circuit bending.

e_clock2_1

Signal formats

Note: everything here refers to NTSC which is the standard video format in the USA.

Composite

“Composite video (one channel) is an analog video transmission (without audio) that carries standard definition video typically at 480i or 576i resolution.” -from Wikipedia

Composite uses the yellow RCA plugs found on most consumer video devices. It is also the standard for LZX analog modular systems http://www.lzxindustries.net.

Because composite is the most common format, the approach here will be to convert everything else to composite, with a few exceptions.

S-video

S-video carries luminance and chrominance on separate channels. The easiest way to convert S-video to composite, or vice versa, is to use a VCR or capture device that has S-video IO.

VHF analog TV

Older video devices use coaxial cables with F connectors to transmit modulated AV signals on channels 3 and 4.

RF modulators and demodulators
RF modulators

To convert from composite to TV use an RF modulator:

7162

You can also use a VCR as an RF modulator.

RF modulators can be used as TV transmitters.  https://reactivemusic.net/?p=12355

RF demodulators

To convert from TV to composite, use an RF demodulator. The demodulator removes the VHF carrier, leaving a baseband composite signal.

Demodulators can be expensive and difficult to find, so if you are not too concerned about image quality it’s easier to use a VCR. Connect the input signal to the TV coax input and set the VCR to channel 3 or 4. The signal passes through to the composite output (yellow RCA jack). This method also works as an inexpensive form of time base correction (TBC).

VGA

Use a PC to TV converter to convert VGA signals to composite: https://reactivemusic.net/?p=18716

Screen Shot 2015-03-27 at 5.02.22 PM

Video capture

There are many options for video capture depending on which operating system and applications you are using.

Here are several methods that work with Mac OS:

Capture devices
  • Elgato Video Capture – USB (only works through Elgato app)
  • Diamond Video Capture – USB (shows up as a video camera?)
Media converters

Grass Valley ADVC 110 Digital Video Converter – Firewire 400 http://www.grassvalley.com/products/advc110

Analog to digital pass-through

Some camcorders work as digital media converters, though it may be difficult to determine which ones do. Here’s a list of Sony camcorders that have this capability: http://forum.videohelp.com/threads/355121-Sony-Handycam-Digital8-camcorders-with-analog-digital-passthru

Another list: http://fccug.org/2222-10b%20How%20to%20convert%20an%20analog%20signal%20to%20digital.pdf

Some home-theatre system receivers will convert a variety of formats. I have not used this method.

iOS, Android

To capture video from a mobile device, use AirPlay with a program like AirServer http://www.airserver.com/Mac, or use an adapter cable like http://store.apple.com/us/product/MD098AM/A/apple-digital-av-adapter.

With Mac OS X Yosemite, AirPlay capability might be built in?

Web cams

Macam supports a variety of web cams. https://reactivemusic.net/?p=19000

Examples

Here are some of my favorite ways to generate and process video.

Videonics Video Equalizer

32166_1_584d4a

Composite and S-video IO.

Camcorders with composite output

DCR-HC62_cw_1_540x435

Many camcorders have composite output. Often they will use a proprietary AV adapter cable. Classic Sony camcorders use an adapter cable with 3.5 mm plug connecting to 2 or 3 RCA plugs for composite video and audio.

Camcorders are the most versatile and interesting devices for synthesis. They are instant feedback machines. Just point the camera at the screen. Zoom for instant hallucinations. Some camcorders have filters that can be applied in real time.

VCRs

VCR-1200

Older VHS VCRs and tapes are inexpensive – sometimes free. They provide an infinite variety of input signals, as well as a variety of signal conversion functions. Look for one that has a remote control.

Ancient analog TVs

Portable TV

Use an RF modulator (see above) to watch composite video on a very old TV. You may also need an impedance transformer if the TV has screw terminals instead of a coax jack.

Here’s a Max patch running on a portable TV: https://reactivemusic.net/?p=18716

Atari Video Music c-240

atari-video-music-player-c240-c-240

The Atari Video Music generates geometric patterns from audio input. https://reactivemusic.net/?p=19004. It features a TV output, although it oddly uses an RCA plug. Use an RF demodulator (see above) or a VCR to convert it to composite.

BPMC  circuit bent analog video

premium3-600x398

http://glitchart.com

BPMC produces circuit bent versions of consumer video devices. I have been using the BPMC “Premium Cable” device. It is an amazing effects processor. It has composite IO (and S-video).

My First Sony

img_me-70058

A video version of MS Paint. It has composite output and is fun to draw with. The AC adaptor can be difficult to find, but it also works with D batteries.

alien

ep-426 Interactive Video – Spring 2015 week 12

(under construction)

Various topics

Screen Shot 2015-03-11 at 10.14.38 PM

Artists

Camille Utterback
Luke Dubois

http://www.lukedubois.com

Various examples: https://reactivemusic.net/?p=7769

Brian Eno
Tadej Droljc

Sonographic sound processing in Max https://reactivemusic.net/?p=16887

(Example of 3d speech processing at 4:12 in video)

local file: SSP-dissertation/4 – Max/MSP/Jitter Patch of PV With Spectrogram as a Spectral Data Storage and User Interface/basic_patch.maxpat

Try recording a short passage, then set bound mode to 4, and click autorotate

Blair Neal

Ancient history

Don Slepian 1983

http://en.wikipedia.org/wiki/Video_synthesizer

Chromascope https://reactivemusic.net/?p=18733

Franciszka and Stefan Themerson 1944/45

The Eye and the Ear: https://reactivemusic.net/?p=18707

The Outer Limits (TV) 1963

https://www.youtube.com/watch?v=8CtjhWhw2I8

https://www.youtube.com/watch?v=DY6y0zRCiOY

sound effects: https://www.youtube.com/watch?v=D_Ao7EKTNiA

Video

Sometimes we overlook easy paths:

More Jitter tricks

jit.ameba : https://reactivemusic.net/?p=18934

jit.cornerpin https://reactivemusic.net/?p=11838

jit.scope:

gifsyphon: https://reactivemusic.net/?p=18846

Brushtips: https://reactivemusic.net/?p=18835

Robomoves: https://reactivemusic.net/?p=18694

Silhouettes: https://reactivemusic.net/?p=18828

Analog modular video links:

links to analog video

 

ep-341 Max/MSP – Spring 2015 week 12

(under construction)

Algorithmic composition and generative music – part 1

Screen Shot 2015-04-13 at 11.09.59 PM

  • This week we will explore probability, randomness, and sequencing.
  • Next week we will look at audio reactive music.
  • In part 3 we will sonify data and gestures.

That’s a lot of buzzwords.

How do people compose music?
  • What is your composition tool/method of choice?
  • Dramatic shape https://reactivemusic.net/?p=17176
  • “How to Write a Song” By Henry Kane, 1962
  • “This is Your Brain on Music” by Daniel Levitin
3 approaches

Chris Dobrian  https://reactivemusic.net/?p=18914

  • systematic
  • intuitive
  • arbitrary
Framing the process of composition
Examples

The earliest examples of algorithmic composition applied mathematics to pitch, rhythm, harmony, and ensemble playing. MIDI was an ideal medium for mathematical transformations. The examples we look at today are for the most part MIDI based.
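Max patches aren’t text, but the kind of mathematical transformation that MIDI makes easy can be sketched in a few lines of Python. This is a hypothetical illustration (not taken from any patch shown in class) of the classic serial operations – transposition, inversion, and retrograde – applied to a list of MIDI note numbers:

```python
# Classic serial transformations on a row of MIDI note numbers.
# A hypothetical sketch -- not a transcription of any class patch.

def transpose(notes, semitones):
    """Shift every note by a fixed interval."""
    return [n + semitones for n in notes]

def invert(notes, axis):
    """Mirror each pitch around an axis note."""
    return [2 * axis - n for n in notes]

def retrograde(notes):
    """Play the row backwards."""
    return list(reversed(notes))

row = [60, 62, 65, 67]          # C, D, F, G
print(transpose(row, 7))        # up a fifth: [67, 69, 72, 74]
print(invert(row, 60))          # mirrored around middle C: [60, 58, 55, 53]
print(retrograde(row))          # [67, 65, 62, 60]
```

Because MIDI notes are just integers, any arithmetic on lists of them is immediately playable – which is why so much early algorithmic composition was MIDI based.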

Composing with Max
Markov chains
  • Fuji Kureta: MarkovChain 2-fuji-kureta.maxpat (using MIDI piano scores) debussy-cc4-format0.mid (The snow is dancing)
  • Mchain external by Richard Dudas – https://reactivemusic.net/?p=18926
    • text
    • interactive MIDI notes
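The idea behind a Markov chain melody generator can be sketched in a few lines of Python. This is a generic sketch – not a transcription of the Fuji Kureta patch or the Mchain external: build a table of which note follows which, then random-walk the table.

```python
# Minimal first-order Markov chain over MIDI note numbers.
# A generic sketch, not a port of any of the patches listed above.
import random
from collections import defaultdict

def build_chain(notes):
    """Map each note to the list of notes that follow it in the source."""
    chain = defaultdict(list)
    for a, b in zip(notes, notes[1:]):
        chain[a].append(b)
    return chain

def generate(chain, start, length, rng=random):
    """Random-walk the transition table to produce a new sequence."""
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:            # dead end: restart from the seed note
            followers = [start]
        out.append(rng.choice(followers))
    return out

melody = [60, 62, 64, 62, 60, 64, 65, 64, 62, 60]
chain = build_chain(melody)
print(generate(chain, 60, 16))
```

Because duplicate transitions are kept in the lists, common transitions in the source are proportionally more likely in the output – the same weighting a probability matrix would give you.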
References

Assignment

Design a generative music machine.

I would encourage you to collaborate, to use the work of other artists as a starting point, and to build a composition/performance tool that you would actually use.

This assignment will be due on the last class day of the semester (May 5th).

Kinect – revisited

Various ways Kinect 1 still runs in Mac OS with Max/MSP, Processing, and OSC.

Note: this is about the ‘old’ Kinects – not the latest version (Kinect 2) – although Dale Phurrough’s dp.kinect2 Max external works with Kinect 2 on Windows 8+.

(self portrait with Synapse)

Screen Shot 2015-04-04 at 4.38.25 PM

Synapse

Synapse converts skeletal data to OSC. It still runs with Max/MSP even though it is not supported. http://synapsekinect.tumblr.com

Here’s how to run with Max: http://synapsekinect.tumblr.com/post/6307752257/maxmsp-jitter

Don’t be surprised if Max crashes occasionally.
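Synapse delivers each joint as an OSC message – an address string like /righthand_pos_body followed by three floats. If you want to see what Max is receiving, a single message can be decoded with nothing but the Python standard library. This is a minimal sketch assuming the standard OSC 1.0 binary layout (4-byte-padded strings, big-endian arguments); the joint address used in the test is the form Synapse uses, but check the Synapse docs for the exact names and port.

```python
# Decode one OSC 1.0 message (address + type tags + int/float args).
# A minimal sketch for inspecting Synapse-style messages by hand.
import struct

def _read_string(data, pos):
    """Read a null-terminated OSC string, padded to a 4-byte boundary."""
    end = data.index(b'\x00', pos)
    s = data[pos:end].decode('ascii')
    pos = end + 1
    pos += (4 - pos % 4) % 4       # skip padding to the next 4-byte boundary
    return s, pos

def decode_osc(data):
    """Decode one OSC message into (address, [args]). Handles 'f' and 'i'."""
    address, pos = _read_string(data, 0)
    tags, pos = _read_string(data, pos)
    args = []
    for tag in tags.lstrip(','):
        if tag == 'f':
            args.append(struct.unpack('>f', data[pos:pos + 4])[0])
            pos += 4
        elif tag == 'i':
            args.append(struct.unpack('>i', data[pos:pos + 4])[0])
            pos += 4
    return address, args
```

To watch the live stream, pair this with a UDP socket bound to the port Synapse sends on, and call decode_osc on each datagram.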

Kinect-Via-Synapse

Max/MSP examples of skeletal tracking

https://github.com/jpbellona/Kinect-Via-Synapse

Processing:

Processing uses the OpenNI library. Use the Processing package manager to install or update it.

There are several built-in examples (under Contributed Libraries). Many of them work, including the “hands” example. Wave your hand a lot to get it to start tracking.

jit.openni and dp.kinect

http://hidale.com

Dale Phurrough’s free Max external jit.openni is no longer supported, and I have not yet found a Mac version that runs. The dp.kinect external runs only in Windows.

dp.kinect is for Kinect 1 and up to Windows 7. dp.kinect2 requires Windows 8+. More testing on the way. Note that dp.kinect is a commercial product.

https://cycling74.com/toolbox/dp-kinect-external-using-microsoft-kinect-sdk/

 jit.freenect

http://jmpelletier.com/freenect/

Screen Shot 2015-04-04 at 5.02.46 PM

Provides depth camera data as a Jitter matrix. Various modes, including IR.

A tutorial by Peter Elsea: ftp://arts.ucsc.edu/Pub/ems/electronic-contraptions/Max%20and%20Kinect.pdf

SimpleK

icon_512x512@2x-300x300

Jon Bellona OSC/Kinect libraries for Processing

https://cycling74.com/toolbox/simplekinect/

I was able to run the Processing sketch and receive OSC data on port 8000 in Max – but the UI is somewhat confusing and there is no camera input to monitor skeleton tracking. This probably would not be difficult to add to the sketch by looking at the SimpleOpenNI examples.


Physics simulation driven by audio

From tutorial 21b by dude837

https://www.youtube.com/watch?v=COl3ft1PPNU

Screen Shot 2015-03-23 at 1.46.04 AM

What’s different?

There was a problem with the spheres not returning to their resting position – they were constantly expanding outward. By removing the frame-rate trigger from qmetro, and triggering only when audio data is received, the response was improved. You can also adjust the signal amplitude going into the bonk~ object.

Screen Shot 2015-03-23 at 1.45.47 AM

Download

https://github.com/tkzic/max-projects

Folder: physics

Project: bumper-phsyics

External objects: bonk~ from: http://vud.org/max/

ep-341 Max/MSP – Spring 2015 week 9

Design a synthesizer in Max – Part 2

Screen Shot 2015-03-26 at 3.21.17 PM

Topics:
  • Waveform select
  • Recording
  • Polyphony
  • Presets
  • Max For Live
Click this link for Notes on Poly~, the topics above and patches for the synthesizer we built in class:
https://reactivemusic.net/?p=18573

Miscellaneous

The hi object: for reading human interface devices (like game controllers). Use the code in the Max help file to get started.

Boids sonification: https://reactivemusic.net/?p=18388

Random walk sonification: https://reactivemusic.net/?p=18455
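The random-walk idea behind that kind of patch can be sketched outside Max. This is a generic Python illustration (not the linked patch): take a drunk walk over scale degrees, clamp it to a range, and map the degrees to MIDI note numbers.

```python
# Drunk-walk melody generator over a major scale.
# A generic sketch of the random-walk idea, not a port of the linked patch.
import random

MAJOR = [0, 2, 4, 5, 7, 9, 11]   # scale degrees in semitones

def random_walk_melody(steps, start_degree=0, octave=5, rng=random):
    """Walk up/down/hold over scale degrees; return MIDI note numbers."""
    degree = start_degree
    notes = []
    for _ in range(steps):
        degree += rng.choice([-1, 0, 1])                  # step down, hold, or up
        degree = max(0, min(len(MAJOR) * 2 - 1, degree))  # clamp to two octaves
        oct_offset, idx = divmod(degree, len(MAJOR))
        notes.append(12 * (octave + oct_offset) + MAJOR[idx])
    return notes

print(random_walk_melody(8))
```

Constraining the walk to a scale is what makes the output sound like a melody rather than noise – the same trick a Max patch would do with a table of scale degrees before a makenote object.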

Questions:

How would you design a patch to automatically select (trim) a sound from a buffer that contained silence and background noise as well as music?
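One way to approach that question outside of Max (a hypothetical sketch, not a patch): slide a window over the samples, measure RMS energy per window, and keep everything between the first and last window that clears a noise threshold.

```python
# Energy-threshold trimming of silence/background noise from a recording.
# A hypothetical sketch of one answer to the question above.

def trim_silence(samples, window=1024, threshold=0.02):
    """Return the slice of samples between the first and last
    window whose RMS exceeds threshold (samples are floats in -1..1)."""
    loud = []
    for start in range(0, len(samples), window):
        chunk = samples[start:start + window]
        rms = (sum(x * x for x in chunk) / len(chunk)) ** 0.5
        if rms > threshold:
            loud.append(start)
    if not loud:
        return []                 # nothing above the noise floor
    return samples[loud[0]:min(loud[-1] + window, len(samples))]
```

The same scan could be done over a buffer~ in Max; the tuning problem is the same either way – the threshold has to sit above the background noise but below the quietest music.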

How do you use wireless controllers, like a wiimote, with Max?

Assignment

Demonstrate M4L devices in class next week.

ep-426 – Interactive video – Spring 2015 week 9

(totally under construction)

Various connections

Forward-Biased_pn_Junction.svg

 

Jitter + Processing

Example by Peter Wiessenthaner https://reactivemusic.net/?p=18529

Download Processing

https://processing.org

Examples: https://reactivemusic.net/?p=18655

OSC control of Processing with Max

Visualization of an audio signal from Max.

Tutorial by dude837

Max patch and processing sketch: http://otherbirds.com/tutorials (Processing 1: Deception)

Sending video from Processing to Jitter with Syphon

Screen Shot 2015-03-21 at 10.19.29 PM

Processing
  • add the “Syphon” library to Processing (sketch | import library | add library)
  • Load the “sendscreen” example sketch (file | examples | contributed libraries | syphon).
Max
Processing
  • Run the sendscreen sketch – you should see the output in both a Processing and a Jitter window.

Youtube in Max

Cefwithsyphon for Max (Web video streaming): https://reactivemusic.net/?p=11371

Video screen capture in Max

jit.desktop

p5.js and p5js.sound

Processing in a Web browser.

Miscellaneous

Assignment:

Next week we will present mid-term projects in class

ep-426 – Interactive video – Spring 2015 week 8

Screen Shot 2015-03-23 at 1.59.10 AM

Miscellaneous topics

(In addition to looking at student work in progress.)

Ableton Push as low resolution video display: https://reactivemusic.net/?p=18473

Jitter Open GL Physics example fixed (dude837 21b “bumper”) https://reactivemusic.net/?p=18585

Random walk synthesizer: https://reactivemusic.net/?p=18455

Basis functions using jit.bfg – an alternative to ordinary noise. https://reactivemusic.net/?p=18594

Not covered:

Rokvid M4L (Adam Rokhsar)

The difference between Jitter video matrices and Open GL matrices

image playlists (picturething2)

opening up vizzie and bpatchers in general