Jitter example patches that respond to audio amplitude
(in Max/MSP examples/jitter-examples/audio)
By cycling74.com
Maxtunes
Maxtunes2
Using video to generate audio.
jit.peek~ outputs an audio signal from video matrix data.
(jit.peek~ help)
from cycling74.com
At any point in the process, matrix data can be tapped to produce an audio signal. Analyzing video is much like analyzing audio: techniques such as envelope following and spectral analysis apply to both. Here are some examples.
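The core idea behind jit.peek~ can be sketched outside of Max: treat one row of a greyscale video matrix as a wavetable and read it back at signal rate. This is a rough conceptual sketch in Python; the names and scaling are illustrative, not the actual Jitter API.

```python
# Conceptual sketch of jit.peek~: index into a matrix row at signal
# rate and scale 8-bit char data (0..255) to a -1..1 audio signal.
# Names here are illustrative, not part of the Jitter API.
import math

WIDTH = 64  # matrix width in cells

# Fake one row of 8-bit greyscale video data (a sine shape, for illustration)
row = [int(127.5 + 127.5 * math.sin(2 * math.pi * x / WIDTH)) for x in range(WIDTH)]

def peek(row, phase):
    """Read one cell, wrapping the index, and scale it to -1..1."""
    cell = row[int(phase) % len(row)]
    return cell / 127.5 - 1.0

# Scan across the row to produce one cycle of an audio waveform
samples = [peek(row, i) for i in range(WIDTH)]
```

In the actual patches, the phase comes from a signal ramp (e.g. phasor~), so the scan rate sets the pitch.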
1. jit.peek~-additivesynth.maxpat (in Max/MSP examples/jitter-examples/audio)
Draw a song
By Adam Florin and Joshua Kit Clayton
2. Frame subtraction
By Adam Rokhsar
https://reactivemusic.net/?p=7005
3. Sound Emotion2 – using Macbook built-in camera
By Andreas Wittich
https://reactivemusic.net/?p=9225
4. jit.peek~-osctrack.maxpat (in Max/MSP examples/jitter-examples/audio)
Derives rhythmic audio data from video. (Using bball.mov)
By cycling74.com
5. Whispering Heights (in Max/MSP examples/jitter-examples/3rd-party/image-to-spectral-filter/whispering_heights.maxpat)
Similar to above but uses video to create a moving spectral filter.
6. Wolframatic.maxpat (in Max/MSP examples/jitter-examples/other/Wolframatic/)
A fractal generator that also generates audio.
By R. Luke DuBois
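The frame-subtraction idea used in examples 2 and 4 above can be sketched simply: the mean absolute difference between successive greyscale frames gives an amplitude envelope that can drive audio. This is an illustrative sketch, not the actual patch logic.

```python
# Sketch of frame subtraction as an envelope follower for video:
# the mean absolute per-pixel difference between two frames,
# scaled to 0..1 for 8-bit data. Names are illustrative.

def frame_energy(prev, curr):
    """Mean absolute per-pixel difference, normalized to 0..1."""
    total = sum(abs(a - b) for a, b in zip(prev, curr))
    return total / (255.0 * len(curr))

# Two fake 4x4 greyscale frames, flattened to lists
frame_a = [0] * 16
frame_b = [0] * 12 + [255] * 4   # bottom row changes completely

env = frame_energy(frame_a, frame_b)   # 4 of 16 pixels changed fully -> 0.25
```

Mapped to a gain or an event threshold, this value responds to motion the way an envelope follower responds to loudness.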
Video projection mapping using [jit.gl.cornerpin]
https://github.com/tkzic/max-projects
folder: vizzie-projection-map
patch: cornerpin-test.maxpat
The Vizzie RECORDR module records video only. Use [jit.vcr] to record both audio and video.
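The math behind corner pinning is a projective map from the unit square onto an arbitrary quad (Heckbert's square-to-quad homography). A minimal sketch, with variable names of my own choosing rather than anything from [jit.gl.cornerpin]:

```python
# Square-to-quad projective mapping (the corner-pin transform).
# p0..p3 are the pinned corners for unit-square coordinates
# (0,0), (1,0), (1,1), (0,1) respectively.

def corner_pin(p0, p1, p2, p3):
    """Return warp(u, v) mapping the unit square onto the quad."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = p0, p1, p2, p3
    dx1, dy1 = x1 - x2, y1 - y2
    dx2, dy2 = x3 - x2, y3 - y2
    dx3, dy3 = x0 - x1 + x2 - x3, y0 - y1 + y2 - y3
    if abs(dx3) < 1e-12 and abs(dy3) < 1e-12:   # quad is a parallelogram: affine
        g = h = 0.0
    else:
        denom = dx1 * dy2 - dx2 * dy1
        g = (dx3 * dy2 - dx2 * dy3) / denom
        h = (dx1 * dy3 - dx3 * dy1) / denom
    a, b, c = x1 - x0 + g * x1, x3 - x0 + h * x3, x0
    d, e, f = y1 - y0 + g * y1, y3 - y0 + h * y3, y0

    def warp(u, v):
        w = g * u + h * v + 1.0
        return (a * u + b * v + c) / w, (d * u + e * v + f) / w
    return warp

# Pin a texture onto a trapezoid (e.g. a wall seen at an angle)
warp = corner_pin((0, 0), (4, 0), (3, 2), (1, 2))
```

In practice the GPU does this per-fragment; the sketch just shows where each texture coordinate lands.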
Reverse engineering the “Bloom” logo
https://github.com/tkzic/max-projects
folder: blend
patch: circleblender2.maxpat
Live chroma-key example using the built-in camera.
Hacked from the Jitter help files…
https://github.com/tkzic/max-projects
folder: chroma-key
patch: greenScreen1.maxpat
This really works best with an external camera and a large monochromatic background.
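The keying itself reduces to a per-pixel substitution: where a foreground pixel is close enough to the key color, use the background pixel instead. The threshold and names below are illustrative, not the patch's actual parameters.

```python
# Rough per-pixel chroma key: substitute the background wherever the
# foreground pixel is within TOLERANCE of the key color.
# KEY and TOLERANCE are illustrative values, not the patch's settings.

KEY = (0, 255, 0)        # pure green
TOLERANCE = 120          # max RGB distance still treated as "green screen"

def dist(a, b):
    """Euclidean distance between two RGB tuples."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def key_pixel(fg, bg):
    """Return bg where fg matches the key color, else fg."""
    return bg if dist(fg, KEY) < TOLERANCE else fg

# foreground: one green-screen pixel and one "actor" pixel
frame_fg = [(10, 240, 20), (200, 120, 90)]
frame_bg = [(0, 0, 255), (0, 0, 255)]
out = [key_pixel(f, b) for f, b in zip(frame_fg, frame_bg)]
```

A monochromatic, evenly lit background matters because every background pixel must fall inside the tolerance sphere around the key color.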
Stream real-time video from a web browser into Max or any program that uses Syphon.
CefWithSyphon (developed by Vibeke Bertelsen) launches a web browser and a Syphon server. The Max patch runs a Syphon client that receives the video stream from the server and makes it available to Jitter objects.
https://github.com/tkzic/max-projects
folder: web-video-streaming
patch: jit.gl.syphonclient.maxhelp
Note: externals are included with max-projects but can also be downloaded here: http://syphon.v002.info
Download the CefWithSyphon app from here: https://github.com/vibber/CefWithSyphon – A Mac OS binary is available and has been tested with Mac OS 10.9.2
By Wesley Smith and Graham Wakefield
Tutorial and externals for Max (Jitter) that produce visualization and sonification of scientific data.
How many musicians do you know who play with their eyes closed? Not many computer music apps allow this. Bloom is an exception. http://www.generativemusic.com/bloom.html
As an exercise, I tried to make something like Bloom using Leap Motion. With your eyes closed you can accurately position your hand at the level of your eyes, shoulders, hips, etc., and you can quickly move to a point directly outward from your nose or shoulders. This is the basis of sobriety tests.
The interface works with a hand motion like sprinkling seeds. Every time you open your hand, it triggers a note based on the height of your hand. It also triggers one of the “Bloom” circles at the XY position.
The prototype was done in Max/MSP Jitter. It was derived from a “bloom clone” project by John Mayrose at: http://www.johnmayrose.com/maxmsp.html
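The note-trigger mapping described above can be sketched as quantizing the normalized hand height to a pitch in a fixed scale. The major-pentatonic scale here is my assumption for illustration, not necessarily what the patch or Bloom uses.

```python
# Illustrative mapping from normalized hand height (0..1) to a MIDI
# note, quantized to a scale. Scale choice and base note are my
# assumptions, not taken from the patch.

SCALE = [0, 2, 4, 7, 9]        # major pentatonic degrees (semitones)
BASE_NOTE = 48                 # C3 in MIDI

def height_to_note(height, octaves=2):
    """Map a normalized hand height to a MIDI note on the scale."""
    steps = len(SCALE) * octaves
    i = min(int(height * steps), steps - 1)
    octave, degree = divmod(i, len(SCALE))
    return BASE_NOTE + 12 * octave + SCALE[degree]

# hand fully lowered, mid-height, fully raised
notes = [height_to_note(h) for h in (0.0, 0.5, 1.0)]
```

Quantizing to a scale is what makes eyes-closed playing forgiving: nearby hand positions land on consonant notes.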
Here’s an example:
https://github.com/tkzic/max-projects
folder: bloom
patches
Note: If you don’t have a Leap Motion sensor, you can use a mouse.
If you are using Leap Motion, download the aka.leapmotion external and add it to your Max file path in Options | File Preferences: http://akamatsu.org/aka/max/objects/
(if not using Leap Motion sensor, skip to step 4)