The Biggest Optical Feedback Loop in the World
By Blair Neal
http://blairneal.com/blog/the-biggest-optical-feedback-loop-in-the-world/
Animated GIFs, Syphon, open source.
http://vdmx.vidvox.net/blog/giftosyphon
Yes I’m thinking the same thing. Cat videos in Max!
Uses Giphy: http://giphy.com
By Blair Neal
Note: The Max patch crashes Max 6 and 7 as soon as the camera detects movement. This message precedes the crash: “our pool count isn’t zero (3), must be calling from the wrong thread…living dangerously”
From the same author: http://blairneal.com/blog/were-live-live-tv-face-substitution/
Using Syphon and VDMX
Painting in Max/MSP with various brush styles.
By Nesa Popov
From this post: https://cycling74.com/forums/topic/live-drawing-w-jitter-wacom-tablet/
The brush styles are contained in the frames of a Quicktime movie.
Also see “Harmony drawing App” https://reactivemusic.net/?p=3806
This patch was copied from the Cycling74 forum and is distributed without permission of the author.
https://github.com/tkzic/max-projects
Folder: brushtips
patch: brushtips.maxpat
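The original is a Max/Jitter patch, but the stamping idea behind it can be sketched in a few lines of Python. This is an illustration only, not code from the patch; the brush bitmap and all names are made up. Each "frame" of the brush movie becomes a small grayscale bitmap that gets stamped at interpolated points along a stroke.

```python
# Hypothetical sketch of frame-based brush painting (not from the patch).

def stamp(canvas, brush, cx, cy):
    """Stamp a grayscale brush onto the canvas, centered at (cx, cy),
    blending with max() so overlapping stamps accumulate."""
    bh, bw = len(brush), len(brush[0])
    for by in range(bh):
        for bx in range(bw):
            y = cy - bh // 2 + by
            x = cx - bw // 2 + bx
            if 0 <= y < len(canvas) and 0 <= x < len(canvas[0]):
                canvas[y][x] = max(canvas[y][x], brush[by][bx])

def stroke(canvas, brush, x0, y0, x1, y1, steps=20):
    """Interpolate points along a line and stamp the brush at each one."""
    for i in range(steps + 1):
        t = i / steps
        stamp(canvas, brush,
              round(x0 + t * (x1 - x0)),
              round(y0 + t * (y1 - y0)))

canvas = [[0] * 16 for _ in range(8)]
brush = [[0, 255, 0],
         [255, 255, 255],
         [0, 255, 0]]          # one "frame" of the brush movie
stroke(canvas, brush, 2, 4, 13, 4)
```

Swapping the brush list for another frame of the movie changes the stroke texture, which is the trick the patch uses.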
Various methods of separating foreground from background.
By Andrew Benson
https://cycling74.com/2009/10/26/making-connections-camera-data/
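One of the simplest techniques in this family is frame differencing. A toy Python sketch of the idea (mine, not code from the article): pixels whose brightness changed by more than a threshold between two frames count as foreground.

```python
# Frame differencing: a toy version of what Jitter patches do with
# jit.op @op absdiff followed by a threshold.

def foreground_mask(prev, curr, thresh=30):
    """Return a binary mask: 1 where |curr - prev| > thresh."""
    return [[1 if abs(c - p) > thresh else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

background = [[100, 100, 100],
              [100, 100, 100]]
frame      = [[100, 200, 100],
              [100, 100,  90]]
mask = foreground_mask(background, frame)
# only the pixel that changed by more than the threshold is foreground
```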
By Jean-Marc Pelletier
Extract edge pixels from a binary image
Isolate a single connected component from a binary image
(a sock-monkey)
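What these two operations do can be sketched in Python (an illustration of the algorithms, not Pelletier's code): an edge pixel is a set pixel with at least one unset 4-neighbor, and a single connected component is isolated by flood-filling from a seed pixel.

```python
from collections import deque

def edges(img):
    """Edge pixels of a binary image: set pixels with at least one
    unset 4-neighbor (or lying on the image border)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if img[y][x]:
                nbrs = [(y-1, x), (y+1, x), (y, x-1), (y, x+1)]
                if any(not (0 <= ny < h and 0 <= nx < w) or not img[ny][nx]
                       for ny, nx in nbrs):
                    out[y][x] = 1
    return out

def component(img, seed):
    """Isolate the 4-connected component containing `seed` by flood fill."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    q = deque([seed])
    while q:
        y, x = q.popleft()
        if 0 <= y < h and 0 <= x < w and img[y][x] and not out[y][x]:
            out[y][x] = 1
            q.extend([(y-1, x), (y+1, x), (y, x-1), (y, x+1)])
    return out

img = [[1, 1, 1, 0],
       [1, 1, 1, 0],
       [1, 1, 1, 0],
       [0, 0, 0, 1]]
```

Here `edges(img)` hollows out the 3x3 blob, and `component(img, (0, 0))` keeps the blob while dropping the lone pixel in the corner.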
Synapse http://synapsekinect.tumblr.com
by Ryan Challinor
(Using jit.synapse from: http://synapsekinect.tumblr.com/post/6307752257/maxmsp-jitter – Can you find the sock monkey?)
“Commissioned for the BBC’s make it digital event, the brief was ‘to get children into code’. My installation downloaded the event’s twitter feed in real time and displayed the page’s body text inside the bodies of passing people. Moving their hands around allowed people to scroll through the html/js/CSS.”
By Robin Price
http://robinprice.net/2015/03/do-you-see-yourself-in-code/
This project uses:
“From crystal set to stereo”
By Miomir Filipovic
Free online book: http://www.mikroe.com/old/books/rrbook/rrbook.htm
A soundscape that responds to color.
By Helen Trevillion
The Max patch is not available. From the video it appears that many channels of sound are playing concurrently. Color values are assigned to faders for each channel.
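Since the patch is unavailable, here is a hypothetical reconstruction of that mapping in Python. The function and the scaling are guesses: average the red, green, and blue of a frame and use each mean, scaled to 0.0-1.0, as the fader gain for one channel.

```python
# Hypothetical color-to-fader mapping (not from Trevillion's patch).

def channel_gains(frame):
    """frame: list of rows of (r, g, b) tuples, 0-255 per channel.
    Returns (red_gain, green_gain, blue_gain) in 0.0-1.0."""
    n = 0
    totals = [0, 0, 0]
    for row in frame:
        for r, g, b in row:
            totals[0] += r
            totals[1] += g
            totals[2] += b
            n += 1
    return tuple(t / n / 255.0 for t in totals)

frame = [[(255, 0, 0), (255, 0, 0)],
         [(0, 0, 0), (0, 0, 255)]]
gains = channel_gains(frame)
```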
Kinect and Max/MSP
By Zachary Seldess
Uses jit.freenect and cv.jit by Jean-Marc Pelletier
Various ways to run Kinect 1 on Mac OS with Max/MSP, Processing, and OSC.
Note: this covers the ‘old’ Kinect 1, not the newer Kinect 2. However, Dale Phurrough’s dp.kinect2 Max external works with Kinect 2 on Windows 8+.
(self portrait with Synapse)
Synapse converts skeletal data to OSC. It still runs with Max/MSP even though it is no longer supported. http://synapsekinect.tumblr.com
Here’s how to run with Max: http://synapsekinect.tumblr.com/post/6307752257/maxmsp-jitter
Don’t be surprised if Max crashes occasionally.
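Synapse streams joint positions as OSC messages such as /righthand with three floats. The OSC wire format is simple enough to build by hand with the standard library: an address string padded to a 4-byte boundary, a type-tag string, then big-endian float32s. The joint values below are made up; this is a sketch, not Synapse's code.

```python
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC message with float32 arguments."""
    def pad(b):
        # OSC strings are NUL-terminated and padded to 4-byte boundaries
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)   # big-endian float32
    return msg

# A joint-position message in the shape Synapse sends (values invented):
packet = osc_message("/righthand", 120.0, -45.5, 900.0)
# could be sent with socket.sendto(packet, ("127.0.0.1", 12345))
```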
Kinect-Via-Synapse
Max/MSP examples of skeletal tracking
https://github.com/jpbellona/Kinect-Via-Synapse
Processing uses the SimpleOpenNI library. Use the Processing package manager to install or update it.
There are several built-in examples (under Contributed Libraries). Many of them work, including the “hands” example. Wave your hand a lot to get it to start tracking.
Dale Phurrough’s free Max external jit.openni is no longer supported. I have not yet been able to find a Mac version that runs. The dp.kinect external runs only in Windows.
dp.kinect is for Kinect 1 and up to Windows 7. dp.kinect2 requires Windows 8+. More testing on the way. Note that dp.kinect is a commercial product.
https://cycling74.com/toolbox/dp-kinect-external-using-microsoft-kinect-sdk/
http://jmpelletier.com/freenect/
Provides depth camera data as a Jitter matrix. Various modes, including IR.
A tutorial by Peter Elsea: ftp://arts.ucsc.edu/Pub/ems/electronic-contraptions/Max%20and%20Kinect.pdf
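As a rough illustration of turning that depth matrix into a grayscale image, assuming 11-bit raw depth values (0-2047, which is what Kinect 1 reports), each value can be scaled to 8 bits. This is a sketch of the scaling step only, not jit.freenect's code.

```python
def depth_to_gray(depth, max_raw=2047):
    """Scale raw depth values (assumed 0..max_raw) to 0-255 grayscale."""
    return [[round(v * 255 / max_raw) for v in row] for row in depth]

gray = depth_to_gray([[0, 1023, 2047]])
```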
Jon Bellona OSC/Kinect libraries for Processing
https://cycling74.com/toolbox/simplekinect/
I was able to run the Processing sketch and receive OSC data on port 8000 in Max, but the UI is somewhat confusing and there is no camera input for monitoring skeleton tracking. That would probably not be difficult to add to the sketch, using the SimpleOpenNI examples as a guide.
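For reference, receiving and decoding OSC floats takes only the standard library. A hedged Python sketch of a listener on port 8000 (a real project would use an OSC library); the sample packet at the end exercises the parser without opening a socket.

```python
import socket
import struct

def read_string(data, i):
    """Read a NUL-terminated, 4-byte-padded OSC string starting at i."""
    end = data.index(b"\x00", i)
    return data[i:end].decode("ascii"), (end + 4) & ~3

def parse_osc(packet):
    """Parse one OSC message carrying only float32 arguments."""
    address, i = read_string(packet, 0)
    typetags, i = read_string(packet, i)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":
            args.append(struct.unpack(">f", packet[i:i+4])[0])
            i += 4
    return address, args

def listen(port=8000):
    """Receive and print OSC messages on the given UDP port (runs forever)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _ = sock.recvfrom(4096)
        print(parse_osc(data))

# Parse a hand-built packet instead of opening a socket here:
sample = b"/hand\x00\x00\x00,ff\x00" + struct.pack(">ff", 0.25, 0.75)
addr, vals = parse_osc(sample)
```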