The future of human-computer interaction: using sensors to detect movement.
By Meredith Levinson at cio.com
http://www.cio.com/slideshow/detail/20266/The-Future-of-Human-Computer-Interfaces#slide1
Light-controlled dubstep bass, using Ableton Live and Arduino.
At Synthtopia
http://www.synthtopia.com/content/2012/08/08/light-controlled-dubstep-bass/
Max/MSP example patches.
By Chris Muir
http://www.xfade.com/max/examples/
By Arne Eigenfeldt
http://www.sfu.ca/~eigenfel/software.html
If This Then That (IFTTT). An internet sensor/switching system.
For example, you can receive an email whenever BuzzFeed posts about cats, trigger a Belkin WeMo switch, or fire an Arduino sketch via Twitter.
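There is no direct IFTTT-to-Arduino channel, so the usual glue is a small script on your own machine that watches for whatever IFTTT produces (an email, a file in Dropbox, and so on) and pokes the Arduino over serial. A minimal sketch of that last hop in Python with pyserial; the port name and the trigger check are placeholders, not anything IFTTT provides:

    import os.path
    import time
    import serial  # pyserial

    # Port name is an assumption: /dev/tty.usbmodemXXXX on a Mac,
    # /dev/ttyACM0 on Linux. Match the baud rate to your Arduino sketch.
    arduino = serial.Serial("/dev/ttyACM0", 9600)

    def triggered():
        # Placeholder: check whatever IFTTT touched (new mail, a dropped file...).
        return os.path.exists("/tmp/ifttt_trigger")

    while True:
        if triggered():
            arduino.write(b"1")  # the Arduino sketch reads this byte and reacts
            time.sleep(5)        # crude debounce
        time.sleep(1)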
Using Syphon to get Max/MSP visuals into Quartz Composer.
By Iyad Assaf
OSC interface for Max and Ableton Live
By Ryan Challinor
http://synapsekinect.tumblr.com/post/6307790318/synapse-for-kinect
Synapse is an open source skeleton driver that sends out skeleton data via OSC. The data can then be processed in Max or M4L. There are interesting M4L devices available in a set called dubkinect that demonstrate body movement controlling MIDI devices and triggering clips.
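To poke at what Synapse actually sends, here is a minimal sketch that listens for the skeleton stream outside of Max, in Python with the python-osc package. The ports (Synapse streams to 12345 and listens for requests on 12346), the /righthand_pos_body address, and the keep-alive requests match Synapse's defaults as I understand them; treat all of them as assumptions and check against your install.

    import threading
    from pythonosc import osc_server, udp_client
    from pythonosc.dispatcher import Dispatcher

    # Assumed Synapse defaults: it listens for requests on 12346 and
    # streams joint data back to 12345.
    synapse = udp_client.SimpleUDPClient("127.0.0.1", 12346)

    def keepalive():
        # Synapse stops streaming a joint after a few seconds unless the
        # "track" request is repeated; 1 asks for body-relative coordinates.
        synapse.send_message("/righthand_trackjointpos", 1)
        t = threading.Timer(2.0, keepalive)
        t.daemon = True
        t.start()

    def on_hand(address, x, y, z):
        # Joint positions arrive as three floats (millimeters, by my reading).
        print(f"{address}: x={x:.1f} y={y:.1f} z={z:.1f}")

    dispatcher = Dispatcher()
    dispatcher.map("/righthand_pos_body", on_hand)

    keepalive()
    osc_server.BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher).serve_forever()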
Using this technique I was able to create a 3D air piano today. You can send the Kinect data from Synapse into Quartz Composer at the same time the M4L audio program is running.
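The mapping at the heart of the air piano is just quantizing hand height onto a scale. A rough sketch of that one step (the -400 to +400 mm body-frame range is my assumption; measure your own):

    # Major-pentatonic intervals; any scale works here.
    SCALE = [0, 2, 4, 7, 9]

    def hand_y_to_midi_note(y_mm, lo=-400.0, hi=400.0, base_note=48, octaves=3):
        """Map a hand height in millimeters to a MIDI note number."""
        t = max(0.0, min(1.0, (y_mm - lo) / (hi - lo)))  # normalize to 0..1
        step = int(t * (len(SCALE) * octaves - 1))       # choose a scale step
        octave, degree = divmod(step, len(SCALE))
        return base_note + 12 * octave + SCALE[degree]

    print(hand_y_to_midi_note(0.0))  # hand at torso height -> 64 (E4)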
The Beat Wheel (by Ryan Challinor): http://youtu.be/napIffdEHdk
code: http://wiki.musichackday.org/index.php?title=Kinect_BeatWheel
Note: as of this 30th of July, the code crashed in Max 6 but worked fine in Max 5…
(Edit 9/24/2012) Just started looking at the BeatWheel patch – I need to parse out the hand-movement handling and also look at the Max for Live dubkinect devices to get a better sense of what the OSC commands are and the data ranges for various movements (a quick range logger is sketched below).
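To get at the data-range question empirically, here is a catch-all logger (python-osc again, same assumed port 12345) that records the min/max seen for every address Synapse emits. Run it alongside something that sends Synapse its "track" requests, like the sketch above:

    from pythonosc import osc_server
    from pythonosc.dispatcher import Dispatcher

    ranges = {}  # address -> (per-argument mins, per-argument maxes)

    def log_any(address, *args):
        vals = [float(a) for a in args if isinstance(a, (int, float))]
        lo, hi = ranges.setdefault(address, (list(vals), list(vals)))
        for i, v in enumerate(vals[:len(lo)]):
            lo[i] = min(lo[i], v)
            hi[i] = max(hi[i], v)
        print(address, "min", lo, "max", hi)

    dispatcher = Dispatcher()
    dispatcher.set_default_handler(log_any)  # catch every address pattern
    osc_server.BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher).serve_forever()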
A Kinect demo that pretty much explains everything…
By Emily Gobeille and Theo Watson. Article by John Baichtal at Make.