by Rich Holoch
Use one of these to replace a potentiometer with an Arduino, for example
Another approach is gluing an LED to an LDR (a homemade vactrol)…
Voltage controlled variable capacitors
Also called varicaps, these are diodes operated in reverse bias. As the reverse voltage increases, the capacitance decreases.
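The voltage-to-capacitance relationship follows the standard junction-capacitance model, C(V) = C0 / (1 + V/Vj)^n. The values below (100 pF zero-bias capacitance, 0.7 V junction potential, 0.5 grading coefficient) are illustrative defaults for an abrupt junction, not from any specific datasheet:

```python
# Sketch of the standard junction-capacitance model for a varactor:
#   C(V) = C0 / (1 + V/Vj)**n
# c0, vj, and n are illustrative values, not from a real datasheet.

def varactor_capacitance(v_reverse, c0=100e-12, vj=0.7, n=0.5):
    """Capacitance in farads at a reverse bias of v_reverse volts."""
    return c0 / (1.0 + v_reverse / vj) ** n

for v in (0.0, 1.0, 4.0, 9.0):
    print(f"{v:4.1f} V -> {varactor_capacitance(v) * 1e12:6.1f} pF")
```

Running this shows the capacitance falling as the tuning voltage rises, which is exactly what makes a varactor usable in place of a mechanical tuning capacitor.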
Tutorial by Phil Atchley, KO6BB http://www.qsl.net/ko6bb/varactor.html
Using Varactors by Stefan Hollos and Richard Hollos: http://www.exstrom.com/journal/varac/varac.pdf
Tutorial by Ian Poole at Radio-Electronics http://www.radio-electronics.com/info/data/semicond/varactor-varicap-diodes/basics-tutorial.php
Another tutorial from Radio-Electronics: http://www.radio-electronics.com/info/data/semicond/varactor-varicap-diodes/circuits.php
Varactor tuned regenerative radio by Tony G4WIF “The Two Dollar Regen” : http://www.cqham.ru/forum/attachment.ph … ntid=28430 More information here: http://theradioboard.com/rb/viewtopic.php?t=3568
A compilation of Pd help screens.
Slice//Jockey is an interactive audio performance instrument
by Katja Vetter
Click any of the screenshots to see a full size image.
The x-y origin is in the lower left corner. In the x plane numbers increase from left to right. In the y plane numbers increase from bottom to top.
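Most graphics toolkits put the origin at the top left with y increasing downward, so it's easy to trip over Slice//Jockey's lower-left convention. A one-line conversion (a hypothetical helper, just to illustrate the convention) maps between the two:

```python
def screen_to_slicejockey(x, y_screen, panel_height):
    """Convert a top-left-origin screen coordinate (y increasing
    downward) to a lower-left-origin coordinate (y increasing
    upward), as used in Slice//Jockey's panels."""
    return x, panel_height - y_screen

# A point 10 px below the top of a 100 px tall panel:
print(screen_to_slicejockey(25, 10, 100))  # -> (25, 90)
```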
Notes on notes:
Slice units left and right
Algorithmic composition and generative music – part 2
With reactive music, audio is the input. Music is the output. Music can also be the input.
from Wikipedia: http://en.wikipedia.org/wiki/RjDj
“Reactive music, a non-linear form of music that is able to react to the listener and his environment in real-time. Reactive music is closely connected to generative music, interactive music, and augmented reality. Similar to music in video-games, that is changed by specific events happening in the game, reactive music is affected by events occurring in the real life of the listener. Reactive music adapts to a listener and his environment by using built in sensors (e.g. camera, microphone, accelerometer, touch-screen and GPS) in mobile media players. The main difference to generative music is that listeners are part of the creative process, co-creating the music with the composer. Reactive music is also able to augment and manipulate the listeners real-world auditory environment.
What is distributed in reactive music is not the music itself, but software that generates the music…”
Uses dummy clips to apply rhythmic envelopes and effects to ambient sound: http://reactivemusic.net/?p=2658
Making music from sounds that are not music.
by Katja Vetter
InstantDecomposer is an update of Slice//Jockey. It has not been released publicly. Slice//Jockey runs on Mac OS, Windows, and Linux – including Raspberry Pi.
Slice//Jockey help: http://reactivemusic.net/?p=19065
Slice//Jockey is written in Pd (PureData) – open source – the original Max.
By Miller Puckette
Local file reference
A music factory.
By Christopher Lopez
As of iOS 8.2, Dark Knight crashes on load. Inception only works with “Reverie Dream” (lower left corner)
Though RjDj is a lost relic in 2015, it still works in Pd. The example scenes used here are meant to run under libpd in iOS or Android, but they will also work in Mac OS X.
First, use Pd-extended. (OK, maybe you don’t need to.)
1. Read the article from Makezine by Mike Dixon http://makezine.com/2008/11/03/howto-hacking-rjdj-with-p/
2. Download sample scenes from here: http://puredata.info/docs/workshops/MobileArtAndCodePdOnTheIPhone The link is under the heading “RJDJ Sources”
3. Download RJLIB from here: https://github.com/rjdj/rjlib
4. Add these folders in RJLIB to your Pd path (in preferences)
5. Now, try running the scene called “echelon” from the sample scenes you downloaded. It should be in the folder rjdj_scenes/Echelon.rj/_main.pd
Note: with Pd-extended 0.43-4 the error message: “base: no method for ‘float'” fills the console as soon as audio is turned on.
The ones marked with a * seem to work well without modification or an external GUI. They all produce some error messages – and they really are meant to work on mobile devices, so a lot of the sensor input isn’t there.
to be continued…
Including stuff explained above.
I’m thinking of something: http://imthinkingofsomething.com
USB webcam support for Mac OS X.
Vintage consumer electronics, adapter cables, converters, and circuit bending.
Note: everything here refers to NTSC which is the standard video format in the USA.
Composite uses the yellow RCA plugs found on most consumer video devices. It is also the standard for LZX analog modular systems http://www.lzxindustries.net.
Because composite is the most common format, the approach here will be to convert everything else to composite, with a few exceptions.
S-video carries luminance and chrominance on separate channels. The easiest way to convert S-video to composite, or vice versa, is to use a VCR or capture device that has S-video IO.
Older video devices use coaxial cables with F connectors to transmit modulated AV signals on channels 3 and 4.
To convert from composite to TV use an RF modulator:
You can also use a VCR as an RF modulator.
RF modulators can be used as TV transmitters. http://reactivemusic.net/?p=12355
To convert from TV to composite, use an RF demodulator. The demodulator removes the VHF carrier, leaving a baseband composite signal.
Demodulators can be expensive and difficult to find, so if you are not too concerned about image quality it's easier to use a VCR. Connect the input signal to the TV coax input. Set the VCR to channel 3 or 4. The signal passes through to the composite output (yellow RCA jack). This method also works as an inexpensive form of time base correction (TBC).
Use a PC to TV converter to convert VGA signals to composite: http://reactivemusic.net/?p=18716
There are many options for video capture depending on which operating system and applications you are using.
Here are several methods that work with Mac OS
Grass Valley ADVC 110 Digital Video Converter – Firewire 400 http://www.grassvalley.com/products/advc110
Some camcorders work as digital media converters. It may be difficult to determine which ones do this. Here’s a list of Sony camcorders that have this capability: http://forum.videohelp.com/threads/355121-Sony-Handycam-Digital8-camcorders-with-analog-digital-passthru
Some home-theatre system receivers will convert a variety of formats. I have not used this method.
To capture video from a mobile device, use AirPlay with a receiver program like AirServer http://www.airserver.com/Mac, or an adapter cable like http://store.apple.com/us/product/MD098AM/A/apple-digital-av-adapter.
With Mac OS Yosemite, AirPlay capability might be built in.
Macam supports a variety of web cams. http://reactivemusic.net/?p=19000
Here are some of my favorite ways to generate and process video.
Composite and S-video IO.
Many camcorders have composite output. Often they will use a proprietary AV adapter cable. Classic Sony camcorders use an adapter cable with a 3.5 mm plug connecting to 2 or 3 RCA plugs for composite video and audio.
Camcorders are the most versatile and interesting devices for synthesis. They are instant feedback machines. Just point the camera at the screen. Zoom for instant hallucinations. Some camcorders have filters that can be applied in real time.
Older VHS VCRs and tapes are inexpensive – sometimes free. They provide an infinite variety of input signals, as well as a variety of signal conversion functions. Look for one that has a remote control.
Use an RF modulator (see above) to watch composite video on a very old TV. You may also need an impedance transformer if the TV has screw terminals instead of a coax jack.
Here’s a Max patch running on a portable TV: http://reactivemusic.net/?p=18716
The Atari Video Music generates geometric patterns from audio input. http://reactivemusic.net/?p=19004. It features a TV output, although it oddly uses an RCA plug. Use an RF demodulator (see above) or a VCR to convert to composite.
BPMC produces circuit bent versions of consumer video devices. I have been using the BPMC “Premium Cable” device. It is an amazing effects processor. It has composite IO (and S-video).
A video version of MS Paint. It has composite output and is fun to draw with. The AC adapter can be difficult to find, but it also works with D batteries.
Various examples: http://reactivemusic.net/?p=7769
Sonographic sound processing in Max http://reactivemusic.net/?p=16887
(Example of 3d speech processing at 4:12 in video)
local file: SSP-dissertation/4 – Max/MSP/Jitter Patch of PV With Spectrogram as a Spectral Data Storage and User Interface/basic_patch.maxpat
Try recording a short passage, then set bound mode to 4, and click autorotate
The Eye and the Ear: http://reactivemusic.net/?p=18707
sound effects: https://www.youtube.com/watch?v=D_Ao7EKTNiA
Sometimes we overlook easy paths:
jit.ameba : http://reactivemusic.net/?p=18934
links to analog video
Algorithmic composition and generative music – part 1
That’s a lot of buzzwords.
Chris Dobrian http://reactivemusic.net/?p=18914
The earliest examples of algorithmic composition applied mathematics to pitch, rhythm, harmony, and ensemble playing. MIDI was an ideal medium for mathematical transformations. The examples we look at today are for the most part MIDI based.
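The kind of mathematical transformation that MIDI makes easy can be sketched in a few lines. The motif and the axis of inversion below are just illustrative values:

```python
def transpose(notes, interval):
    """Shift every MIDI note number by a fixed interval (semitones)."""
    return [n + interval for n in notes]

def invert(notes, axis):
    """Mirror each pitch around a fixed axis note."""
    return [2 * axis - n for n in notes]

def retrograde(notes):
    """Play the sequence backwards."""
    return notes[::-1]

motif = [60, 64, 67, 72]      # C major arpeggio, as MIDI note numbers
print(transpose(motif, 7))    # up a perfect fifth -> [67, 71, 74, 79]
print(invert(motif, 60))      # mirrored around middle C -> [60, 56, 53, 48]
print(retrograde(motif))      # -> [72, 67, 64, 60]
```

The same three operations – transposition, inversion, retrograde – are the classic serial transformations, and because MIDI reduces pitch to an integer they become simple arithmetic.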
Design a generative music machine.
I would encourage you to collaborate, to use the work of other artists as a starting point, and to build a composition/performance tool that you would actually use.
This assignment will be due on the last class day of the semester (May 5th).
Various ways Kinect 1 still runs in Mac OS with Max/MSP, Processing, and OSC.
Note: this is about the ‘old’ Kinects – not the latest version (Kinect 2). However, Dale Phurrough’s Max external dp.kinect2 works with Kinect 2 in Windows 8+.
(self portrait with Synapse)
Synapse converts skeletal data to OSC. It still runs with Max/MSP even though it is not supported. http://synapsekinect.tumblr.com
Here’s how to run with Max: http://synapsekinect.tumblr.com/post/6307752257/maxmsp-jitter
Don’t be surprised if Max crashes occasionally.
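Under the hood, Synapse's joint messages are ordinary OSC packets sent over UDP, so you can inspect them outside Max too. Below is a minimal stdlib-only decoder sketch; the /righthand_pos_body address and the ,fff float layout follow Synapse's joint-message format as I understand it, so treat them as assumptions and check the Synapse docs:

```python
import struct

def _padded_string(data, offset):
    """Read a null-terminated OSC string, padded to a multiple of 4 bytes."""
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    offset = (end + 4) & ~3   # skip the null padding
    return s, offset

def decode_osc(data):
    """Decode a simple OSC message with only float32 ('f') and
    int32 ('i') arguments - enough for skeleton-joint messages."""
    address, offset = _padded_string(data, 0)
    typetags, offset = _padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":
            args.append(struct.unpack(">f", data[offset:offset + 4])[0])
        elif tag == "i":
            args.append(struct.unpack(">i", data[offset:offset + 4])[0])
        offset += 4
    return address, args

def _pad(s):
    """Encode an OSC string with null padding (for the demo packet)."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

# Hand-built demo packet: a joint position as three floats.
packet = _pad("/righthand_pos_body") + _pad(",fff") + struct.pack(">3f", 0.1, 0.5, 1.2)
print(decode_osc(packet))
```

To watch live data you would bind a `socket.socket(socket.AF_INET, socket.SOCK_DGRAM)` to Synapse's output port (check the Synapse documentation for the port number) and feed each received datagram to `decode_osc`.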
Max/MSP examples of skeletal tracking
Processing uses the SimpleOpenNI library. Use the Processing package manager to install or update it.
There are several built-in examples (under Contributed Libraries). Many of them work, including the “hands” example. Wave your hand a lot to get it to start tracking.
Dale Phurrough’s free Max external jit.openni is no longer supported, and I have not yet been able to find a Mac version that runs. The dp.kinect external runs only in Windows.
dp.kinect is for Kinect 1 and up to Windows 7. dp.kinect2 requires Windows 8+. More testing on the way. Note that dp.kinect is a commercial product.
Provides depth camera data as a Jitter matrix. Various modes, including IR.
A tutorial by Peter Elsea: ftp://arts.ucsc.edu/Pub/ems/electronic-contraptions/Max%20and%20Kinect.pdf
Jon Bellona OSC/Kinect libraries for Processing
I was able to run the Processing sketch and receive OSC data on port 8000 in Max – but the UI is somewhat confusing and there is no camera input to monitor skeleton tracking. This probably would not be difficult to add to the sketch by looking at the SimpleOpenNI examples.