

Voltage controlled variable capacitors

Also called varicaps or varactors, these are diodes operated under reverse bias. As the reverse voltage increases, the capacitance decreases.
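The voltage dependence can be sketched with the standard junction-capacitance formula C(V) = C0 / (1 + V/φ)^n, where n ≈ 0.5 for an abrupt junction. The component values below are illustrative, not from any particular datasheet:

```python
# Junction capacitance of a varicap vs. reverse bias (illustrative values).
# C(V) = C0 / (1 + V/phi)**n, with n ~ 0.5 for an abrupt junction.

def varicap_capacitance(v_reverse, c0=100e-12, phi=0.7, n=0.5):
    """Capacitance in farads at a given reverse voltage (volts)."""
    return c0 / (1 + v_reverse / phi) ** n

for v in (0, 1, 4, 9):
    print(f"{v:2d} V -> {varicap_capacitance(v) * 1e12:6.1f} pF")
```

Running this shows the capacitance falling monotonically as the reverse voltage rises, which is the behavior described above.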

Read this first

Tutorial by Phillip Atchely, KO6BB

More tutorials

Using Varactors by Stefan Hollos and Richard Hollos:

Tutorial by Ian Poole at Radio-Electronics 

Another tutorial from Radio-Electronics?:

Threads from The RadioBoard Forum:

Varactor tuned regenerative radio by Tony G4WIF, “The Two Dollar Regen”: … ntid=28430. More information here:


Slice//Jockey help

A compilation of Pd help screens.

Slice//Jockey is an interactive audio performance instrument

by Katja Vetter

Screen Shot 2015-04-19 at 5.03.09 PM


Click any of the screenshots to see a full size image.

Screen Shot 2015-04-19 at 4.57.49 PM

X-Y field

The x-y origin is in the lower left corner. On the x axis, values increase from left to right; on the y axis, from bottom to top.

  • white: left slice unit
  • grey: right slice unit
  • black: global
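Most GUI toolkits put the pixel origin at the top left with y increasing downward, so screen coordinates have to be flipped to match this lower-left origin. A generic sketch of that conversion (not code from the patch itself):

```python
def screen_to_field(px, py, width, height):
    """Map screen pixels (origin top-left, y down) to normalized
    x-y field coordinates (origin lower-left, y up), each in 0..1."""
    x = px / width
    y = 1.0 - py / height   # flip the vertical axis
    return x, y

# A click near the lower-left pixel corner maps to field values near (0, 0):
print(screen_to_field(0, 479, 640, 480))
```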

Screen Shot 2015-04-19 at 5.59.03 PM

Notes on notes:

  • x: pitch low to high (independent of BPM)
  • y: rhythmic variation low to high
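One plausible reading of these two mappings, as a sketch (the pitch range and the 0..1 variation scale are assumptions, not values from the patch):

```python
def xy_to_note_params(x, y, low_note=36, high_note=84):
    """Map normalized x-y field values (0..1) to a MIDI pitch and a
    rhythmic-variation amount. Ranges are illustrative assumptions."""
    pitch = low_note + x * (high_note - low_note)  # x: pitch, low to high
    variation = y                                  # y: rhythmic variation
    return round(pitch), variation

print(xy_to_note_params(0.5, 0.25))  # -> (60, 0.25)
```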

Screen Shot 2015-04-19 at 4.58.27 PM

Screen Shot 2015-04-19 at 4.59.05 PM

Screen Shot 2015-04-19 at 4.59.17 PM

Screen Shot 2015-04-19 at 5.01.49 PM

Screen Shot 2015-04-19 at 5.02.03 PM

Screen Shot 2015-04-19 at 5.02.13 PM

Screen Shot 2015-04-19 at 5.02.36 PM

Screen Shot 2015-04-19 at 4.58.52 PM

Audio Settings

Screen Shot 2015-04-19 at 5.01.27 PM

Screen Shot 2015-04-19 at 4.59.33 PM

IO section

Screen Shot 2015-04-19 at 5.00.32 PM

Screen Shot 2015-04-19 at 5.00.41 PM

Screen Shot 2015-04-19 at 5.00.58 PM

Screen Shot 2015-04-19 at 5.00.49 PM


Patterns and slices

Slice units left and right

Screen Shot 2015-04-19 at 4.59.57 PM

Screen Shot 2015-04-19 at 5.56.19 PM

Screen Shot 2015-04-19 at 5.00.06 PM

Screen Shot 2015-04-19 at 5.00.17 PM

Screen Shot 2015-04-19 at 5.53.28 PM

Screen Shot 2015-04-19 at 5.53.08 PM

Global recorder

Screen Shot 2015-04-19 at 4.59.42 PM

ep-341 Max/MSP – Spring 2015 week 13

Algorithmic composition and generative music – part 2


Reactive music

With reactive music, audio is the input and music is the output. Music can also be the input.

from Wikipedia:

“Reactive music, a non-linear form of music that is able to react to the listener and his environment in real-time.[2] Reactive music is closely connected to generative music, interactive music, and augmented reality. Similar to music in video games, that is changed by specific events happening in the game, reactive music is affected by events occurring in the real life of the listener. Reactive music adapts to a listener and his environment by using built-in sensors (e.g. camera, microphone, accelerometer, touch-screen, and GPS) in mobile media players. The main difference to generative music is that listeners are part of the creative process, co-creating the music with the composer. Reactive music is also able to augment and manipulate the listener’s real-world auditory environment.[3]

What is distributed in reactive music is not the music itself, but software that generates the music…”

Ableton Live field recorder

Uses dummy clips to apply rhythmic envelopes and effects to ambient sound:

InstantDecomposer and Slice/Jockey

Making music from sounds that are not music.

by Katja Vetter

InstantDecomposer is an update of Slice//Jockey. It has not been released publicly. Slice//Jockey runs on Mac OS, Windows, and Linux, including Raspberry Pi.

Slice//Jockey help:

Slice//Jockey is written in Pd (PureData) – open source – the original Max.

By Miller Puckette

Local file reference

  • local: InstantDecomposer version: tkzic/pdweekend2014/IDecTouch/IDecTouch.pd
  • local: slicejockey2test2/slicejockey2test2.pd


A music factory.

By Christopher Lopez

Inception and The Dark Knight iOS apps:

As of iOS 8.2, Dark Knight crashes on load. Inception only works with “Reverie Dream” (lower left corner).

Running RJDJ scenes in Pd in Mac OS X

Though RJDJ is a lost relic in 2015, it still works in Pd. The example scenes used here are meant to run under libpd in iOS or Android, but they will actually work in Mac OS X.

Screen Shot 2015-04-18 at 10.13.01 PM

First, use Pd Extended. Ok maybe you don’t need to.

1. Read the article from Makezine by Mike Dixon

2. Download sample scenes from here: The link is under the heading “RJDJ Sources”

3. Download RJLIB from here:

4. Add these folders in RJLIB to your Pd path (in preferences)

  • pd
  • rj
  • deprecated

5. Now, try running the scene called “echelon” from the sample scenes you downloaded. It should be in the folder rjdj_scenes/Echelon.rj/_main.pd

  • turn on audio
  • turn up the sliders
  • you should hear a bunch of crazy feedback delay effects

Note: with Pd-extended 0.43-4 the error message: “base: no method for ‘float'” fills the console as soon as audio is turned on.

Scenes that I have got to work:

The ones marked with a * seem to work well without modification or an external GUI. They all produce error messages, and they are really meant to run on mobile devices, so a lot of the sensor input isn’t there.

  • Amenshake (you will need to provide accelerometer data somehow)
  • Atsuke (not sure how to control)
  • CanOfBeats (requires accelerometer)
  • ChordSphere (sounds cool even without accelerometer)
  • Diving*
  • DubSchlep* (interesting)
  • Ehchelon*
  • Eargasm*
  • Echochamber*
  • Echolon*
  • Flosecond (requires accelerometer)
  • FSCKYOU* (Warning, massive earsplitting feedback)
  • Ghostwave* (Warning, massive earsplitting feedback)
  • HeliumDemon (requires accelerometer)
  • JingleMe*
  • LoopRinger*
  • Moogarina
  • NobleChoir* (press the purple button and talk)
  • Noia*
  • RingMod*
  • SpaceBox*
  • SpaceStation (LoFi algorithmic synth)
  • WorldQuantizer*

to be continued…

Random RJDJ links

Including stuff explained above.



I’m thinking of something:

Analog video synthesis

Vintage consumer electronics, adapter cables, converters, and circuit bending.


Signal formats

Note: everything here refers to NTSC which is the standard video format in the USA.


“Composite video (one channel) is an analog video transmission (without audio) that carries standard definition video typically at 480i or 576i resolution.” -from Wikipedia

Composite uses the yellow RCA plugs found on most consumer video devices. It is also the standard for LZX analog modular systems.

Because composite is the most common format, the approach here will be to convert everything else to composite, with a few exceptions.


S-video carries luminance and chrominance on separate channels. The easiest way to convert S-video to composite, or vice versa, is to use a VCR or capture device that has S-video IO.

VHF analog TV

Older video devices use coaxial cables with F connectors to transmit modulated AV signals on channels 3 and 4.

RF modulators and demodulators
RF modulators

To convert from composite to TV use an RF modulator:


You can also use a VCR as an RF modulator.

RF modulators can be used as TV transmitters.

RF demodulators

To convert from TV to composite, use an RF demodulator. The demodulator removes the VHF carrier, leaving a baseband composite signal.

Demodulators can be expensive and difficult to find, so if you are not too concerned about image quality it’s easier to use a VCR. Connect the input signal to the TV coax input. Set the VCR to channel 3 or 4. The signal passes through to the composite output (yellow RCA jack). This also serves as an inexpensive form of time base correction (TBC).


Use a PC to TV converter to convert VGA signals to composite:

Screen Shot 2015-03-27 at 5.02.22 PM

Video capture

There are many options for video capture depending on which operating system and applications you are using.

Here are several methods that work with Mac OS.

Capture devices
  • Elgato Video Capture – USB (only works through Elgato app)
  • Diamond Video Capture – USB (shows up as a video camera?)
Media converters

Grass Valley ADVC 110 Digital Video Converter – Firewire 400

Analog to digital pass-through

Some camcorders work as digital media converters, though it may be difficult to determine which ones do. Here’s a list of Sony camcorders that have this capability:

Another list:

Some home-theatre system receivers will convert a variety of formats. I have not used this method.

iOS, Android

To capture video from a mobile device, use a streaming program like Airplay or Airserver, or an adapter cable like

With Mac OS Yosemite, AirPlay capability might be built in.

Web cams

Macam supports a variety of web cams.


Here are some of my favorite ways to generate and process video.

Videonics Video Equalizer


Composite and S-video IO.

Camcorders with composite output


Many camcorders have composite output. Often they will use a proprietary AV adapter cable. Classic Sony camcorders use an adapter cable with 3.5 mm plug connecting to 2 or 3 RCA plugs for composite video and audio.

Camcorders are the most versatile and interesting devices for synthesis. They are instant feedback machines. Just point the camera at the screen. Zoom for instant hallucinations. Some camcorders have filters that can be applied in real time.



Older VHS VCRs and tapes are inexpensive, sometimes free. They provide an endless variety of input signals, as well as a variety of signal-conversion functions. Look for one that has a remote control.

Ancient analog TVs

Portable TV

Use an RF modulator (see above) to watch composite video on a very old TV. You may also need an impedance transformer if the TV has screw terminals instead of a coax jack.

Here’s a Max patch running on a portable TV:

Atari Video Music c-240


The Atari Video Music generates geometric patterns from audio input. It features a TV output, although it oddly uses an RCA plug. Use an RF demodulator (see above) or a VCR to convert to composite.

BPMC  circuit bent analog video


BPMC produces circuit bent versions of consumer video devices. I have been using the BPMC “Premium Cable” device. It is an amazing effects processor. It has composite IO (and S-video).

My First Sony


A video version of MS Paint. It has composite output and is fun to draw with. The AC adapter can be difficult to find, but it also works with D batteries.


ep-426 Interactive Video – Spring 2015 week 12

(under construction)

Various topics

Screen Shot 2015-03-11 at 10.14.38 PM


Camille Utterback
Luke Dubois

Various examples:

Brian Eno
Tadej Droljc

Sonographic sound processing in Max

(Example of 3d speech processing at 4:12 in video)

local file: SSP-dissertation/4 – Max/MSP/Jitter Patch of PV With Spectrogram as a Spectral Data Storage and User Interface/basic_patch.maxpat

Try recording a short passage, then set bound mode to 4, and click autorotate

Blair Neal

Ancient history

Don Slepian 1983


Franciszka and Stefan Themerson 1944/45

The Eye and the Ear:

The Outer Limits (TV) 1963

sound effects:


Sometimes we overlook easy paths:

More Jitter tricks

jit.ameba :







Analog modular video links:

links to analog video


ep-341 Max/MSP – Spring 2015 week 12

(under construction)

Algorithmic composition and generative music – part 1

Screen Shot 2015-04-13 at 11.09.59 PM

  • This week we will explore probability, randomness, and sequencing.
  • Next week we will look at audio reactive music
  • In part 3 We will sonify data and gestures

That’s a lot of buzzwords.

How do people compose music?
  • What is your composition tool/method of choice?
  • Dramatic shape
  • “How to Write a Song” By Henry Kane, 1962
  • “This is Your Brain on Music” by Daniel Levitin
3 approaches

Chris Dobrian

  • systematic
  • intuitive
  • arbitrary
Framing the process of composition

The earliest examples of algorithmic composition applied mathematics to pitch, rhythm, harmony, and ensemble playing. MIDI was an ideal medium for mathematical transformations. The examples we look at today are for the most part MIDI based.

Composing with Max
Markov chains
  • Fuji Kureta: MarkovChain 2-fuji-kureta.maxpat (using MIDI piano scores) debussy-cc4-format0.mid (The snow is dancing)
  • Mchain external by Richard Dudas –
    • text
    • interactive midi notes
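The idea behind these Markov-chain examples, where each note chooses its successor according to weighted transition probabilities, can be sketched in Python. The note numbers and probabilities below are made up for illustration, not taken from the patches:

```python
import random

# Hypothetical first-order Markov chain over MIDI note numbers.
# Keys are the current note; values map next-note -> transition probability.
transitions = {
    60: {62: 0.5, 64: 0.3, 67: 0.2},
    62: {60: 0.4, 64: 0.6},
    64: {62: 0.5, 67: 0.5},
    67: {60: 1.0},
}

def next_note(current):
    """Pick the next note, weighted by the transition probabilities."""
    choices, weights = zip(*transitions[current].items())
    return random.choices(choices, weights=weights)[0]

def generate(start=60, length=8):
    """Walk the chain to produce a melody of the given length."""
    melody = [start]
    for _ in range(length - 1):
        melody.append(next_note(melody[-1]))
    return melody

print(generate())  # one possible output: [60, 64, 67, 60, 62, 64, 62, 60]
```

In practice the transition table would be built by counting note successions in a MIDI score, which is essentially what the Mchain external does with its input.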


Design a generative music machine.

I would encourage you to collaborate, to use the work of other artists as a starting point, and to build a composition/performance tool that you would actually use.

This assignment will be due on the last class day of the semester (May 5th).

Kinect – revisited

Various ways Kinect 1 still runs in Mac OS with Max/MSP, Processing, and OSC.

Note: this is about the ‘old’ Kinects – not the latest versions (Kinect 2). However, Dale Phurrough’s dp.kinect2 Max external works with Kinect 2 in Windows 8+.

(self portrait with Synapse)

Screen Shot 2015-04-04 at 4.38.25 PM


Synapse converts skeletal data to OSC. It still runs with Max/MSP even though it is not supported.

Here’s how to run with Max:

Don’t be surprised if Max crashes occasionally.
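Synapse delivers joint positions as OSC messages over UDP, for example an address like /righthand_pos_body carrying three floats. As a stdlib-only sketch of what such packets look like on the wire, here is a minimal parser for float-only OSC messages; it follows the OSC 1.0 encoding (null-terminated strings padded to 4 bytes, big-endian float32 arguments), and the sample address is only an assumed example, not code from Synapse:

```python
import struct

def _read_padded_string(data, offset):
    """Read a null-terminated OSC string padded to a 4-byte boundary."""
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode()
    offset = (end + 4) & ~3  # skip the null and padding
    return s, offset

def parse_osc_message(data):
    """Parse a simple OSC message containing only float32 arguments."""
    address, offset = _read_padded_string(data, 0)
    tags, offset = _read_padded_string(data, offset)
    args = []
    for tag in tags.lstrip(","):
        if tag == "f":
            args.append(struct.unpack(">f", data[offset:offset + 4])[0])
            offset += 4
    return address, args

if __name__ == "__main__":
    sample = (b"/righthand_pos_body\x00"      # address, padded to 20 bytes
              + b",fff\x00\x00\x00\x00"       # type tags: three floats
              + struct.pack(">fff", 0.5, 1.0, 2.0))
    print(parse_osc_message(sample))  # ('/righthand_pos_body', [0.5, 1.0, 2.0])
```

For real use you would wrap this in a UDP socket loop listening on Synapse’s output port, or just use a ready-made OSC library.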


Max/MSP examples of skeletal tracking


Processing uses the OpenNI library; use the Processing package manager to install or update it.

There are several built-in examples (under Contributed Libraries). Many of them work, including the “hands” example. Wave your hand a lot to get it to start tracking.

jit.openni and dp.kinect

Dale Phurrough’s free Max external jit.openni is no longer supported. I have not yet been able to find a Mac version that runs. The dp.kinect external runs only in Windows.

dp.kinect is for Kinect 1 and up to Windows 7. dp.kinect2 requires Windows 8+. More testing on the way. Note that dp.kinect is a commercial product.


Screen Shot 2015-04-04 at 5.02.46 PM

Provides depth camera data as a Jitter matrix. Various modes, including IR.

A tutorial by Peter Elsea:



Jon Bellona OSC/Kinect libraries for Processing

I was able to run the Processing sketch and receive OSC data on port 8000 in Max – but the UI is somewhat confusing and there is no camera input to monitor skeleton tracking. This probably would not be difficult to add to the sketch by looking at the SimpleOpenNI examples.