“Reactive music is a non-linear form of music that is able to react to the listener and their environment in real time.[2] Reactive music is closely connected to generative music, interactive music, and augmented reality. Similar to music in video games, which changes in response to specific events happening in the game, reactive music is affected by events occurring in the real life of the listener. Reactive music adapts to a listener and their environment by using built-in sensors (e.g. camera, microphone, accelerometer, touch-screen and GPS) in mobile media players. The main difference from generative music is that listeners are part of the creative process, co-creating the music with the composer. Reactive music is also able to augment and manipulate the listener’s real-world auditory environment.[3]
What is distributed in reactive music is not the music itself, but software that generates the music…”
InstantDecomposer is an update of Slice//Jockey. It has not been released publicly. Slice//Jockey runs on Mac OS, Windows, and Linux – including Raspberry Pi.
As of iOS 8.2, Dark Knight crashes on load. Inception only works with “Reverie Dream” (lower left corner).
Running RJDJ scenes in Pd in Mac OS X
Though RJDJ is a lost relic in 2015, its scenes still work in Pd. The example scenes used here are meant to run under libpd on iOS or Android, but they will actually work in Mac OS X.
First, use Pd-extended. (Okay, maybe you don’t need to.)
4. Add these folders in RJLIB to your Pd path (in preferences)
pd
rj
deprecated
5. Now, try running the scene called “echelon” from the sample scenes you downloaded. It should be in the folder rjdj_scenes/Echelon.rj/_main.pd
turn on audio
turn up the sliders
you should hear a bunch of crazy feedback delay effects
Note: with Pd-extended 0.43-4, the error message “base: no method for ‘float’” fills the console as soon as audio is turned on.
Scenes that I have gotten to work:
The ones marked with a * seem to work well without modification or an external GUI. They all produce error messages, and they are really meant to run on mobile devices, so much of the sensor input isn’t available.
Amenshake (you will need to provide accelerometer data somehow)
Atsuke (not sure how to control)
CanOfBeats (requires accelerometer)
ChordSphere (sounds cool even without accelerometer)
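Several of the scenes above want accelerometer data that a desktop Mac can’t supply. One workaround is to add a [netreceive] object to the patch, wire it to the scene’s accelerometer receive (the receive name varies by scene, so that wiring is up to you), and stream fake sensor values from a script. This is a hedged sketch, not part of the original scenes: the port number (3000), the x/y/z message format, and the 1.5 Hz sine “shake” are all assumptions.

```python
import math
import socket
import time

def fudi_message(x, y, z):
    """Format one x y z triple as a FUDI message: space-separated atoms
    terminated by a semicolon, the wire format Pd's [netreceive] expects."""
    return f"{x:.3f} {y:.3f} {z:.3f};\n".encode()

def send_accel(host="127.0.0.1", port=3000, duration=5.0, rate=30.0):
    """Stream a sine-wave 'shake' to a [netreceive 3000] in the patch."""
    sock = socket.create_connection((host, port))
    step = 1.0 / rate
    t = 0.0
    while t < duration:
        x = math.sin(2 * math.pi * 1.5 * t)      # fake shaking on the x axis
        sock.sendall(fudi_message(x, 0.0, 1.0))  # z = 1.0 stands in for gravity
        time.sleep(step)
        t += step
    sock.close()
```

With the patch open and audio on, calling `send_accel()` shakes the virtual device for five seconds; scale or rename the values to match whatever the scene actually listens for.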
A scene from a sea otter video processed using ‘Premium Cable’ by BPMC
Not modular
Two weeks ago Christopher Konopka demonstrated modular analog video. This week we will explore analog video made from found objects and circuit bending. https://reactivemusic.net/?p=18982
Kinect
Kinect is still your friend, especially if you use Windows.
Composite
Composite uses the yellow RCA plugs found on most consumer video devices. It is also the standard for LZX analog modular systems: http://www.lzxindustries.net.
Because composite is the most common format, the approach here will be to convert everything else to composite, with a few exceptions.
S-video
S-video carries luminance and chrominance on separate channels. The easiest way to convert S-video to composite, or vice versa, is to use a VCR or capture device that has S-video I/O.
VHF analog TV
Older video devices use coaxial cables with F connectors to transmit modulated AV signals on channel 3 or 4.
RF modulators and demodulators
RF modulators
To convert from composite to TV, use an RF modulator.
To convert from TV to composite, use an RF demodulator. The demodulator removes the VHF carrier, leaving a baseband composite signal.
Demodulators can be expensive and difficult to find, so if you are not too concerned about image quality it’s easier to use a VCR. Connect the input signal to the TV coax input and set the VCR to channel 3 or 4. The signal passes through to the composite output (yellow RCA jack). This method also works as an inexpensive form of time base correction (TBC).
Here are some of my favorite ways to generate and process video.
Videonics Video Equalizer
Composite and S-video IO.
Camcorders with composite output
Many camcorders have composite output. Often they will use a proprietary AV adapter cable. Classic Sony camcorders use an adapter cable with 3.5 mm plug connecting to 2 or 3 RCA plugs for composite video and audio.
Camcorders are the most versatile and interesting devices for synthesis. They are instant feedback machines. Just point the camera at the screen. Zoom for instant hallucinations. Some camcorders have filters that can be applied in real time.
VCRs
Older VHS VCRs and tapes are inexpensive, sometimes free. They provide an infinite variety of input signals, as well as a variety of signal-conversion functions. Look for one that has a remote control.
Ancient analog TVs
Use an RF modulator (see above) to watch composite video on a very old TV. You may also need an impedance transformer if the TV has screw terminals instead of a coax jack.
The Atari Video Music generates geometric patterns from audio input. https://reactivemusic.net/?p=19004. It features a TV output, although it oddly uses an RCA plug. Use an RF demodulator (see above) or a VCR to convert to composite.
BPMC produces circuit bent versions of consumer video devices. I have been using the BPMC “Premium Cable” device. It is an amazing effects processor. It has composite IO (and S-video).
My First Sony
A video version of MS-paint. It has composite output and is fun to draw with. The AC adaptor can be difficult to find. But it also works with D batteries.
The earliest examples of algorithmic composition applied mathematics to pitch, rhythm, harmony, and ensemble playing. MIDI was an ideal medium for mathematical transformations. The examples we look at today are for the most part MIDI-based.
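As a toy illustration of the kind of mathematical transformation that MIDI note numbers invite, here is a minimal sketch. The note list and function names are invented for this example; none of this comes from the patches discussed below.

```python
notes = [60, 62, 64, 65, 67]  # a C major fragment: C D E F G as MIDI note numbers

def transpose(notes, interval):
    """Shift every pitch by `interval` semitones."""
    return [n + interval for n in notes]

def invert(notes, axis):
    """Mirror every pitch around the `axis` note."""
    return [2 * axis - n for n in notes]

def retrograde(notes):
    """Play the sequence backwards."""
    return notes[::-1]

print(transpose(notes, 7))   # up a fifth: [67, 69, 71, 72, 74]
print(invert(notes, 60))     # mirrored around middle C: [60, 58, 56, 55, 53]
print(retrograde(notes))     # reversed: [67, 65, 64, 62, 60]
```

Because MIDI reduces pitch to integers, classical devices like transposition, inversion, and retrograde become one-line list operations, which is exactly why early algorithmic composers gravitated to it.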
I would encourage you to collaborate. To use the work of other artists as a starting point. And to build a composition/performance tool that you would actually use.
This assignment will be due on the last class day of the semester (May 5th).
“The jit.ameba object began life as a completely failed attempt at a linear interpolation object. As the Oblique Strategies say, “Honor thy error as a hidden intention.” If you’re interested in linear interpolation objects, try the jit.plur object. Otherwise, please enjoy the mess.”
“This is a pixilation (resampling) trick. The help file admits this was supposed to be something else, but didn’t work. Needs a spell check too, but pretty interesting. ”
Using the text processing example, I built a 4th-order Markov chain for “The Cat in The Hat” (by Dr. Seuss). Here is some of the result (it continues to infinity), with indentation added for readability:
The sat the sun is not shine.
It was to Sit! Sit!
And I sat in the house
All that?
How I wish We had something went BUMP! How I wish We did not shine.
It was to us jump! We looked!
The Cat in that cold to go out And wet to play.
So all we sat in the Hat! And he saw him!
Then we sat is wet to Sit! Sit!
And to go out
And the house
All that?
Why do was too wet And there little bit. And wet to go out
And there we saw him step in on that?
How I wish We sat is nothing at all.
So all we could do was too wet day.
So all we sat the house
All that is fun there we can have Lots of good fun the mat!
We had something at all.
So wet to go out And the sun is wet to Sit! Sit!
And I said too wet to do I know it is not sunny!
Too we can have Lots of good funny!
Too wet And I sat is fun the house. We sat the house. We saw him!
The Cat is wet to us.
…
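The original chain was built with the Max text-processing example; this is a rough Python equivalent for readers who want to try it on their own text. It assumes a character-level chain with a four-character context (the original may well be word-level), and the short corpus string is just a stand-in, not the Seuss text.

```python
import random
from collections import defaultdict

ORDER = 4  # number of characters of context

def build_chain(text, order=ORDER):
    """Map each `order`-character context to the characters that follow it."""
    chain = defaultdict(list)
    for i in range(len(text) - order):
        chain[text[i:i + order]].append(text[i + order])
    return chain

def generate(chain, length=200, order=ORDER, seed=None):
    """Walk the chain, emitting one character at a time."""
    rng = random.Random(seed)
    context = rng.choice(list(chain.keys()))
    out = list(context)
    for _ in range(length):
        choices = chain.get(context)
        if not choices:                          # dead end: restart somewhere
            context = rng.choice(list(chain.keys()))
            choices = chain[context]
        next_char = rng.choice(choices)
        out.append(next_char)
        context = context[1:] + next_char        # slide the context window
    return "".join(out)

corpus = "The sun did not shine. It was too wet to play. So we sat in the house all that cold, cold, wet day."
chain = build_chain(corpus)
print(generate(chain, length=120, seed=1))
```

Feed it a whole book instead of one sentence and, as above, it will happily run to infinity; raising the order makes the output more coherent at the cost of copying longer stretches verbatim.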
The help file patch allows real-time MIDI improvisation with a step-sequencer style of playback: