“Reactive music is a non-linear form of music that is able to react to the listener and their environment in real time. Reactive music is closely connected to generative music, interactive music, and augmented reality. Similar to music in video games, which is changed by specific events happening in the game, reactive music is affected by events occurring in the real life of the listener. Reactive music adapts to a listener and their environment by using the built-in sensors (e.g. camera, microphone, accelerometer, touch-screen and GPS) in mobile media players. The main difference from generative music is that listeners are part of the creative process, co-creating the music with the composer. Reactive music is also able to augment and manipulate the listener's real-world auditory environment.
What is distributed in reactive music is not the music itself, but software that generates the music…”
4. Add the folders in RJLIB to your Pd search path (in Preferences)
5. Now, try running the scene called “echelon” from the sample scenes you downloaded. It should be in the folder rjdj_scenes/Echelon.rj/_main.pd
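If you prefer the command line, Pd can open a scene's main patch directly; a sketch, assuming `pd` (or `pd-extended`) is on your PATH and that you run it from the folder where you unpacked everything (the exact paths are assumptions, adjust them to your layout):

```shell
# Open the Echelon scene's main patch, adding the rjlib abstractions
# to the search path for this session. "rjlib/rj" is an assumed
# location for the RJLIB folders added to the path in step 4.
pd -path rjlib/rj -open rjdj_scenes/Echelon.rj/_main.pd
```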
turn on audio
turn up the sliders
you should hear a bunch of crazy feedback delay effects
Note: with Pd-extended 0.43-4, the error message “base: no method for ‘float'” fills the console as soon as audio is turned on.
Scenes that I have got to work:
The ones marked with a * seem to work well without needing modification or an external GUI. (They all produce error messages.) They really are meant to run on mobile devices, so a lot of the sensor input isn't there.
Amenshake (you will need to provide accelerometer data somehow)
Atsuke (not sure how to control)
CanOfBeats (requires accelerometer)
ChordSphere (sounds cool even without accelerometer)
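For the scenes that want accelerometer data, one way to provide it on the desktop is to fake readings from another program over a network socket. Below is a minimal Python sketch that formats messages in FUDI, Pd's plain-text protocol, so a [netreceive] object in the patch can pick them up. The `accelerate` selector and port 3000 are assumptions: match them to whatever receive name and port you wire into the scene yourself.

```python
import socket

def fudi(selector, *args):
    """Format one FUDI message (Pd's semicolon-terminated text protocol)."""
    atoms = " ".join(str(a) for a in (selector, *args))
    return atoms + ";\n"

def send_accel(x, y, z, host="127.0.0.1", port=3000):
    """Send one fake accelerometer reading to a Pd patch listening with
    [netreceive 3000]. Port and selector name are assumptions."""
    with socket.create_connection((host, port)) as s:
        s.sendall(fudi("accelerate", x, y, z).encode("ascii"))
```

On the Pd side you would add something like [netreceive 3000] connected to [route accelerate] and forward the three floats to wherever the scene expects its sensor input.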