We finally have the (visual synthesizer) Hurricane and the (sequencer) Live Better Suite talking to each other via MIDI over 16 channels. This means that live-generated music can be used to generate visuals live.. not only matching the music, but actually being triggered by it.
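To give a rough idea of what "triggered by it" means on the wire: a MIDI channel message carries its channel number (one of 16) in the low nibble of its status byte, so the visual side can route each instrument's notes to its own visual layer. The following is a minimal, hypothetical sketch of such a decoder; the function and event names are illustrative and not taken from the actual Hurricane or Live Better Suite code.

```python
def parse_midi_message(data: bytes):
    """Decode a 3-byte MIDI channel message into (kind, channel, a, b)."""
    status, a, b = data[0], data[1], data[2]
    kind = status & 0xF0      # upper nibble: message type
    channel = status & 0x0F   # lower nibble: channel 0-15
    if kind == 0x90 and b > 0:
        return ("note_on", channel, a, b)          # a = note, b = velocity
    if kind == 0x80 or (kind == 0x90 and b == 0):  # velocity 0 means note-off
        return ("note_off", channel, a, b)
    if kind == 0xB0:
        return ("control_change", channel, a, b)   # a = controller, b = value
    return ("other", channel, a, b)

# Note-on, channel 3, note 60 (middle C), velocity 100:
event = parse_midi_message(bytes([0x93, 60, 100]))
```

A visual patch could then map, say, every `note_on` on channel 3 to a flash or shape spawn, with velocity driving brightness.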
Also, the variable signal settings like blend mode, attack and decay can now all be hooked up to a Drehbank, giving them a haptic interface as opposed to a tabular one. This makes it possible for jana° to react visually to what Kent is generating musically, live. Thus: truly live audio-visual shows.
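For the curious: hooking knobs up to settings like these usually means scaling 7-bit MIDI controller values (0–127) into each parameter's own range. Here is a small sketch of that idea, under the assumption that attack/decay are continuous times and blend mode is a discrete choice; the ranges and mode names are made up for illustration.

```python
def cc_to_range(value: int, lo: float, hi: float) -> float:
    """Scale a 7-bit CC value (0-127) linearly into [lo, hi]."""
    return lo + (hi - lo) * (value / 127.0)

# Illustrative blend modes, not the actual list used in Hurricane:
BLEND_MODES = ["normal", "add", "multiply", "screen"]

def cc_to_blend_mode(value: int) -> str:
    """Quantize a CC value into one of the discrete blend modes."""
    index = min(value * len(BLEND_MODES) // 128, len(BLEND_MODES) - 1)
    return BLEND_MODES[index]

attack_seconds = cc_to_range(64, 0.0, 2.0)  # knob at center -> ~1 second
mode = cc_to_blend_mode(127)                # knob fully right -> last mode
```

The continuous scaling gives smooth haptic control over attack and decay, while the quantizing version makes one knob step cleanly through the discrete blend modes.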
We are very excited and look forward to playing around with this whole new way of working..!
Here is a first experiment, using visual materials we had accumulated during the last year, the 100-year anniversary of Dada.
Find out more about the technology → here!