Haile, the interactive robot drummer
[via Atom Jack]
From Georgia Tech’s Gil Weinberg and Scott Driscoll.
Early examples of interactive computer music systems date back 20–30 years, to the 1980s, after MIDI was invented. Real-time computer listening and responding (‘machine musicianship’) began drawing serious research attention in the 1990s, made possible by rapid growth in CPU power. (In the 1960s, even the fastest computers could generate only around 5,000 samples of a single note per second.) Of course, the music in many video games has long responded to play as it evolves, but in a pre-programmed way… no musicianship involved.
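To make the “machine listening” idea concrete, here is a minimal sketch of one of its simplest ingredients: detecting drum onsets by watching for jumps in short-term energy. Everything here (function names, frame size, threshold ratio) is illustrative, not taken from Haile or any particular research system.

```python
# Crude onset detection: flag frames whose energy jumps sharply
# over the previous frame. Real systems use spectral features,
# adaptive thresholds, and much finer time resolution.

def detect_onsets(samples, frame_size=256, ratio=3.0):
    """Return sample indices of frames whose energy is at least
    `ratio` times the previous frame's energy."""
    onsets = []
    prev_energy = None
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        energy = sum(x * x for x in frame) / frame_size
        if prev_energy is not None and prev_energy > 0 and energy / prev_energy >= ratio:
            onsets.append(start)
        prev_energy = energy
    return onsets

# Near-silence with two synthetic "drum hits" at known positions:
signal = [0.01] * 2048
for hit in (512, 1536):
    for i in range(hit, hit + 64):
        signal[i] = 1.0

print(detect_onsets(signal))  # → [512, 1536]
```

An interactive system like Haile would run something of this sort continuously on a live audio stream, then feed the detected event times into a rhythmic model that decides how the robot responds.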
It may not be long before popular music artists can release music “written” in a form that lets listeners “remix” it to their whims in real time. Quite a potential game-changer.