
Generative Notation and Hacking The Quartet

The last few years have seen a proliferation of hack days, in which participants spend a day or two sketching and building prototype ideas with code. For me, the most appealing are those that deal with a specific concept, with participants given free rein to explore a small zone of creative ideas -- often a more inspiring starting point than a series of data sets.

Thus, it was impossible to resist the allure of Hack The Quartet: a two-day event hosted by Bristol's iShed, which gave guests the rare opportunity to work closely with a world-class string quartet. The event brief sums up part of the appeal really nicely:

A quartet is like a game of chess; simple in its make up and infinite in its possibility. So how can new technologies be used to augment performance of and engagement with chamber music?

In my mind, there's a perfect balance in the relative constraints of this ensemble size, coupled with the opportunity to link the richness of virtuoso musicianship with the possibilities for algorithmic augmentation. I've been thinking a lot about these ideas since writing The Extended Composer but it's rare to be able to put them into practice in a live environment, particularly with players of the calibre of the Sacconi Quartet.

Generative Notation

I went into Hack The Quartet with an unusually well-formed idea: to create a tablet-based system to render musical notation in real-time, based on note sequences received over a wireless network. Though there are plenty of iPad score display apps out there, the objective here was to begin with a set of empty staves, onto which notes materialise throughout the performance.

The potential uses for this are manifold. Imagine the situation in which a dancer's movements are tracked by camera, with a piece of software translating their motions into musical shapes. These could be rendered as a set of 4 scores - one for each musician - and performed by the quartet in real-time, creating a musical accompaniment which directly mirrors the dancers' actions.
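As a toy illustration of that kind of motion-to-music translation, here's a sketch of one possible mapping -- the scale, pitch range, and normalised tracking input are all hypothetical, not the mapping used at the event:

```python
# Illustrative sketch: quantise a dancer's tracked vertical position
# (normalised to 0.0-1.0 by some camera-tracking front end) onto a
# pentatonic scale above a base pitch. All choices here are arbitrary.
PENTATONIC = [0, 2, 4, 7, 9]  # scale degrees, in semitones

def position_to_note(y, base=60):
    """Map a normalised height to a MIDI pitch on the pentatonic scale."""
    y = max(0.0, min(1.0, y))  # clamp out-of-range tracking values
    degree = min(int(y * len(PENTATONIC)), len(PENTATONIC) - 1)
    return base + PENTATONIC[degree]
```

A real system would smooth the input and map further dimensions (speed, spread of limbs) onto duration and dynamics, but the shape of the translation is the same: continuous motion in, discrete notated events out.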

Of course, we can substitute dancers' movements for sensor inputs, web-based data streams, or any kind of real-time information source. The original motivation for the project came out of discussions surrounding The Listening Machine, which translated online conversations into musical sequences based on an archive of around 50,000 short samples, each representing a single word or syllable. Creating a sonification system based on fragments of pre-recorded audio was all very well, but imagine the fluidity and richness of interpretation if The Listening Machine's sentence-derived scores were performed live by skilled musicians: almost as if the instrument itself were speaking a sentence.

For Hack The Quartet, I worked closely with the all-round sonic extraordinaire Nick Ryan to devise a set of compositional processes that we could realise over the two days, which we continued to discuss in depth with the Sacconi players. Given the boldness and risk inherent in playing a score that is being written at the moment it is played, the quartet's confidence and capability in performing these generative sequences was quite remarkable. The resultant piece included complex, shifting polyrhythms, with algorithmically-generated relationships between note timings, which didn't faze the players in the slightest.

Visually, the notated outcome is surprisingly crisp and satisfying. With Socket.io for real-time communications, isobar for algorithmic pattern generation, plus a quartet of Retina iPads, we now have a framework that is sufficiently stable and flexible to explore the next steps of live score generation.


# Over 16 bars, shift the weights passed to isobar's weighted-choice
# pattern so that the first pitch in `notes` grows steadily more likely
# as n increases. (`notes` and `output` are assumed to be defined
# earlier in the patch.)
bar_number = 1
for n in range(16):
    p = PWChoice(notes, [n] + [15 - n] * (len(notes) - 1))
    events = PDict({ "note": p, "dur": 6, "amp": 64 })
    event = events.next()

    if n == 0:
        event["annotation"] = "Arco pp, slow trill"
    else:
        event["annotation"] = "%d" % bar_number

    output.event(event)
    bar_number += 1
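Each generated event then has to cross the wireless network to reach the players' iPads in some serialised form. A minimal sketch of what such a wire format might look like -- the schema and field names here are illustrative, not the actual protocol we used:

```python
import json

def encode_event(player, bar, event):
    """Pack one note event into an illustrative JSON message that a
    tablet-side renderer could parse to draw the next bar of notation.
    The field names are hypothetical, not the event's real schema."""
    return json.dumps({
        "player": player,   # 0-3, one score per member of the quartet
        "bar": bar,
        "note": event["note"],
        "dur": event["dur"],
        "amp": event["amp"],
        "annotation": event.get("annotation", ""),
    })

msg = encode_event(0, 1, {"note": 60, "dur": 6, "amp": 64,
                          "annotation": "Arco pp, slow trill"})
```

In practice each message would be emitted over the Socket.io channel to the tablet addressed by `player`, which appends the decoded notes to its on-screen staves.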

And the sense of hearing a nuanced rendition of a living score, never before heard, was simply something else. Having only just got my breath back from last-minute technical challenges (never, ever use Bluetooth in a demo setting. Ever.), it was gripping to hear our code and structures materialise as fluttering, breathy bowed notes, resonating through the bodies of the Quartet's antique instruments. Despite the mathematical precision of the underlying processes, the results were brought to life by the collective ebb and flow of the performers' pacing and dynamics.

With so many elements open to exploration, it is an approach that could branch in a seemingly endless number of further directions. It feels like the start of a new chapter in working with sound, data and performance.

Thanks to Peter Gregson for his invaluable advice on score engraving, and Bruno Zamborlin, Goldsmiths' EAVI group and the iShed for iPad loans. Special thanks to all at the Watershed for hosting Hack The Quartet, and to the Sacconi Quartet for their exemplary patience and musicianship.