The Listening Machine is an orchestral sonification of the online activity of several hundred (unwitting) UK-based Twitter users. Created with cellist Peter Gregson and Britten Sinfonia, it has been a vast adventure combining studio recordings with a chamber ensemble, countless hours of coding towards a growing generative compositional toolkit, and delving into the mechanics of linguistics, prosody, and natural language processing.
Key to the compositional process is a system which translates the flow and rhythm of a text passage into a musical score, based on ordering the formant frequencies of the human voice, which characterise the quality of each vowel sound. We determine the piece's musical mode via sentiment mapping, and then generate individual note-wise patterns by translating syllables into notes of the current scale. As several Twitter users are typically active at the same time, the result is multiple intertwining melody lines, tonally related but structurally distinct.
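As a rough illustration of this idea, here is a simplified, hypothetical sketch in Python: a sentiment score selects the mode, and each syllable's vowel (ordered approximately by first formant frequency) picks a degree of the current scale. The names, mappings, and formant ordering below are illustrative assumptions, not the project's actual code.

```python
# Scale degrees as semitone offsets from the tonic.
MODES = {
    "positive": [0, 2, 4, 5, 7, 9, 11],   # major (Ionian)
    "negative": [0, 2, 3, 5, 7, 8, 10],   # natural minor (Aeolian)
}

# Rough ordering of vowels by first formant frequency (F1), low to high.
# Real formant values vary by speaker; this ordering is approximate.
VOWEL_ORDER = "iueoa"   # /i/ has the lowest F1, /a/ the highest


def sentiment_to_mode(score):
    """Choose a scale from a sentiment score in [-1, 1]."""
    return MODES["positive"] if score >= 0 else MODES["negative"]


def syllables_to_notes(syllables, score, tonic=60):
    """Map each syllable's vowel to a MIDI note in the chosen scale."""
    scale = sentiment_to_mode(score)
    notes = []
    for syl in syllables:
        # Take the first recognised vowel in the syllable,
        # defaulting to 'a' if none is found.
        vowel = next((ch for ch in syl if ch in VOWEL_ORDER), "a")
        degree = VOWEL_ORDER.index(vowel) % len(scale)
        notes.append(tonic + scale[degree])
    return notes


# A positive phrase yields notes from the major scale rooted on C (60).
print(syllables_to_notes(["lis", "ten", "ing"], score=0.5))  # → [60, 64, 60]
```

Running several users' text streams through a mapping like this, each against the same tonic and mode, would give the tonally related but independent melody lines described above.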
The Listening Machine launched at the start of last month as part of The Space, a great new BBC/Arts Council initiative encouraging National Portfolio organisations into the realm of online content. Working with a team of BBC broadcast technology ninjas, we contributed a piece of music which lasts six months and is quintessentially digital: built from data sourced from internet discussions, and streamed solely over the web.
But perhaps the most exciting part has been the combination of algorithmic processes with thousands of fragments of orchestrally-recorded refrains. The objective was always to create a piece of music which sounded organic, and, in spite of its metronomic pulse, the results aren't too far from what we envisaged. See the website for information about the compositional process.
The other integral part of the project is the graphic design, created by the excellent Joe Hales. Joe is more typically found creating design for print, and we wanted to translate this page-based aesthetic to the screen, presenting the project almost as if it were a textbook.
With some judicious JSON and HTML5 &lt;canvas&gt; voodoo, we animated his cog-and-dial visualisation to give a continuous representation of The Listening Machine's state at any moment. The collective's mood, activity rate, and topics of conversation are displayed live on thelisteningmachine.org, and reflected likewise in the musical output.
The Python code behind the algorithmic composition parts is available on github.com/ideoforms/isobar; the text analysis framework will be released in due course.
The Listening Machine can also be found on Twitter @listenmachine and facebook.com/thelisteningmachine.