Web.log

tag: sonification

The Markup Melodium

I was recently invited by Mozilla to be a fellow on their Webmaker program, an excellent initiative to foster web literacy. As part of the fellowship, I was asked to create something which exploited the affordances of their maker tools.

I was drawn to the immediacy of Thimble, a browser-based interface to write web code and immediately see the results. I began pondering the potential for using Thimble as a kind of live coding environment: could an HTML document be translated into a piece of music which could be edited on-the-fly, with its structure and contents instantly reflected in sound?

The outcome is this: The Markup Melodium. Using jQuery and Web Audio, it traverses the DOM tree of an HTML page and renders each type of element in sound. In parallel, it does likewise for the text content of the page, developing the phoneme-to-tone technique we used in The Listening Machine.
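In sketch form, the traversal-plus-tone idea looks something like the following. This is my own illustration, not the Melodium's actual code: the hash-based tag-to-pitch mapping and the pentatonic scale are assumptions made for the example.

```javascript
// Semitone offsets of a major pentatonic scale from the root
// (an assumed mapping, chosen so arbitrary tags sound roughly consonant).
const PENTATONIC = [0, 2, 4, 7, 9];

// Deterministically map a tag name to a frequency: hash the name,
// pick a scale degree and an octave from the hash.
function tagToFrequency(tagName, rootHz = 220) {
  let hash = 0;
  for (const ch of tagName.toLowerCase()) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  const degree = PENTATONIC[hash % PENTATONIC.length];
  const octave = hash % 3; // spread tags across three octaves
  return rootHz * Math.pow(2, octave + degree / 12);
}

// Depth-first DOM walk, collecting { tag, depth } in document order.
function walk(node, depth = 0, out = []) {
  if (node.nodeType === 1) { // element node
    out.push({ tag: node.tagName.toLowerCase(), depth });
    for (const child of node.children) walk(child, depth + 1, out);
  }
  return out;
}

// Render the sequence with Web Audio: one short decaying tone per
// element (browser only; requires an AudioContext).
function play(events, ctx) {
  let t = ctx.currentTime;
  for (const { tag } of events) {
    const osc = ctx.createOscillator();
    const gain = ctx.createGain();
    osc.frequency.value = tagToFrequency(tag);
    gain.gain.setValueAtTime(0.2, t);
    gain.gain.exponentialRampToValueAtTime(0.001, t + 0.25);
    osc.connect(gain).connect(ctx.destination);
    osc.start(t);
    osc.stop(t + 0.25);
    t += 0.3;
  }
}
```

In a page you would wire it up with something like `play(walk(document.body), new AudioContext())`; because the mapping is deterministic, editing the markup and replaying yields a correspondingly edited melody.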

By way of example, hear Lewis Carroll's Jabberwocky as rendered by the Melodium. To explore the basic elements, here is a short composition for the Melodium. And the really exciting part: using Thimble's Remix feature, you can clone this basic composition and immediately begin developing your own remix of it in the browser, before publishing it to the world.

As the Markup Melodium is implemented through pure JavaScript, it's also available as a bookmarklet so that you can sonify arbitrary web pages.
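A bookmarklet of this kind typically follows the standard pattern of a `javascript:` URL that injects a script tag into the current page. A minimal sketch, assuming a placeholder script URL (not the real hosted file):

```javascript
// Placeholder location of the sonification script -- substitute the
// real hosted URL when building the actual bookmarklet.
const src = "https://example.com/melodium.js";

// The bookmarklet is a javascript: URL wrapping an IIFE that appends
// a <script> element to the page, pulling in the code above.
const bookmarklet =
  "javascript:(function(){" +
  "var s=document.createElement('script');" +
  "s.src='" + src + "';" +
  "document.body.appendChild(s);" +
  "})();";
```

Dragging a link whose `href` is this string to the bookmarks toolbar makes the script runnable on any page.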

Drag the following link to your browser's bookmark toolbar: Markup Melodium.

And, of course, all of the code is available on GitHub.

The name is in tribute to The Melodium, a 1938 musical instrument created by German physicist Harald Bode, whose pioneering modular designs anticipated today's synthesizers by many decades.

Hearing Connections at the Royal Institution

http://www.rigb.org/contentControl

I'm giving a talk next week as part of the excellent-sounding Hearing Connections, an evening of lectures on sonification and networks. It's part of a series of events at the Royal Institution of Great Britain, the 200-year-old establishment that Faraday and Medawar once called home. So, no pressure then.

I'll be discussing the relationships between sound and ecosystems, giving a whistle-stop tour of emergence, nested hierarchies and complexity, via Wolfram and Stockhausen, and hopefully culminating in a demo of some exciting new multi-level simulation work that I've been developing.

Here's the abstract:

What does a concerto have in common with a coral reef? The answer is that both are made up of nested hierarchies, in which an individual on one layer contains a population of the one below. An ecosystem comprises multiple species, each of which contains multiple communities, made up of multiple individuals -- and an individual is itself an ecosystem of organs, cells and microbes. Likewise, a concerto comprises movements, which comprise parts, which comprise notes and harmonies.

This talk is a brief tour of the relationships between music and ecology, and of how their similarity can fruitfully illuminate both our scientific and artistic practices.

  • Can translating a real ecosystem into sound reveal hidden properties to us?
  • Can the dynamics of an ecosystem be thought of as creative, or teach us about creativity?
  • Can there be a single set of simple rules that unify all of these levels collectively?

Hearing Connections runs from 7pm on Tuesday 15 November.
More information and tickets on the Royal Institution's website.