Web.log

tag: development


Chirp: A platform for audible data

Over the past few months, I've had my head down working at Animal Systems on a tremendously exciting new platform by the name of Chirp. In a nutshell, Chirp is a way to treat sound as data, enabling devices to communicate with each other using short packets of audio. A sender emits a series of tones; a receiver hears and decodes them, translating them into a code which can point to a picture, text, URL, or even another piece of sound.

Chester, the bird-robot-hybrid avatar of Chirp

My work has been focused on developing an iOS app which will very shortly be seeing the light of day, App Store pending. The experience is simple: Alice wants to send a picture to Bob, so she imports it into Chirp, hits a button, and the device chirps it (a sound like this). Bob's phone, and any nearby devices within earshot, can then decode the chirp and display the image. No painful Bluetooth pairing, no typing of email addresses, no USB-stick fiddling.

Of course, the system isn't breaking the laws of entropy and cramming a large JPEG into a second of audio: behind the scenes, the data itself is transferred to a cloud infrastructure and translated into a "shortcode", which is then sent over sound, decoded and resolved. The bitrate achievable over audible sound in a noisy environment is inherently low. But then, the bitrate of human speech is estimated at less than 100bps, and spoken language has turned out to be quite a useful feature.
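Chirp's actual modulation scheme isn't described here, so purely as an illustration of the shortcode-over-sound idea, here is a toy sketch: a hypothetical 32-symbol alphabet, each symbol mapped to a pure tone a semitone apart. All names, frequencies and durations below are my own assumptions, not Chirp's.

```python
import math

# Toy illustration only: Chirp's real alphabet, frequencies and tone
# lengths are assumptions here, not the actual protocol.
ALPHABET = "0123456789abcdefghijklmnopqrstuv"  # hypothetical 32 symbols
BASE_HZ = 1760.0       # assumed frequency of the first symbol
STEP = 2 ** (1 / 12)   # one semitone between adjacent symbols
SAMPLE_RATE = 44100
TONE_SECONDS = 0.0872  # assumed duration of each tone

def symbol_to_freq(ch):
    """Map one shortcode character to its tone frequency in Hz."""
    return BASE_HZ * STEP ** ALPHABET.index(ch)

def encode(shortcode):
    """Render a shortcode as a flat list of sine-wave samples."""
    samples = []
    for ch in shortcode:
        f = symbol_to_freq(ch)
        n = int(SAMPLE_RATE * TONE_SECONDS)
        samples.extend(
            math.sin(2 * math.pi * f * i / SAMPLE_RATE) for i in range(n)
        )
    return samples
```

A real system would add error correction and a preamble for synchronisation; this sketch only shows why the payload stays tiny: the heavy data lives in the cloud, and only the short symbol sequence travels through the air.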

One of the big lessons for me has been the sheer amount of engineering required for a magically simple transaction. Developed from conversations about the information-theoretic properties of avian linguistics, Chirp's audio system has been honed over countless months by a team of DSP gurus based in Barcelona, with an array of simulations run on UCL's Legion supercomputing cluster, rendering it resilient to hostile reverberant and noisy conditions. The underlying network consists of an infinitely-scalable REST API that we have designed over many iterations, developed by a team of inveterate network architects and now residing in the cloud. The inverse correlation between intuitive simplicity and actual complexity, in the tech domain at least, couldn't be clearer here.

The app is an exploratory first step, and there are almost too many next steps to contemplate. Anything that can transmit sound can send a chirp, so we've been experimenting with all sorts of lo-fi devices: the joy of sending a YouTube video link via a dictaphone is pretty much unrivalled. Throw an Arduino into the equation and suddenly there's an explosion of possibilities of conversing machines.

And there's an equal amount of philosophical potential in this research. Suddenly, the dumb alert tones produced by phones, lorries and fire alarms seem absurd. Why aren't these designed for machine as well as human ears, conveying valuable information about the state of the world? Why is the visual given default primacy as an information medium? And what happens when the typical silence of network communications is suddenly tangible, embodied, and broadcast?

Chirp will be free on the Apple iOS App Store.

Hackpact 2009/09/#1: iPhone + oF multitouch alpha


One week ago, I caved to the temptation and got an iPhone 3G S. Today, as the first of 30 hackpact creations, I installed the SDK plus memo's iPhone extensions for openFrameworks and hacked together my first app. OpenGL alpha blending + multi-particle grabbing/throwing/bouncing = fun.
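The app itself is built on openFrameworks in C++, but the grab/throw/bounce behaviour boils down to a few lines of state and integration. Here's a language-neutral sketch in Python of that logic; the class and method names are mine, not from the actual app.

```python
# Sketch of grab/throw/bounce particle logic (names are hypothetical,
# not taken from the openFrameworks app described above).
class Particle:
    def __init__(self, x, y, vx=0.0, vy=0.0):
        self.x, self.y, self.vx, self.vy = x, y, vx, vy
        self.grabbed = False

    def grab(self, tx, ty):
        """Pin the particle to a touch point, zeroing its velocity."""
        self.grabbed = True
        self.x, self.y = tx, ty
        self.vx = self.vy = 0.0

    def release(self, throw_vx, throw_vy):
        """Throw: let go with the touch's recent velocity."""
        self.grabbed = False
        self.vx, self.vy = throw_vx, throw_vy

    def update(self, w, h, damping=0.99):
        """Advance one frame; damp velocity and bounce off screen edges."""
        if self.grabbed:
            return  # a grabbed particle follows the finger, not physics
        self.x += self.vx
        self.y += self.vy
        self.vx *= damping
        self.vy *= damping
        if self.x < 0 or self.x > w:
            self.vx = -self.vx
            self.x = min(max(self.x, 0.0), w)
        if self.y < 0 or self.y > h:
            self.vy = -self.vy
            self.y = min(max(self.y, 0.0), h)
```

In the real app the multitouch callbacks would call `grab`/`release` per finger, and `update` would run once per frame before drawing each alpha-blended circle.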

Next step is to go through the arduous process of enrolling in the iPhone Developer Program in order to be able to even test it on the device itself. Sigh.

Hackpact, September 2009

Over the next 30 days, I'm participating in what has been termed a hackpact, a notion suggested by Alex a few days ago and immediately adopted by a decent group of others.

The concept is simple: create something and publicly document it, each day for a month. It's a kind of distributed cousin of Thing-A-Day, though inspired by a quite different sort of practice.

I'll thus be posting daily with an image and a paragraph or so on whatever I have produced. Some things will be created from scratch; some will be works in progress, and others will be (unashamedly) the result of cleaning up and publishing the jumble of half-finished work that languishes around various hard disks.

I've also created a personal Twitter profile for the purpose, where I'll be posting daily with the #hackpact tag. Alternatively, follow this list of hackpact-tagged blog posts.

Profiling Java and Processing code on Eclipse/OS X

http://ninjamonkeys.co.za/.../java-performance-profiling-on-mac-for-free-using-shark/

I've been trying to step up my coding game by moving from vim and Processing's straightforward interface to the Eclipse IDE. Having followed the comprehensive Processing in Eclipse howto, the advantages were immediately manifold: brilliant code refactoring tools, nice javadoc-generation functions, an inbuilt debugger, and svn version control integration with Subclipse.

Best of all, however, was stumbling across this guide to Java code profiling with Shark. The agent component of Eclipse's Test and Performance Tools is sadly unsupported on OS X, but this solution - using part of Apple's free Developer Tools - fits the bill perfectly. Just got an instant breakdown of the execution bottlenecks of my current Processing app, and am well on the way to a turbocharged speed boost...