I don’t agree with Ted Nelson’s comments. Files and folders were a perfectly good start, and they’ve been very successful. His ideas amount to a “start again from scratch” mentality, and that’s simply not possible: we’ve gone too far already.

We need to complement existing interfaces with new ways in, and use the hardware at our disposal to do so. For example, the iPhone has audio in and out, so how about analysing the “sound” of your music collection to fit the ambient sound level, or to give clues to the “mood” of where you are? Everything is net-enabled these days, and we’re not short of metadata to describe music, so these kinds of connections should be effortless.
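As a minimal sketch of that ambient-matching idea (the function names, thresholds, and loudness metadata below are my own assumptions, not any real iPhone or music-service API), you could estimate the loudness of the room from the microphone and pick the track whose stored loudness sits closest to it:

    import math

    def ambient_level_db(samples):
        """Estimate ambient loudness as RMS in dBFS from raw mic samples (floats in [-1, 1])."""
        rms = math.sqrt(sum(s * s for s in samples) / len(samples))
        return 20 * math.log10(max(rms, 1e-9))  # guard against log(0) in silence

    def pick_track(ambient_db, library):
        """Pick the track whose loudness metadata best fits the room.

        `library` is a list of (title, loudness_db) pairs -- hypothetical
        per-track metadata that a real service would supply.
        """
        return min(library, key=lambda t: abs(t[1] - ambient_db))

    # Example: a quiet room (about -40 dBFS) should pull up the quietest track.
    library = [("Loud Anthem", -8.0), ("Mellow Tune", -20.0), ("Ambient Drone", -35.0)]
    print(pick_track(ambient_level_db([0.01] * 1024), library))

The point is how little machinery is needed once the metadata exists: the matching itself is a one-line comparison.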

Technologically, we’re at a wonderful point where all these things are just becoming possible; we need to get on with it and start making them. It’s almost as if music is on the verge of some audio equivalent of Photosynth[1] being invented, and with the kind of metadata being built up by services like pandora.com[2], there are some exciting and unexpected music applications ahead.
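To give a flavour of what that kind of metadata makes possible (the attribute names and scores below are invented for illustration; Pandora’s actual Music Genome attributes are proprietary), song-to-song similarity reduces to a simple vector comparison:

    import math

    def similarity(a, b):
        """Cosine similarity between two attribute vectors (dicts of trait -> 0..1 score)."""
        keys = set(a) | set(b)
        dot = sum(a.get(k, 0) * b.get(k, 0) for k in keys)
        norm = (math.sqrt(sum(v * v for v in a.values()))
                * math.sqrt(sum(v * v for v in b.values())))
        return dot / norm if norm else 0.0

    # Invented genome-style attributes for two songs.
    song_a = {"acoustic": 0.9, "tempo": 0.3, "vocals": 0.8}
    song_b = {"acoustic": 0.7, "tempo": 0.4, "vocals": 0.9}
    print(f"similarity: {similarity(song_a, song_b):.3f}")  # ~0.980

Once every track carries a vector like this, “play me something that fits this moment” is just a nearest-neighbour lookup.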

[1] Photosynth demo
[2] Music Genome Project (pandora.com)