top | item 38198388

nologic01 | 2 years ago

Something I noticed about all these projects (and there are quite a few) is that they are quite old (some going back ~2 decades or so).

I wonder what the dynamic behind that longevity is. Music hasn't changed, of course, but on the tech side I would think there are significant new possibilities.

Is it something related to the difficulty of implementing low-level algorithms (famously, a lot of linear algebra code goes back decades and rests on C / Fortran libraries)? Is it that there isn't enough interest to justify new efforts, or are those systems already near "perfection"?

Rochus | 2 years ago

> some going back ~2 decades or so

Some indeed started long ago; e.g. Common Music had its roots in the eighties and became a pretty popular composition environment during the nineties; in 2008 there was even a fundamental redesign, and it is still being developed. But there has also been a certain abundance of new tools: many were released that essentially had the same goal and offered the same features, just implemented a little differently.

> Music hasn't changed of course

Music, and the way music is composed, produced and consumed, has changed tremendously over the years, and so have the tools. In the past the focus was on algorithmic composition, and tools allowed a composer to extend their possibilities into sound design as well, but it took years until computers were fast enough to render a composition in real time. Then came the era of DAWs, when eventually everyone could produce music with little investment. In the last ten years live coding has become popular, which is yet another way of composing and producing music, with new requirements for interactivity and ergonomic efficiency of the tools, and yet another view of the composer and the composition process.

iainctduncan | 2 years ago

However (and this is something I wrote about recently in my thesis), we are now experiencing somewhat of a renaissance of older approaches, as it has only fairly recently become practical to run the older (Lisp-based) algorithmic tools in real time. I can run Scheme during live playback and have the GC finish its business fast enough to achieve very acceptable audio latencies within Ableton Live, for example. I'm just now embarking on a PhD in this area, actually, and it's pretty exciting how many previously dusty things can be used in new ways.

I discuss this a bit here: https://iainctduncan.github.io/papers/introduction.html