Cool idea! There are a few platforms that tackle this kind of concept (I reckon SoundStorming is the most recent one to get some traction), but it's a really good moment to come in with a brand-new take on it!
The Web Audio API has matured a lot in the last few years, and the technical limitations around collaborative music are now more a matter of execution than of the platform. Audio interfaces (and their dreaded ASIO drivers :)) have also gotten better, and it's easy to capitalize on that while keeping latency low. That said, people without an interface or a quality mic shouldn't be cast aside: you can get away with much less aggressive post-processing if the recording conditions (most importantly mic placement) are communicated up front.

ConvolverNode shenanigans in Web Audio won't come close to the granularity of popular reverb VSTs, but plenty of musicians would see far more value in creating with effects on the fly, especially those who haven't really figured DAWs out yet. Pair that with clever spatialization of the tracks, keyed to effect and mix parameters, and browsing gets interesting too: highlighting those parameters timeline-style on top of spectrograms, or alternatively (or additionally) filtering by vectorized features. Those are the kinds of things that could easily popularize such a project, given that people who use DAWs already share stems and mixes on Discord threads to collab asynchronously;
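To make the ConvolverNode point concrete: under the hood it convolves the dry signal with an impulse-response buffer (the real node uses fast FFT-based convolution; this naive plain-JS sketch just shows the math, so take it as illustration rather than anything you'd ship):

```javascript
// Direct convolution: conceptually what ConvolverNode computes.
// dry: input samples, ir: impulse response samples.
// Real implementations use partitioned FFT convolution for speed;
// this O(n*m) loop is only here to show the operation itself.
function convolve(dry, ir) {
  const out = new Float32Array(dry.length + ir.length - 1);
  for (let i = 0; i < dry.length; i++) {
    for (let j = 0; j < ir.length; j++) {
      out[i + j] += dry[i] * ir[j];
    }
  }
  return out;
}

// A single impulse through a tiny "echo" IR: the dry click
// reappears at each IR tap, scaled by the tap's amplitude.
const wet = convolve([1, 0, 0, 0], [1, 0, 0.5]);
// wet -> [1, 0, 0.5, 0, 0, 0]
```

Record a short IR in a stairwell (or synthesize one), load it into a ConvolverNode's `buffer`, and you get that room's reverb for free, which is exactly the kind of on-the-fly effect play I mean.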
it's a win-win to have the two demographics in one place, ready to create together.

Sorry about the long answer; most of it is just context in case someone less accustomed to music finds this thread interesting (oh, also Audio Worklets are really cool, but that would warrant another 3 paragraphs hahaha) :). Despite the number of similar neat options out there, I hope you decide to make it anyway! It has potential and I'd recommend it to people for sure.
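P.S. Since spatialization came up: the simplest building block there is equal-power panning, which (as far as I recall from the spec) is the curve StereoPannerNode applies to a mono input. A minimal sketch of the idea:

```javascript
// Equal-power panning: keep gL^2 + gR^2 == 1 so perceived loudness
// stays roughly constant as a track moves across the stereo field.
// pan runs from -1 (hard left) to 1 (hard right); this mirrors the
// curve the Web Audio spec describes for StereoPannerNode.
function equalPowerGains(pan) {
  const x = (pan + 1) / 2; // map [-1, 1] -> [0, 1]
  return {
    left: Math.cos(x * Math.PI / 2),
    right: Math.sin(x * Math.PI / 2),
  };
}

// Center pan splits power evenly: both gains are sqrt(2)/2.
const center = equalPowerGains(0);
```

Driving `pan` per track from mix metadata is one cheap way to give every collaborator their own spot in the stereo field.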