
jawon | 1 year ago

Gave it a try. After a few minutes I felt more like I was recognising the samples than I was recognising the notes. Not sure what you can do about that short of physically modeling an instrument.


yojo | 1 year ago

Latest browser APIs expose everything you need to build a synth. See: https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_A...

There are some libraries that make it easy to simulate instruments. E.g. tone.js https://tonejs.github.io/

It should be possible to generate unique-ish variants at runtime.
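A minimal sketch of what runtime variants could look like: a function that rolls a slightly different patch each round, so the listener hears the note rather than a memorised sample. The parameter ranges and function name here are arbitrary illustrations, not Tone.js defaults; only the `Tone.Synth` usage in the trailing comment is real Tone.js API.

```javascript
// Sketch: generate a slightly different synth patch each round.
// Ranges are arbitrary choices for illustration.
function randomPatch(rng = Math.random) {
  const types = ["sine", "triangle", "sawtooth", "square"];
  return {
    oscillator: { type: types[Math.floor(rng() * types.length)] },
    envelope: {
      attack: 0.005 + rng() * 0.05, // 5-55 ms
      decay: 0.1 + rng() * 0.3,
      sustain: 0.2 + rng() * 0.6,
      release: 0.2 + rng() * 1.0,
    },
  };
}

// With Tone.js (https://tonejs.github.io/) this could be fed in as:
//   const synth = new Tone.Synth(randomPatch()).toDestination();
//   synth.triggerAttackRelease("C4", "8n");
```

Seeding `rng` makes a given round reproducible while still varying between rounds.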

westurner | 1 year ago

OpenEar is built on tone.js: https://github.com/ShacharHarshuv/open-ear

limut implements WebAudio and WebGL, and FoxDot-like patterns and samples: https://github.com/sdclibbery/limut

https://glicol.org/ runs in a browser and as a VST plugin

https://draw.audio/

"Using the Web Audio API to Make a Modem" (2017) https://news.ycombinator.com/item?id=15471723

gh topics/webaudio: https://github.com/topics/webaudio

awesome-webaudio: https://github.com/notthetup/awesome-webaudio

2c2c2c | 1 year ago

I'm using MIDI and open-source instrument packages, so this is all handleable. There are a few instrument options to choose from in the top-right settings.

Will probably add a "randomize instrument used per round" setting or something to really dial it in. I added a randomize-velocity option but haven't tested it much.
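For what a randomize-velocity option might amount to, here is a sketch (function name and jitter width are hypothetical, not the project's actual implementation): jitter each note's MIDI velocity around a base value, clamped to the valid 1-127 range.

```javascript
// Sketch: jitter a MIDI velocity around a base value.
// `spread` is an arbitrary illustrative default.
function randomizeVelocity(base, spread = 20, rng = Math.random) {
  const jitter = Math.round((rng() * 2 - 1) * spread); // in [-spread, +spread]
  return Math.min(127, Math.max(1, base + jitter)); // MIDI velocity range is 1-127
}
```

A per-round instrument randomizer could work the same way: pick the patch once at round start rather than per note, so the timbre is consistent within a round but unfamiliar across rounds.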