top | item 33179285


dimmuborgir | 3 years ago

Those models are not trained on short loops. They are trained on whole songs just like image generation models are trained on whole images. And yet they struggle to repeat sections, modulate to a different key, create bridges, intros and outros. After a few seconds of hallucinating a melodic line they simply abandon the idea and migrate to another one. There is no global structure whatsoever.


efishnc | 3 years ago

Maybe that's the problem.

We're trying to train a full composer AI without letting it learn about the different instrument sections independently first. A human composer will have a good idea of the different parts and know how to merge them in harmony.

I think we might get better results training separate AI systems on percussion, strings, vocals, etc., and then somehow creating connections between them so they learn together. A band AI, if you will.

We could try a BERT-style model for each, with each generator learning to output logical sequences of sounds instead of words.
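A toy sketch of that "band AI" coupling, purely to illustrate the shape of the idea: the names `TrackModel` and `band_generate` are hypothetical, and the per-instrument "models" here are random stubs rather than anything learned. The point is only the architecture, where each instrument has its own model, and they are stepped in lockstep against a shared clock (bar position) so they can stay coordinated.

```python
import random

class TrackModel:
    """Stub for one per-instrument next-token model.

    A real version would be a trained sequence model; this one just
    picks random tokens, except that it restates the previous token
    at the start of each bar to fake a little local structure.
    """
    def __init__(self, vocab, seed=0):
        self.vocab = vocab
        self.rng = random.Random(seed)

    def next(self, prev, bar_pos):
        if bar_pos == 0 and prev is not None:
            return prev  # restate the motif on the downbeat
        return self.rng.choice(self.vocab)

def band_generate(tracks, steps, bar_len=4):
    """Step every instrument model together, sharing the bar position."""
    history = {name: [] for name in tracks}
    prev = {name: None for name in tracks}
    for t in range(steps):
        bar_pos = t % bar_len
        for name, model in tracks.items():
            tok = model.next(prev[name], bar_pos)
            history[name].append(tok)
            prev[name] = tok
    return history

band = {
    "drums":   TrackModel(["kick", "snare", "hat"], seed=1),
    "strings": TrackModel(["C", "F", "G", "Am"], seed=2),
}
out = band_generate(band, steps=8)
```

In this sketch the only "connection" between the instruments is the shared bar counter; the interesting (and unsolved) part of the proposal is replacing that with learned cross-track conditioning.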

yeasurebut | 3 years ago

Musicians don’t spit out an album in one sitting, even though they’re highly trained in theory. They get bored and tired of the process and take breaks. They come up with an album of loops composed together over time.

AI’s output will forever be constrained to the limits of human cognition and behavior, since that’s what it’s trained on.

I read published research all year. Circular reasoning. Tautology. It’s all over PhD theses.

There’s no “global structure” to humanity. Relativity is a bitch.

Seeing the world through the vacuum of embedded inner monologue ignores the constraints of the physical one. It’s exhausting dealing with the mentality some clean room idea we imagine in a hammock can actually exist in a universe being ripped asunder by entropy.

It’s living in memory of what we were sold; some ideal state. Very akin to religious and nation state idealism.

mjburgess | 3 years ago

I think it's deeply depressing that AI has been sold as something even capable of modelling anything humans do; and quite depressing that this comment exists.

"AI" is just taking `mean()` over our choice of encodings of our choice of measurements of our selection of things we've created.

There is as much of what is "like humans" in the patterns of tree bark.

AI is an embarrassingly dumb procedure, incapable of the most basic homology with anything any animal has ever done; us especially.

We are embedded in our environments, on which we act, and which act on us. In doing so we physically grow, mould our structure and that of our environment, and develop sensory-motor conceptualisations of the world. Everything we do, every act of the imagination or of movement of our limbs, is preconditioned-on and symptomatic-of our profound understanding of the world and how we are in it.

The idea that `mean(424,34324,223123,3424,....)` has any relevance to us at all is quite absurd. The idea that such a thing might sound pleasant through a speaker, irrelevant.

This is a product of I don't know what. On the optimist side, a cultish desire to see Science produce a new utopia. On the pessimist side, a likewise delusional desire to see humans as dumb machines.

What a sad state!