Isn't this just a form of next-token prediction? I.e. you'll keep your options open for a potential rhyme if you select words that have many rhyming partners, and you'll keep your options open further still if you stick to broad topics over niche ones.
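The "keep your options open" idea can be sketched as a toy scorer. This is purely illustrative, not how a real language model samples; the rhyme table and candidate words are made up for the example:

```python
# Toy illustration: when picking a line-ending word, prefer words that
# have many rhyme partners, so a later rhyming line is easy to produce.
# The rhyme table below is a made-up stand-in for a pronunciation lookup.

RHYMES = {
    "cat": ["hat", "mat", "bat", "rat", "flat"],
    "orange": [],                      # famously few rhymes
    "door": ["more", "floor", "shore"],
}

def rhyme_options(word):
    """Number of rhyming continuations this word keeps open."""
    return len(RHYMES.get(word, []))

def pick_line_ending(candidates):
    """Greedy choice that also values future flexibility."""
    return max(candidates, key=rhyme_options)

print(pick_line_ending(["orange", "door", "cat"]))  # -> cat
```

Under this toy scoring, "cat" wins simply because it leaves the most rhymes reachable later.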
Assuming the task remains just generating tokens, what sort of reasoning or planning would you say is the threshold before it's no longer "just a form of next-token prediction"?
This is an interesting question, but it seems at least possible that as long as the fundamental operation is simply "generate tokens", that it can't go beyond being just a form of next-token prediction. I don't think people were thinking of human thought as a stream of tokens until LLMs came along. This isn't a very well-formed idea, but we may require an AI for which "generating tokens" is just one subsystem of a larger system, rather than the only form of output and interaction.
I'm not sure this is a meaningful distinction: fundamentally you can describe the world as a "next token predictor". Just treat the world as a simulator with a time step of some quantum of time.
That _probably_ won't capture everything, but for all practical purposes it's indistinguishable from reality (yes, yes, time is not constant everywhere)
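That framing can be sketched as a trivial discrete-time simulator, where "predicting the next token" is just computing the next state from the current one. The physics here (a falling ball, made-up numbers) is deliberately toy:

```python
# Toy sketch of "the world as a next-token predictor": a simulator that
# repeatedly maps state -> next state in fixed quanta of time.

DT = 0.1  # the "quantum" of time, in seconds

def step(state):
    """Predict the next state from the current one."""
    pos, vel = state
    return (pos + vel * DT, vel - 9.81 * DT)  # constant gravity

state = (100.0, 0.0)  # a ball at height 100 m, initially at rest
for _ in range(10):   # one second of simulated time
    state = step(state)

print(state)
```

Each call to `step` plays the role of emitting the "next token"; the whole trajectory is just repeated next-state prediction.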
Yeah, I'd agree that for that model (certainly not AGI) it's just an extension/refinement of next token prediction.
But when we get a big aggregate of all of these little rules and quirks and improvements and subsystems for triggering different behaviours and processes - isn't that all humans are?
I don't think it'll happen for a long-ass time, but I'm not one of those individuals who, for some reason, desperately want to believe that humans are special, that we're some magical thing that's unexplainable or can't be recreated.
It doesn't really explain it, because then you'd expect lots of nonsensical lines trying to make a sentence that fits the theme and rhymes at the same time.
I will feel an itch and subconsciously scratch it, especially if I'm concentrating on something. That's a subsystem independent of conscious thought.
I suppose it does make sense - that our early evolution consisted of a bunch of small, specific background processes that enabled an individual's life to continue; a single-celled organism doesn't have neurons, but it has exactly these processes - chemical reactions that keep it "alive".
Then I imagine that some of these processes became complex enough that they needed to be represented by some form of logic, hence evolving neurons.
Subsequently, organisms composed of many thousands or more of such neuronal subsystems developed higher-order subsystems to control/trigger those subsystems based on more advanced stimuli or combinations thereof.
And finally us. I imagine that in the next step, evolution found that consciousness/intelligence - an overall direction of the efforts of all of these subsystems (still not all consciously controlled), and therefore of the individual - was much more effective; anticipation, planning and other behaviours of the highest order.
I wouldn't be surprised if, given enough time and the right conditions, sustained evolution would result in any or most creatures on this planet evolving a conscious brain - I suppose we were just lucky.