It’s called emergent behavior. We understand how an LLM works at the level of the math, but we don’t have even a theory of how the behavior emerges from that math. We understand ants pretty well, but how exactly does anthill behavior come from ant behavior? It’s a tricky problem in systems engineering, where being able to predict emergent behavior (such as emergencies) would be lovely.
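(For concreteness, here's a minimal toy of that ant-to-anthill kind of emergence. The rules and numbers are made up purely for illustration: each simulated ant follows one local rule and knows nothing about "piles", yet clumps appear anyway.)

```python
import random

# Toy "ant clustering": one local rule -- pick up a crumb where crumbs are
# sparse, drop it where they are dense. No global plan, but piles emerge.
SIZE = 50
grid = [1 if random.random() < 0.3 else 0 for _ in range(SIZE)]  # scattered crumbs

def local_density(pos, radius=2):
    # Fraction of occupied cells in a small neighbourhood (wrapping at edges).
    cells = [grid[(pos + d) % SIZE] for d in range(-radius, radius + 1)]
    return sum(cells) / len(cells)

carrying = False
pos = 0
for step in range(200_000):
    pos = (pos + random.choice((-1, 1))) % SIZE        # random walk
    d = local_density(pos)
    if not carrying and grid[pos] and random.random() > d:
        grid[pos], carrying = 0, True                  # pick up in sparse areas
    elif carrying and not grid[pos] and random.random() < d:
        grid[pos], carrying = 1, False                 # drop in dense areas

print("".join("#" if c else "." for c in grid))        # crumbs end up in clumps
```

The interesting part is that nothing in the rule mentions clusters; the clustering is only visible at the level of the whole grid, which is exactly the explanatory gap being pointed at.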
fc417fc802|4 days ago
Actually we have an awful lot of those.
I'm not sure emergent is quite the right term here. We carefully craft a scenario to produce a usable gradient for a black-box optimizer, and we fully expect that making nontrivial predictions of future state will, out of necessity, produce increasingly rich world models.
It gets back to the age-old observation that any sufficiently accurate model of a system ends up as complex as the system it models. "Predict the next word" is just one instance of that general principle at play.
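(A minimal sketch of what "produce a usable gradient" means for the next-word objective, in PyTorch-style pseudocode; the tiny embedding-plus-linear model here is purely illustrative, not how any real LLM is built.)

```python
import torch
import torch.nn.functional as F

# Toy stand-ins for a language model: an embedding table and an output head.
vocab_size, dim = 100, 32
embed = torch.nn.Embedding(vocab_size, dim)
head = torch.nn.Linear(dim, vocab_size)

tokens = torch.randint(0, vocab_size, (1, 16))   # a stand-in "sentence"
hidden = embed(tokens[:, :-1])                   # representation at each position
logits = head(hidden)                            # scores for the next token

# Cross-entropy between the predicted distribution and the actual next token:
# this single scalar is the "usable gradient" handed to the black-box optimizer.
loss = F.cross_entropy(logits.reshape(-1, vocab_size), tokens[:, 1:].reshape(-1))
loss.backward()                                  # gradients flow to every weight
```

Everything downstream of that scalar loss is left to the optimizer, which is why the resulting internal structure is expected rather than designed.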
hnfong|3 days ago
That's an admission that we don't know how it emerges.
Sure, we expect the behavior to emerge, but we don't know how.
netfortius|4 days ago
[1] https://en.wikipedia.org/wiki/What_Is_It_Like_to_Be_a_Bat%3F
themafia|4 days ago
We fully do. There is a significant quality difference between English-language output and output in other languages, which is a huge hint as to what is actually happening behind the scenes.
> but how exactly does anthill behavior come from ant behavior?
You can't smell what ants can. If you could, I'm sure it would be evident.
spiralcoaster|4 days ago
1. Can you reveal "what's actually happening behind the scenes" beyond the hint you gave? I can't figure it out.
2. Can you explain how an ant's sense of smell leads to anthills?
canjobear|4 days ago
?