callesgg | 2 years ago

True, LLMs definitely have something that is "thought-like".

But today's networks lack the recursion (feedback where the output can go directly back to the input) that is needed for the kind of internalized thought humans have. I guess this is one of the things you are pointing at by mentioning the continuousness of the internals of LLMs.
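The feedforward-vs-feedback distinction can be sketched with a toy example (hypothetical scalar "networks", not any real LLM architecture): a feedforward step depends only on its current input, while a recurrent step also feeds its own previous output back in, which is the internal loop being described.

```python
def feedforward_step(x, w=0.5):
    # Output depends only on the current input: no internal state.
    return w * x

def recurrent_step(x, h, w_in=0.5, w_rec=0.9):
    # Output depends on the input AND the previous output (feedback).
    return w_in * x + w_rec * h

# Feedforward: the same input always produces the same output.
print(feedforward_step(1.0))  # 0.5
print(feedforward_step(1.0))  # 0.5

# Recurrent: state persists, so repeated identical inputs keep evolving.
h = 0.0
for _ in range(3):
    h = recurrent_step(1.0, h)
    print(h)  # 0.5, then 0.95, then 1.355
```

(Transformers do have a kind of outer feedback loop during autoregressive generation, since each emitted token is fed back as input, but no persistent internal state carries over between forward passes the way `h` does here.)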
