top | item 45983618

blubber | 3 months ago

Is this a problem for which the (human) solution is well documented and known, and was learned during the training phase? Or is it a novel problem?

I personally think anthropomorphizing LLMs is a bad idea.
