top | item 47169208


jama211 | 3 days ago

The hallucination problem in code generation has dropped dramatically of late. I can’t recall the last time any of the modern models I use hallucinated code. Cheap/fast models still do it, and it still happens outside of code generation, but the good models write frankly incredible code, especially if you set them up with feedback loops.
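For the "feedback loops" point, here is a minimal sketch of one common setup: run the model's output, and if it fails, feed the error back into the next prompt. Everything here is illustrative; `generate` is a placeholder for whatever model call you use, and the prompt wording is made up, not any particular API.

```python
import os
import subprocess
import sys
import tempfile


def run_checks(code: str) -> tuple[bool, str]:
    """Execute the candidate code in a subprocess; return (passed, stderr)."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    try:
        result = subprocess.run(
            [sys.executable, path], capture_output=True, text=True, timeout=10
        )
        return result.returncode == 0, result.stderr
    finally:
        os.unlink(path)


def feedback_loop(generate, prompt: str, max_rounds: int = 3) -> str:
    """Ask the model for code, run it, and feed failures back until it passes.

    `generate` is a hypothetical callable (prompt -> code string) standing in
    for a real model API; swap in your own client.
    """
    feedback = ""
    code = ""
    for _ in range(max_rounds):
        code = generate(prompt + feedback)
        ok, errors = run_checks(code)
        if ok:
            return code
        feedback = (
            f"\n\nThe previous attempt failed with:\n{errors}\nPlease fix it."
        )
    return code  # best effort after max_rounds
```

The key design choice is that the loop grades the model on an objective signal (the interpreter) rather than on its own claims, which is exactly what makes hallucinated APIs surface immediately.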

No comments yet.