top | item 43999614

aquafox | 9 months ago

Until there is a radically new version of {popular programming language} with breaking changes and no new and correct answers to train on.


1123581321 | 9 months ago

These models can figure out syntax and language features they haven’t seen before. Try it with a few code snippets of your own made-up language. It’s a little freaky.
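For illustration, the experiment looks something like this: invent a small language with its own syntax, paste a couple of snippets, and ask the model to extend them. Every keyword and operator below is made up for this example:

```
-- toy language invented for this example; nothing here is a real syntax
fn greet(name: txt) => emit "hello, " <> name

loop x over 1..5:
    greet(txt_of x)
```

Given two or three snippets like this, the model is then asked to write, say, a function that sums a list, and it typically produces code that stays consistent with the invented keywords and operators. That is the behavior being described as freaky.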

zahlman | 9 months ago

They can implicitly assume that your made-up language is designed to be intuitive to people who already know existing languages, and thus apply their existing understanding of "code" to it, sure.

int_19h | 9 months ago

They can indeed, but 1) this takes up an inordinate amount of context, and 2) the more you force the model to think about that, the less effective it is at actually writing code.