1123581321|9 months ago
These models can figure out syntax and language features they haven’t seen before. Try it with a few code snippets of a language you made up yourself. It’s a little freaky.

zahlman|9 months ago
Sure, but they can do that because they implicitly assume your made-up language is designed to be easy for speakers of natural languages to use, and so they apply their existing understanding of "code" to it.

int_19h|9 months ago
They can indeed, but 1) this takes up an inordinate amount of context, and 2) the more you force the model to think about that, the less effective it is at actually writing code.