top | item 47167130

athorax | 4 days ago

Why do you think there is a lot of training data? Could it be because it's stable and virtually unchanged for decades? Hmmm.

esafak | 3 days ago

Because bash is everywhere. Stability is a separate concern, and we know this because LLMs routinely generate deprecated code for libraries that change frequently.