moomin|15 days ago
Ironically, if you actually know what you’re doing with an LLM, getting a separate process to check the quotations are accurate isn’t even that hard. Not 100% foolproof, because LLM, but way better than the current process of asking ChatGPT to write something for you and then never reading it before publication.
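A minimal sketch of what such a separate checking pass could look like: extract quoted passages from a draft and verify each appears verbatim in the source material. The regex and exact-match criterion here are illustrative assumptions; a real pipeline would also have to handle curly quotes, ellipses, and light paraphrase.

```python
import re

def extract_quotes(article: str) -> list[str]:
    """Pull double-quoted passages out of the article text."""
    return re.findall(r'"([^"]+)"', article)

def normalize(text: str) -> str:
    """Collapse whitespace so line breaks don't cause false mismatches."""
    return " ".join(text.split())

def check_quotes(article: str, source: str) -> dict[str, bool]:
    """Map each quotation to whether it appears verbatim in the source."""
    src = normalize(source)
    return {q: normalize(q) in src for q in extract_quotes(article)}

article = 'She said "the data was incomplete" and later "we never saw it".'
source = 'In the interview she admitted the data was incomplete at the time.'
print(check_quotes(article, source))
# The fabricated quote ("we never saw it") is flagged as False.
```

Any quote that comes back False goes to a human for review, which is the point: the check doesn't need to be foolproof, just better than not reading the draft at all.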
Springtime|15 days ago
This is what the author actually speculated may have occurred with Ars. Clearly something was lacking in the editorial process, though, since such things weren't human-verified either way.