mkleczek | 1 month ago
The problem with LLMs is that it is not only "irrelevant details" that get hallucinated. It is also "very relevant details", which either make the whole system inconsistent or riddle it with security vulnerabilities.
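As a hedged illustration of the kind of "very relevant detail" that can silently go wrong (the function names and scenario here are hypothetical, not from the thread): LLM-generated code often reaches for `random` when minting session tokens, which looks correct but is predictable, while the `secrets` module is what security actually requires.

```python
import random
import secrets

def make_session_token_insecure() -> str:
    # Plausible-looking but broken: random is a seeded Mersenne Twister,
    # so an attacker who observes enough output can predict future tokens.
    return "%032x" % random.getrandbits(128)

def make_session_token_secure() -> str:
    # Correct: secrets draws from the OS cryptographically secure RNG.
    return secrets.token_hex(16)
```

Both functions return a 32-character hex string, so nothing in a typical unit test or casual review distinguishes them; only deliberate scrutiny of the entropy source catches the difference.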
fc417fc802 | 1 month ago
But if it's security-critical? You'd better be touching every single line of code, and you'd better fully understand what each one does, what could go wrong in the wild, how the approach taken compares to best practices, and how an attacker might go about exploiting what you've authored. Anything less is negligence on your part.