item 43051026


Euphorbium | 1 year ago

Smells exactly like an LLM-created solution.


monocasa | 1 year ago

Or just what happens when you hire a bunch of 20 year olds and let them loose.

That's currently how I model my usage of LLMs in code. A smart veeeery junior engineer that needs to be kept on a veeeeery short leash.

ellisv | 1 year ago

Yes. LLMs are very much like a smart intern you hired with no real experience who is very eager to please you.

NewJazz | 1 year ago

Even at 20 years old I would not have done this.

gvx | 1 year ago

One who thinks "open source" means blindly copy/pasting code snippets found online.

daveguy | 1 year ago

It's definitely both. A bunch of 20 year olds were let loose to be "super efficient." So, to be efficient, they used LLMs to implement what should be a major government oversight webpage. Even after the fix, the list is a few half-baked partial document excerpts with a few sentences saying, "look how great we are!" It's embarrassing.

Maxatar | 1 year ago

Does it? At least my experience is that ChatGPT goes super hard on security, heavily promoting the use of best practices.

Maybe they used Grok ;P

tatersolid | 1 year ago

> At least my experience is that ChatGPT goes super hard on security, heavily promoting the use of best practices.

Not my experience at all. Every LLM produces lots of trivial SQLI/XSS/other-injection vulnerabilities. Worse, they seem to completely omit authorization checks, business logic, error handling, and logging even when prompted to include them.
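For what it's worth, the "trivial SQLI" pattern being described usually looks like this: user input string-formatted straight into a query instead of passed as a parameter. A minimal illustrative sketch (hypothetical `users` table, using Python's stdlib `sqlite3`; not code from the site in question):

```python
import sqlite3

def find_user_unsafe(conn, username):
    # VULNERABLE: the input is interpolated into the SQL text, so a
    # payload like "x' OR '1'='1" rewrites the WHERE clause.
    return conn.execute(
        f"SELECT id, name FROM users WHERE name = '{username}'"
    ).fetchall()

def find_user_safe(conn, username):
    # Parameterized query: the driver binds the value as data,
    # never as SQL, so the same payload matches nothing.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

payload = "x' OR '1'='1"
print(len(find_user_unsafe(conn, payload)))  # 2 -- injection dumps the whole table
print(len(find_user_safe(conn, payload)))    # 0 -- payload treated as a literal name
```

The fix is a one-line change, which is exactly why it's frustrating when generated code defaults to the interpolated version.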

zamalek | 1 year ago

Does it, though? The saying goes that we shouldn't mistake malice for incompetence, but that takes more charity than usual for Musk's retinue.

Smells like getting a backdoor in early.

daveguy | 1 year ago

Apparently they get backdoors in as incompetently as they create efficiency.

rcpt | 1 year ago

Maybe DOGE should have used an LLM to generate defenses.

caboteria | 1 year ago

They did, and this is what they got.