We aren't expecting LLMs to come up with incredibly creative software designs right now; we're expecting them to execute conventional best practices based on common patterns. So it makes sense to me that an LLM would not excel at the task it was given here.
The whole thing seems like a pretty good example of collaboration between human and LLM tools.
I haven't actually had much luck getting them to output boring API boilerplate in large Java projects. Like "I need to create a new BarOperation that has to go in a different set of classes and files and API prefixes than all the FooOperations and I don't feel like copy pasting all the yaml and Java classes" but the AI has problems following this. Maybe they work better in small projects.
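To make the shape of the task concrete, here's a minimal sketch of the kind of repetitive "operation" boilerplate I mean — every operation gets its own class registered under an API prefix (in the real project there's also a YAML spec per operation). All class names and prefixes here are made up, not from any real codebase:

```java
import java.util.HashMap;
import java.util.Map;

interface Operation {
    String apiPrefix();           // e.g. "/api/v1/foo"
    String execute(String input); // the actual business logic
}

// Existing pattern: every FooOperation lives under /api/v1/foo ...
class FooOperation implements Operation {
    public String apiPrefix() { return "/api/v1/foo"; }
    public String execute(String input) { return "foo:" + input; }
}

// ... while the new BarOperation has to go under a *different* prefix
// and file layout, which is exactly where the LLM tends to fall back
// to blindly copying the Foo layout.
class BarOperation implements Operation {
    public String apiPrefix() { return "/api/v2/bar"; }
    public String execute(String input) { return "bar:" + input; }
}

public class OperationRegistry {
    private final Map<String, Operation> byPrefix = new HashMap<>();

    public void register(Operation op) {
        byPrefix.put(op.apiPrefix(), op);
    }

    public String dispatch(String prefix, String input) {
        Operation op = byPrefix.get(prefix);
        if (op == null) {
            throw new IllegalArgumentException("no operation for " + prefix);
        }
        return op.execute(input);
    }

    public static void main(String[] args) {
        OperationRegistry registry = new OperationRegistry();
        registry.register(new FooOperation());
        registry.register(new BarOperation());
        System.out.println(registry.dispatch("/api/v2/bar", "hello")); // prints bar:hello
    }
}
```

Trivial to write by hand, but tedious once there are dozens of operations — which is why it seems like the perfect thing to delegate.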
writeslowly|9 months ago
I actually like LLMs better for creative thinking because they work like a very powerful search engine that can combine unrelated results and pull in adjacent material I would never personally think of.

coffeeismydrug|9 months ago
To be fair, I also have problems following this.

ehutch79|9 months ago
> Like "I need to create a new BarOperation that has to go in a different set of classes and files and API prefixes than all the FooOperations and I don't feel like copy pasting all the yaml and Java classes" but the AI has problems following this.

Uh, no. I've seen the Twitter posts saying LLMs will replace me. I've watched the YouTube videos saying LLMs will code whole apps from one prompt, but they are light on details or only show the most basic todo app from every tutorial.

We're being told that LLMs are now reasoning, which implies they can make logical leaps and employ creativity to solve problems.

The hype cycle is real and is setting expectations that get higher the less you know about how they work.

prophesi|9 months ago
> The hype cycle is real and is setting expectations that get higher _the less you know about how they work_.

I imagine on HN, the expectations we're talking about are from fellow software developers who at least have a general idea of how LLMs work and their limitations.

bgwalter|9 months ago
Whenever I try one of these claims, it does not work. Yes, I know, o3 != CoPilot, but I don't have $120 and 100 prompts to spend on making a point.

ldjkfkdsjnv|9 months ago