top | item 44788787

didericis | 6 months ago

> Using AI tooling means, at least in part, betting on the future.

It means betting on a particular LLM centric vision of the future.

I’m still agnostic on that. I think LLMs allow for the creation of a lot of one-off scripts and things for people who wouldn’t otherwise be coding, but I have yet to be convinced that more AI usage in a sufficiently senior software development team is more valuable than the traditional way of doing things.

I think there’s a fundamental necessity for a human to articulate what a given piece of software should do with a high level of specificity, and that can’t ever be avoided. The best you can do is piggyback off of higher-level language and abstractions that guess what the specifics should be, but I don’t think it’s realistic to think all combinations of all business logic and UI can be boiled down to common patterns that an LLM could infer. And even if that were true, people get bored/like novelty enough that they’ll always want new human-created stuff to shove into the training set.

danielmarkbruce | 6 months ago

An LLM is not a tool to allow you to add a layer of abstraction. It's a worker.

didericis | 6 months ago

It works by translating language abstractions to code.

The probability of plain English being correctly translated to code depends on the existing code and documented abstractions describing lower-level functionality.