item 47018835

ithkuil|15 days ago

I personally found out that knowing how to use AI coding assistants productively is a skill like any other: a) it requires a significant investment of time, b) it can be quite rewarding to learn, just like any other skill, c) it might be useful now or in the future, and d) it doesn't negate the usefulness of any skills acquired in the past, nor diminish the usefulness of learning new skills in the future.

sidrag22|14 days ago

As much as I loved this article's comparison of vibe coding to slots and their related flow states, I also think what you are stating is the exact reason these tools are not the same as slots: the skill gap is there, and it's massive.

I think there are a ton of people just pulling the lever over and over, instead of stepping back and considering how they should pull the lever. When you step back and consider this, you are for sure going to end up falling deeper into the engineering and architecture realm, ensuring that continually pulling the lever doesn't result in future headaches.

I think a ton of people in this community are struggling with the loss of flow state, and are attempting to somehow still enter it through prompting. The game, in my view, has just changed: it's no longer about just generating the code, but about being thoughtful about what comes next. It's rapid use of a junior to design your system, and if you overdo the rapidness, the junior will give you headaches.

skydhash|14 days ago

> I think there are a ton of people just pulling the lever over and over, instead of stepping back and considering how they should pull the lever

There are deeper considerations, like why pull the lever, or is it even the correct lever? So much AI usage is like seeing someone use a forklift to go to the gym (bypassing the point), to lift a cereal box (overpowered), or to do watchmaking (very much the wrong tool).

Programming languages are languages, yes. But we only use them for two reasons: they can be mapped down to the hardware ISA, and they're human-shaped. The computer doesn't care about a wrong formula as long as it can compute it. So it falls on us to ensure that the correct formula is being computed. And a lot of AI proponents are trying to get rid of that part.

pipes|15 days ago

On using AI assistants: I find that everything is moving so fast that I constantly feel like "I'm doing this wrong". Is the answer simply to dedicate time to experimenting? I keep hearing about "spec-driven design" or "Ralph"; maybe I should learn those? Genuine thoughts and questions, btw.

gnatolf|15 days ago

More specifically regarding spec-driven development:

There's a good reason that most successful examples of tools like openspec are to-do apps, etc. As soon as the project grows to a 'relevant' size or complexity, maintaining specs is just as hard as whatever any other methodology offers. Also, from my brief attempts: similar to human-based coding, we actually do quite well with incomplete specs. So do agents, but they'll shrug at all the implicit things much more than humans do. So you'll see more flip-flopping on things you did not specify, and if you nail everything down hard, the specs get unwieldy: large and overly detailed.

gnatolf|15 days ago

Everybody feels like this, and I think nobody stays ahead of the curve for a prolonged time. There are just too many wrinkles.

But also, you don't have to upgrade every iteration. I think it's absolutely worthwhile to step off the hamster wheel every now and then, just work with your head down for a while, and come back after a few weeks. You'll notice that even though the world didn't stop spinning, you didn't get the whiplash of every rotation.

Our_Benefactors|14 days ago

I don’t think Ralph is worthwhile; at least, the few times I’ve tried to set it up, I spent more time fighting to get the configuration right than if I had simply run the prompt myself. Coworkers had similar experiences. It’s better to set a good allowlist for Claude.
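For context, Claude Code supports per-project permission rules in `.claude/settings.json`; a minimal allowlist along those lines might look like this (the specific commands are illustrative, not a recommendation):

```json
{
  "permissions": {
    "allow": [
      "Bash(npm run lint)",
      "Bash(npm run test:*)"
    ],
    "deny": [
      "Bash(curl:*)",
      "Read(./.env)"
    ]
  }
}
```

Rules like `Bash(npm run test:*)` pre-approve matching shell commands so the agent doesn't stop to ask each time, while `deny` entries block them outright.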

bobthepanda|15 days ago

I think: find what works for you, and everything else is kind of noise.

At the end of the day, it doesn’t matter if a cat is black or white so long as it catches mice.

——

I’ve also found that picking something and learning about it helps me build mental models for picking up other paradigms later, similar to how learning Java doesn’t actually prevent you from, say, picking up Python or JavaScript.

isodev|15 days ago

The addictive nature of the technology persists, though. So even if we say certain skills are required to use it, it must also come with a warning label and be avoided by people with addictive personalities, substance-abuse issues, etc.

mettamage|14 days ago

I have a hypothesis about why it's addictive. I have no data to back it up, other than knowing a lot of addicted people and having studied neuroscience, yet I still think there's something to it. It's definitely not fully true, though.

Addiction occurs because, as humans, we bond with people, but we also bond with things. It could be an activity, a subject, anything. We get addicted because we're bonded to it. Usually this happens because we're not on fertile ground to bond with what we need to bond with (usually a good group of friends).

When I look at addicted people, a lot of them bond with people who have not-so-great values (big house, fast cars, designer clothing, etc.), adopt those values themselves, and get addicted to drugs. These drugs are usually supplied by the very people they bond with. However, they bond with those people in the first place because they were aimless and received little guidance in their upbringing.

I'm just saying all that to make more concrete what I mean about "good people".

Back to LLMs. A lot of us are bonding with them, even if we still perceive them as AI. We're bonding with them because certain emotional needs are not being fulfilled. Enter a computer that will listen to you endlessly and is intellectually smarter than most humans, although it makes very, very dumb mistakes at times (like ordering 1000+ drinks when you ask for a few).

That's where we're at right now.

I've noticed I'm bonded with it.

Oh, and to those who feel this opinion is a bit strong: it is. But consider that we used to joke that "Google is your best friend" when it first came out, and long thereafter. I think there's something to this take, but it's not fully pointed in the right direction.

imiric|15 days ago

> knowing how to use AI coding assistants productively is a skill like any other

No, it's different from other skills in several ways.

For one, the difficulty of this skill is largely overstated. All it requires is basic natural-language reading and writing, the ability to organize work and issue clear instructions, and some relatively simple technical knowledge about managing context effectively, knowing which tool to use for which task, and other minor details. This pales in comparison with the difficulty of learning a programming language and classical programming. After all, the entire point of these tools is to lower the skill required for tasks that were previously inaccessible to many people. The fact that millions of people are now using them, with varying degrees of success for various reasons, is a testament to this.

I would argue that the results depend far more on the user's familiarity with the domain than their skill level. Domain experts know how to ask the right questions, provide useful guidance, and can tell when the output is of poor quality or inaccurate. No amount of technical expertise will help you make these judgments if you're not familiar with the domain to begin with, which can only lead to poor results.

> might be useful now or in the future

How will this skill be useful in the future? Isn't the goal of the companies producing these tools to make them accessible to as many people as possible? If the technology continues to improve, won't it become easier to use, and be able to produce better output with less guidance?

It's amusing to me that people think this technology is another layer of abstraction, and that they can focus on "important" things while the machine works on the tedious details. Don't you see that this is simply a transition period, and that whatever work you're doing now could eventually be done better/faster/cheaper by the same technology? The goal is to replace all cognitive work. Just because this is not entirely possible today doesn't mean that it won't be tomorrow.

I'm of the opinion that this goal is unachievable with the current tech generation, and that the bubble will burst soon unless another breakthrough is reached. In the meantime, your own skills will continue to atrophy the more you rely on this tech, instead of on your own intellect.

ithkuil|13 days ago

> The fact that millions of people are now using them, with varying degrees of success for various reasons, is a testament of this.

I do agree with you that, by design, this new tool lowers the barrier to entry, etc.

But I just want to state the obvious: billions of kids play with a ball; it's not that hard. Yet far fewer people are good soccer players.

> The goal is to replace all cognitive work. Just because this is not entirely possible today, doesn't mean that it won't be tomorrow.
> [..]
> I'm of the opinion that this goal is unachievable with the current tech generation
> [..]
> In the meantime, your own skills will continue to atrophy the more you rely on this tech [..]

Here I don't quite follow.

I agree that if this tech is ready to completely replace you, you won't need to use your brain. But provided it is not there yet (like, at all), your intellect is needed quite a lot to get anything more than toys out of it.

The question is: do you benefit from using it or not? Can you build faster or better by applying these tools in the appropriate way, or should you just ignore them and keep doing things the way they were done up until a few months ago?

This is a legit question.

My point is: in order to answer this question, I cannot base my intuition only on vague first principles about what this tech stack ought to be able to do, or on what other people say it's able to do, or on what I suspect it will never be able to do. I need to touch it, to learn how to use it, just like any other tool. That's the only way I can truly get a sensible answer. And as with any other skill, I'm fully aware that I can't devote just a few minutes to trying it out and then reach any conclusion.

EDIT: I do share a general concern about how new generations are going to achieve full-picture understanding if they're exposed to these tools as the main approach to software production. I come to this after a long career in systems programming, so I don't personally see this as a threat of atrophying my own skills; but I do share a rather undefined sense of concern about where this is going.

Our_Benefactors|14 days ago

> In the meantime, your own skills will continue to atrophy the more you rely on this tech, instead of on your own intellect

You’re right. I’m going back to writing assembly. These compilers have totally atrophied my ability to write machine code!