top | item 44660658

taylorallred | 7 months ago

One thing that has always worried me about AI coding is the loss of practice. To me, writing the code by hand (including the boilerplate and things I've done hundreds of times) is the equivalent of Mr. Miyagi's paint-the-fence. Each iteration gets it deeper into your brain and having these patterns as a part of you makes you much more effective at making higher-level design decisions.

biophysboy|7 months ago

A retort you often hear is that prior technologies, like writing or the printing press, may have stunted our calligraphy or rhetorical skills, but they did not stunt our capacity to think. If anything, they magnified it! Basically, the whole Steve Jobs' bicycle-for-the-mind idea.

My issue with applying this reasoning to AI is that prior technologies addressed bottlenecks in distribution, whereas this more directly attacks the creative process itself. Stratechery has a great post on this, where he argues that AI is attempting to remove the "substantiation" bottleneck in idea generation.

Doing this for creative tasks is fine ONLY IF it does not inhibit your own creative development. Humans only have so much self-control/self-awareness.

arscan|7 months ago

I’ve been thinking of LLMs a bit like a credit-card-for-the-mind, it reduces friction to accessing and enabling your own expertise. But if you don’t have that expertise already, be careful, eventually it’ll catch up to you and a big bill will be due.

margalabargala|7 months ago

I still don't think that's true. It's just the medium that changes here.

A better analogy than the printing press, would be synthesizers. Did their existence kill classical music? Does modern electronic music have less creativity put into it than pre-synth music? Or did it simply open up a new world for more people to express their creativity in new and different ways?

"Code" isn't the form our thinking must take. To say that we all will stunt our thinking by using natural language to write code, is to say we already stunted our thinking by using code and compilers to write assembly.

cess11|7 months ago

Bad examples. Computer keyboards killed handwriting, the Internet killed rhetoric.

yoyohello13|7 months ago

There are many times when I’ll mull over a problem in my head at night or in the shower. I kind of “write the code” in my head. I find it very useful sometimes. I don’t think it would be possible if I didn’t have the language constructs ingrained in my head.

Jonovono|7 months ago

I find I do this more now with AI than before.

donsupreme|7 months ago

Many analogs to this IRL:

1) I can't remember the last time I wrote something meaningfully long with an actual pen/pencil. My handwriting is beyond horrible.

2) I can no longer find my way driving without a GPS. Reading a map? lol

lucianbr|7 months ago

If you were a professional writer or driver, it might make sense to be able to do those things. You could still do without them, but they might make you better in your trade. For example, I sometimes drive with GPS on in areas I know very well, and the computer provided guidance is not the best.

0x457|7 months ago

> I can't remember the last time I write something meaningfully long with an actual pen/pencil. My handwriting is beyond horrible.

That's a skill that depends on motor functions of your hands, so it makes sense that it degrades with lack of practice.

> I can't no longer find my way driving without a GPS. Reading a map? lol

Pretty sure what that actually means in most cases is "I can go from A to B without GPS, but the route will be suboptimal, and I will have to keep more attention to street names"

If you ever had the joy of printing MapQuest directions or using a paper map, I'm sure those people still have the skill; it might just take them longer. I'm good at reading mall maps tho.

danphilibin|7 months ago

On 2) I've combatted this since long before AI by playing a game of "get home without using GPS" whenever I drive somewhere. I've definitely maintained a very good directional sense by doing this - it forces you to think about main roads, landmarks, and cardinal directions.

goda90|7 months ago

I don't like having location turned on on my phone, so it's a big motivator to see if I can look at the map and determine where I need to go in relation to familiar streets and landmarks. It's definitely not "figure out a road trip with just a paper map" level wayfinding, but it helps for learning local stuff.

stronglikedan|7 months ago

I couldn't imagine operating without a paper and pen. I've used just about every note taking app available, but nothing commits anything to memory like writing it down. Of course, important writings go into the note app, but I save time inputting now and searching later if I've written things down first.

eastbound|7 months ago

> find my way driving without a GPS. Reading a map? lol

Most people would still be able to. But we fantasize about the usefulness of maps. I remember myself on the Paris circular highway (at the time 110km/h, not 50km/h like today), the map on the steering wheel, super dangerous. You say you’d miss GPS features on a paper map, but back then we had the same problems: It didn’t speak, didn’t have the blinking position, didn’t tell you which lane to take, it simplified details to the point of losing you…

You won’t become less clever with AI: You already have Youtube for that. You’ll just become augmented.

okr|7 months ago

Soldering transistors by hand was a thing too, once. But these days, I'm not sure people want to keep up anymore. Many trillions of transistors later. :)

I like this zooming in and zooming out, mentally. At some point I can zoom out another level. I miss coding. While I still code a lot.

cmiles74|7 months ago

I think this is a fundamentally different pursuit. The intellectual part was figuring out where the transistors would go; that's the part that took the thinking. Letting a machine do it just lets you test quicker and move on to the next step. Although, of course, if you only solder your transistors by hand once a year you aren't likely to be very good at it. ;-)

People say the same thing about code but there's been a big conflation between "writing code" and "thinking about the problem". Way too often people are trying to get AI to "think about the problem" instead of simply writing the code.

For me, personally, the writing the code part goes pretty quick. I'm not convinced that's my bottleneck.

lucianbr|7 months ago

There are definitely people who solder transistors by hand still. Though most not for a living. I wonder how the venn diagram looks together with the set of people designing circuits that eventually get built by machines. Maybe not as disjoint as you first imagine.

Ekaros|7 months ago

If you start designing circuits with an LLM (can they even do that yet?), will you ever learn to do it yourself, or fix it when it goes wrong and the magic smoke comes out after the robot made it for you?

ozten|7 months ago

"Every augmentation is an amputation" -- Marshall McLuhan

danielvaughn|7 months ago

Well there goes a quote that will be stuck in my head for the rest of my life.

jxf|7 months ago

Q: Where did he say this? I think this may be apocryphal (or a paraphrasing?) as I couldn't find a direct quote.

add-sub-mul-div|7 months ago

Generalize this to: what's it going to look like in ten years when the majority of our society has outsourced general thinking and creativity rather than practicing it?

sim7c00|7 months ago

i already see only electric bikes and chatGPT answers from ppl perpetually glued to their phone screens... soon no one can walk and everyone has a red and green button on their toilet-tv-lounge-chair watching the latest episode of Ow my b**! ;D

szundi|7 months ago

[deleted]

beefnugs|7 months ago

They want you to become an expert at the new thing: knowing how to set up the context with perfect information. Which is arguably as much if not more work than just programming the damn thing.

Which theoretically could actually be a benefit someday: if your company does many similar customer deployments, you will eventually be more efficient. But if you are doing custom code meant just for your company... there may never be an efficiency increase.

dimal|7 months ago

I still do a lot of refactoring by hand. With vim bindings it’s often quicker than trying to explain to a clumsy LLM how to do it.

For me, refactoring is really the essence of coding. Getting the initial version of a solution that barely works is necessary but less interesting to me. What’s interesting is the process of shaping that v1 into something that’s elegant and fits into the existing architecture. Sanding down the rough edges, reducing misfit, etc. It’s often too nitpicky for an LLM to get right.

skydhash|7 months ago

There are lots of project templates and generators that will get you close to where you can start writing business code and not just boilerplate.

cwnyth|7 months ago

My LLM-generated code has so many bugs in it, that I end up knowing it better since I have to spend more time debugging/figuring out small errors. This might even be better: you learn something more thoroughly when you not only practice the right answers, but know how to fix the wrong answers.

bluefirebrand|7 months ago

That is absurd

If you write it by hand you don't need to "learn it thoroughly", you wrote it

There is no way you understand code better by reading it than by creating it. Creating it is how you prove you understand it!

vlod|7 months ago

For me the process of figuring out wtf I need to do and how I'm going to do it is my learning process.

For beginners, I think this is a very important step in learning how to break down problems (into smaller components) and iterate.

jgb1984|7 months ago

What worries me more is the steep decline in code quality. The Python and JavaScript output I've seen the supposedly best LLMs generate is inefficient, overly verbose, and needlessly commented at best, and simply full of bugs at worst. In the best case they're glaringly obvious bugs; in the worst case they're subtle ones that will wreak havoc for a long time before they're eventually discovered, but by then the developers' grasp of the codebase will have slipped far enough to prevent them from being competent enough to solve the bugs.
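To make the complaint concrete, here's a contrived Python illustration (my own, not from any real LLM transcript) of the over-commented, verbose style being described, next to an idiomatic equivalent:

```python
def get_even_numbers_verbose(numbers):
    # Initialize an empty list to store results
    result = []
    # Loop over each number in the input list
    for number in numbers:
        # Check if the number is even
        if number % 2 == 0:
            # Append the even number to the result list
            result.append(number)
    # Return the final list
    return result

def evens(numbers):
    return [n for n in numbers if n % 2 == 0]

assert get_even_numbers_verbose([1, 2, 3, 4]) == evens([1, 2, 3, 4]) == [2, 4]
```

Both behave identically; the difference is purely in how much noise a reviewer has to wade through.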

There is no doubt in my mind that software quality has taken a nosedive everywhere AI has been introduced. Our entire industry is hallucinating its way into a bottomless pit.

rossant|7 months ago

I'm very cautious about using LLM-generated code in production, but for one-off throwaway scripts that generate output I can manually verify, LLMs are a huge time saver.

ge96|7 months ago

Tangent: there was this obnoxious effect for typing in editors where the characters would explode. It makes me think of a typewriter, banging away at every character of a piece of code.

I imagine people can start making code (probably already are) where functions/modules are just boxes as a UI and the code is not visible, test it with in/out, join it to something else.

When I'm tasked to make some CRUD UI I plan out the chunks of work to be done in order and I already feel the rote-ness of it, doing it over and over. I guess that is where AI can come in.

But I do enjoy the process of making something even like a POSh camera GUI/OS by hand..

lupire|7 months ago

Do you write a lot of assembler, to make you more effective at higher-level design?

taylorallred|7 months ago

Writing a lot of assembler would certainly make me more effective at designing systems such as compilers and operating systems. As it stands, I do not work on those things currently. They say you should become familiar with at least one layer of abstraction lower than where you are currently working.

segmondy|7 months ago

Doesn't worry me. I believed AI would replace developers and I still do to some degree. But AI is going to lack context, not just in business domain but how it would intersect with the tech side. Experienced developers will be needed. The vibe coders are going to get worse and will need experienced developers to come fix the mess. So no worries, the only thing that would suck would be if the vibe coders earn more money and experienced hand crafting devs are left to pick up the crumbs to survive.

tjr|7 months ago

I'm concerned about this also. Even just reading about AI coding, I can almost feel my programming skills start to atrophy.

If AI tools continue to improve, there will be less and less need for humans to write code. But -- perhaps depending on the application -- I think there will still be need to review code, and thus still need to understand how to write code, even if you aren't doing the writing yourself.

I imagine the only way we will retain these skills is by deliberately choosing to do so. Perhaps not unlike choosing to read books even if not required to do so, or choosing to exercise even if not required to do so.

lucianbr|7 months ago

How could advances in programming languages still happen when nobody is writing code anymore? You think we will just ask the AI to propose improvements, then evaluate them, and if they are good ask the AI to make training samples for the next AI?

Maybe, but I don't think it's that easy.

sandeepkd|7 months ago

Along the same lines, it's probably little more than that. When it comes to software development, every iteration of execution/design is supposedly either faster or better, based on prior learnings from things you have done yourself or observed very carefully.

dfee|7 months ago

I’m concerned about becoming over reliant on GPT for code reviews for this reason (as I learn Rust).

marcosdumay|7 months ago

My practice in writing assembly is so lost by now that it's not much different than if I never learned it. Yet, it's not really a problem.

What is different about LLM-created code is that compilers work. Reliably and universally. I can just outsource the job of writing the assembly to them and don't need to think about it again. (That is, unless you are in one of those niches that require hyper-optimized software. Compilers can't reliably give you that last 2x speed-up.)

LLMs, in turn, will never be reliable. Their entire goal is opposite to reliability. IMO, the losses are still way higher than the gains, and it's questionable whether this is an architectural premise that will ever change.

ethan_smith|7 months ago

The "paint-the-fence" analogy is spot-on, but AI can be the spotter rather than replacement - use it for scaffolding while deliberately practicing core patterns that strengthen your mental models.

wussboy|7 months ago

I suspect when it comes to human mastery there is no clear dividing line between scaffolding and core, and that both are important.

giancarlostoro|7 months ago

As long as you understand the scaffolding and its implications, I think this is fine. Using AI for scaffolding has been the key thing for me. If I have some obscure idea I want to build up using Django, I braindump to the model what I want to build, and it spits out models, and what not.

Course, then there's lovable, which spits out the front-end I describe, which it is very impressively good at. I just want a starting point, then I get going, if I get stuck I'll ask clarifying questions. For side projects where I have limited time, LLMs are perfect for me.

Lerc|7 months ago

I don't get this with boilerplate. To me, boilerplate code is the code that you have to write to satisfy some predefined conditions, and it has little to do with the semantics of the code I am actually writing. I'm fine with AI writing this stuff for me if it does it reliably, or if the scale is small enough that I can easily spot and fix the errors. I don't see that aspect of coding as much more than typing.
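As a concrete (hypothetical) illustration of boilerplate in this sense, here is hand-written Python ceremony next to the `dataclasses` decorator that generates the same methods; only the field names carry any actual meaning:

```python
from dataclasses import dataclass

# The manual version: __init__/__repr__/__eq__ demanded by convention,
# not by the problem being solved.
class PointManual:
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __repr__(self):
        return f"PointManual(x={self.x}, y={self.y})"

    def __eq__(self, other):
        return isinstance(other, PointManual) and (self.x, self.y) == (other.x, other.y)

# The same semantics, with the boilerplate generated for us.
@dataclass
class Point:
    x: float
    y: float

assert PointManual(1, 2) == PointManual(1, 2)
assert Point(1, 2) == Point(1, 2)
```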

On the other hand I do a lot more fundamental coding than the median. I do quite a few game jams, and I am frequently the only one in the room who is not using a game engine.

Doing things like this, I have written so many GUI toolkits from scratch now that it's easy enough for me to make something anew in the middle of a jam.

For example https://nws92.itch.io/dodgy-rocket In my experience it would have been much harder to figure out how to style scrollbars to be transparent with in-theme markings using an existing toolkit than writing a toolkit from scratch. This of course changes as soon as you need a text entry field. I have made those as well, but they are subtle and quick to anger.

I do physics engines the same way, predominantly 2d, (I did a 3d physics game in a jam once but it has since departed to the Flash afterlife). They are one of those things that seem magical until you've done it a few times, then seem remarkably simple. I believe John Carmack experienced that with writing 3d engines where he once mentioned quickly writing several engines from scratch to test out some speculative ideas.
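A minimal sketch of the kind of core loop being described, in Python (my own toy example, not the commenter's code): semi-implicit Euler integration for a single 2D particle, with gravity and a crude floor bounce.

```python
GRAVITY = (0.0, -9.81)  # m/s^2

def step(pos, vel, dt):
    """Advance one particle by dt seconds; bounce off the floor at y=0."""
    # Semi-implicit Euler: update velocity first, then position with
    # the new velocity. More stable than plain explicit Euler.
    vx = vel[0] + GRAVITY[0] * dt
    vy = vel[1] + GRAVITY[1] * dt
    x = pos[0] + vx * dt
    y = pos[1] + vy * dt
    if y < 0.0:  # crude floor collision: clamp, reflect, and damp
        y, vy = 0.0, -vy * 0.5
    return (x, y), (vx, vy)

# Drop a particle from 10m with some horizontal drift, at 60 steps/sec.
pos, vel = (0.0, 10.0), (1.0, 0.0)
for _ in range(100):
    pos, vel = step(pos, vel, 1 / 60)
```

Real engines add broad-phase collision detection, constraint solving, and stable contact handling, but the integration step at the heart of them is about this small.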

I'm not sure if AI presents an inhibitor here any more than using an engine or a framework does. They both put some distance between the programmer and the result, and as a consequence the programmer starts thinking in terms of the interface through which they communicate instead of how the result is achieved.

On the other hand, I am currently using AI to help me write a DMA chaining process. I initially got the AI to write the entire thing. The final code will use none of that emitted output, but it was sufficient for me to see what actually needed to be done. I'm not sure if I could have done this on my own; AI certainly couldn't have done it on its own. Now that I have (almost (I hope)) done it once in collaboration with AI, I think I could write it from scratch myself should I need to do it again.

I think AI, Game Engines, and Frameworks all work against you if you are trying to do something abnormal. I'm a little amazed that Monument Valley got made using an engine. I feel like they must have fought the geometry all the way.

I think this jam game I made https://lerc.itch.io/gyralight would be a nightmare to try and implement in an engine. Similarly I'm not sure if an AI would manage the idea of what is happening here.