
blindhippo | 6 days ago

I've been using these tools on a daily basis for nearly a year and a half. They've become an integral part of my toolbox for solving problems.

But writing code was never much more than 35-40% of my job while working for companies/others. Most of my time has always gone toward communication, design, and validation. None of those three are particularly vulnerable to mass AI automation except in the most trivial of scenarios, and I have not seen evidence that this has changed in over 2 years of so-called "improvements".

My "exit plan" ultimately is to be one of the engineers capable of using these tools to scale my impact accordingly, so I can focus on higher-order problem solving, which ultimately is what is most valuable. I would be more concerned if I were in marketing/sales or, frankly, middle management.

Maybe this is just "copium" on my part, who knows, this sector is moving fast.


mekael | 6 days ago

I was discussing this with one of my counterparts today, and we both agreed that writing the code is the easy part; actually knowing what to do, getting requirements sorted out, and working with our compliance team are the hard parts. That said, I'm in a pretty highly regulated industry, so it might just be that.

df2dd | 6 days ago

Envisioning what should exist is always the hardest part.

If LLMs provide substantial material that lets one produce what one envisions faster, that is great. But LLMs will not be doing the envisioning. Most humans are already poor at that, which is why there have been so few real 'visionaries' in history.

Envisioning always requires deep thinking. If LLMs eat away at a human's ability to sit and think, envisioning solutions will become harder. So you'll see more stuff produced, but largely more crap.