item 45718987

monch1962 | 4 months ago

Old fart here...

I started coding in the 70s, loved it then, still love it now and LOVING the emergence of Gen AI tools.

For perspective, the IT industry went through a similar change with the emergence of search engines ~30 years ago. At that time, a big part of the value of a software "expert" was in their ability to remember and recall lots of info (most of it of dubious value, to be fair). These experts usually had shelves of well-thumbed books on all sorts of topics, and could recall obscure info from these books seemingly at will. With the emergence of AskJeeves, AltaVista and eventually Google, suddenly nobody needed to remember anything OR even know where to find it - with a simple search, you could get nearly all the info you needed.

I can still remember the panicked response to this brutal change from the senior IT people I worked with at the time...

Did the demand for skilled developers dry up? No.

Nor did it end with:

- introduction of COBOL (designed so that non-coders could write code),

- PCs (surely leading to the end of systems programming as a career),

- spreadsheets (so accountants no longer needed programmers),

- 4GLs (designed to greatly simplify coding; report writing in particular),

- Visual BASIC (so the world would no longer need C programmers; anyone could learn to write BASIC),

- Microsoft SQL Server (nobody would need mainframe databases any more, so all those mainframe jobs would disappear),

- object oriented coding (all those code reuse possibilities! Very quickly programming should devolve to just glueing together other peoples' code),

- open source (because inevitably any tool of value would soon have a competitor that was free, destroying the value proposition of companies that wrote software to sell),

- Linux (how could Windows compete with free? Shed a tear for all those soon-to-be-unemployed Windows experts),

- NoSQL (because the need for "legacy" databases like Oracle, DB2, Postgres, MySQL etc. would surely go away),

- etc., etc., etc.

The reality is that you still need a grounding in software development to do coding well, even with AIs. I'm absolutely loving how quickly I can create solid code with the assistance of Gen AI - lots of tasks that used to take me a week I can now knock over in a few hours.

I also notice how many people are struggling with how to use Gen AI tools for coding tasks - my take is there are two distinct skills you need: knowledge of how to do software development well, and knowledge of how to use Gen AI tools for coding. Having the first doesn't automatically lead to the second - you have to put in the time to learn about Gen AI, THEN work out how to fit Gen AI tools around your current workflow, THEN work out how to optimise the way you work with your new idiot savant buddy that has perfect recall.

That whole process (new tool appears -> learn about it -> work out how to fit it into my current workflow -> optimise my workflow) has basically been my entire career in a nutshell.

People have been predicting the demise of programmers for my entire career (40+ years now), and so far they've been wrong every time. For each new disruption that appears, the key has been to embrace it and adapt how you work accordingly.

Gen AI may indeed be different and kill off all programming careers overnight, but so far I'm not seeing it.


imiric | 4 months ago

> Gen AI may indeed be different and kill off all programming careers overnight, but so far I'm not seeing it.

Well, we're only ~5 years into the current hype cycle, so it's difficult to predict the long-term impact of the technology.

That said, I do think it is substantially different from the examples you mentioned.

For one, it is generally applicable. It's not an iterative or generational improvement over what came before; it is a paradigm shift in many ways: how software is produced, by whom, the quality of the product, the time, effort, and cost needed to produce it, etc.

Secondly, while it might not lead to the demise of all programming careers, and certainly not overnight, it will significantly impact the market value of traditional programming in the short term, and, like any new technology, it will also open doors to new careers and specializations for humans. We're seeing this play out today.

But there are a few problems with this:

- Since software is being turned into a commodity and the skills and resources required to produce it are much reduced, there will be a flood of poorly made software, and the average quality will go down. Picture SEO scams and spam dialed up to 11, encompassing every part of our existence, not just the web.

- Those new careers for humans are highly specialized. All jobs will essentially involve being an assistant to the "AI", and specializing in related technologies. A "systems engineer", "frontend developer", "designer", "data analyst", etc., will all boil down to a role revolving around "AI" instead. People who don't like this type of work? Tough luck. Go sell your artisanally made programs to the niche group of people crazy enough to care about it, and good luck making a living out of it.

- Those new careers for humans are only temporary. Once "AI" gets capable enough to require less manual steering and intervention from humans, the market value for that type of work will collapse as well. The only human jobs then will be to actually create "AI". And once "AI" is self-sufficient to improve itself, we get the singularity, and then pick your favorite sci-fi scenario from there. It's debatable whether this will come to pass, and whether we're on the right track for it with the current tech, but that's certainly the goal we're aiming for.

So, yeah, I don't buy the argument that this is the same as any other tech. It's much, much different, and it's frankly troubling that it's getting downplayed as just another step on the technological ladder. The long-term impact of this is something that should concern us all, and the worst thing we can do is to give free rein to companies to decide that future for us.