Rather than feeling scared, I think you can use this to guide you productively.
Look through the prompts he is using. Do they strike you as something a random person could produce? Not really. They show that Simon has an excellent understanding of SQLite and of how to do benchmarks. All ChatGPT does for him is speed up the typing. My experience has been that the quality of the output is heavily correlated with the quality of the input: good, clear prompts give good output. I postulate that if Simon had a worse understanding of benchmarks and SQLite, he would not have gotten output this good.
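To make that concrete: a prompt like "benchmark INSERT throughput with and without an explicit transaction" already encodes the domain knowledge that transaction batching dominates SQLite write performance. A rough sketch of the kind of micro-benchmark such a prompt might yield (all names here are illustrative, not Simon's actual code):

```python
import sqlite3
import time

def bench_inserts(batch_in_transaction: bool, n: int = 10_000) -> float:
    """Time n single-row INSERTs, with or without one enclosing transaction."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
    start = time.perf_counter()
    if batch_in_transaction:
        with conn:  # one transaction wrapped around all the inserts
            for i in range(n):
                conn.execute("INSERT INTO t (val) VALUES (?)", (f"row{i}",))
    else:
        conn.isolation_level = None  # autocommit: one transaction per insert
        for i in range(n):
            conn.execute("INSERT INTO t (val) VALUES (?)", (f"row{i}",))
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed

# Knowing that these are the two cases worth comparing is exactly the
# expertise the prompt carries; ChatGPT only saves the typing.
print(f"autocommit: {bench_inserts(False):.3f}s  batched: {bench_inserts(True):.3f}s")
```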
If you as a developer make your money by doing what ChatGPT is doing (turning clear instructions into working code), then you are going to be automated away. If you make your money by having a good understanding of the tools you use and by communicating that understanding clearly, then in the future you'll simply have less typing to do.
The silver lining is that, even without GPT, a good understanding of your tools and clear communication were always the more important skills to learn. ChatGPT just cements that existing order.
For the CRUD app writers - yeah, this is bad news. But it was always clear that that was a bubble. If you're actually solving new problems, it works more like code completion.
Unix was written in assembly with the ed editor. Even Notepad and any modern language probably represent a 100x increase in productivity compared to that - much more than ChatGPT can offer. The field of software engineering will only grow from this.
From what I've seen, most places have way more work than available programmers. Jira tickets sit unresolved for years. Maybe with the increased productivity we can actually clear our backlogs one day.
Not sure about you, but I also have a huge backlog of projects that don't exist yet and that I would like to create. But usually they're too complex to do over a weekend, and I don't have that much time to spare.
I already tried to approach one of those projects, but I quickly failed: GPT's knowledge was too outdated for the library I needed to use, and I couldn't figure out how to "patch" its knowledge.
In this case I'm not very familiar with the stack I needed for that project (and I think future versions of these LLMs could get better at bridging that gap), but for tasks I'm more familiar with, I've noticed a significant increase in productivity.
I hope there’s still room for improvement. I haven’t been able to get ChatGPT to do anything coherent for stuff with more than ~5 functions or logic that is even slightly complex. It likewise frequently stumbles/hallucinates when trying to integrate revisions. This is all using GPT-4 as well.
What it’s absolutely fantastic at is the small, one-off scripts that were tedious and time-consuming to write, integrating well-specified changes into existing code, tests, and boilerplate config stuff.
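The "small, one-off script" category is worth making concrete, since it's where the time savings are least ambiguous. A typical example of the kind of throwaway utility people describe offloading (this particular script is my illustration, not from the thread):

```python
import csv
import io
import json

def csv_to_json(csv_text: str) -> str:
    """Convert CSV text (with a header row) into a pretty-printed JSON array
    of objects, one object per row, keyed by the header columns."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows, indent=2)

# Five minutes of fiddly stdlib lookup by hand; seconds via a prompt.
print(csv_to_json("name,role\nada,engineer\ngrace,admiral\n"))
```

Tedious but fully specified, with an obvious way to eyeball the result - exactly the shape of task where review cost stays low.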
I know this is all “for now” talk, but it will be interesting to see how quickly, or if at all, it can get to production-ready code in the absence of pretty thorough review by someone who knows how stuff actually works. Natural language is a fantastic new interface, but (for now) you still have to know how to describe stuff that actually works.
My experience has been similar. Not to say it hasn't been helpful in 'larger' apps, but I've needed to scaffold/stub out the functions and then verify that the GPT-4-generated code passes the appropriate tests.
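That stub-then-verify workflow might look like the following sketch: the signature, docstring, and assertions are human-written, and only the marked body is delegated to the model (the function here is a made-up example):

```python
def slugify(title: str) -> str:
    """Hand-written stub: lowercase the words of a title and join them
    with hyphens, dropping tokens that aren't purely alphanumeric."""
    # --- model-generated body starts here ---
    return "-".join(word.lower() for word in title.split() if word.isalnum())
    # --- model-generated body ends here ---

# Hand-written checks the generated body must pass before it's accepted.
assert slugify("Hello World") == "hello-world"
assert slugify("A  B") == "a-b"
```

The tests act as the contract; the model fills in the parts that are cheap to regenerate and cheap to reject.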
I'm not scared. ChatGPT produces shitty and insecure code which nobody should just trust.
/edit: also, most of the stuff I do is so fringe that ChatGPT probably doesn't know about it. I'm currently upgrading our Spring Boot/Spring Security stuff to 3/6, and ACL is broken. There isn't a single sample or SO question on the internet for that. So how can ChatGPT solve this? Answer: it won't.
> ChatGPT produces shitty and insecure code which nobody should just trust.
Today. What about a couple of years from now? We're seeing stuff that was sci-fi 5 years ago; who knows what we'll have in another 5? I think that pretty much everyone who's not very close to retirement and whose job is performed while sitting in a chair should be concerned about a not-so-distant future where their job either disappears or is transformed radically.
I am. And I fear developers using Copilot, ChatGPT etc. are helping to make this happen.
I'm not a luddite - actually quite the opposite, I'm all for the machines doing all the work. The thing is, I fear this is going to happen so quickly that even if governments and institutions had the intention to do something about it, they wouldn't be able to do it in time. So whereas I wouldn't do anything to stop progress on this front, I also don't think I'd be willing to help get there (i.e. by using Copilot and such).
I also fear that even if I don't lose my job it will change drastically, like the story (from Reddit, I think) that made it to HN a few days ago, where a passionate 3D artist was now prompting Midjourney or whatever model and then doing some Photoshop work on top of it.
That last part is what I’m expecting: a lot of developers have been able to convince ourselves that our interests are aligned with upper management’s due to above-average pay and treatment, but that view is pretty heavily skewed one way, and a lot of companies resent the labor cost and lack of obsequiousness. Those places are going to see this as an opportunity to lower wages by trying to turn the job into “just cleaning up what GPT creates”.