top | item 39743989

clktmr | 1 year ago

As long as there is no AGI, no software engineer needs to be worried about their job. And when there is, obviously everything in every field will change and this discussion will soon be futile.

pizzafeelsright|1 year ago

I would argue future engineers should be worried a bit. We no longer need to hire new developers.

I was not trained professionally, yet I'm writing production code that's passing code reviews in languages I've never used. I write a prompt, validate that the output compiles and passes tests, have it explain the code so I can confirm it does what I expected, have it write documentation and the PR, and I'm seen as a competent contributor. I can't pass leetcode level 1, yet here I am being invited to speak to developers.

Velocity goes up and the cost of features drops. This is good. I'm seeing at least 10-to-1 output compared with a year ago, based on integrating these new tools.

visarga|1 year ago

Yeah, it sounds to me like your teammates are going to pick up the tab in the end, when subtle errors become 10x harder to repair, or else you're working on toy projects where correctness doesn't really matter.

packetlost|1 year ago

> I'm writing production code that's passing code reviews in languages I never used

Your coworkers likely aren't doing a very good job at reviewing, but also I don't blame them. The only way to be sure code works is to use it for its intended task. Brains are bad interpreters, and LLMs are extremely good bullshit generators. If the code makes it to prod and works, good. But honestly, if you aren't just pushing DB records around or slinging HTML, I doubt it'll be good enough to get you very far without taking down prod.

acedTrex|1 year ago

I have yet to see either copilot or gpt4 generate code that I would come close to accepting in a PR from one of my devs, so I struggle to imagine what kind of domain you are in that the code it generates actually makes it through review.

supriyo-biswas|1 year ago

To be fair, Leetcode was never a good indicator of developer skills, though primarily because of the time pressure and the restrictive format that dings you for asking questions about the problem.

robotnikman|1 year ago

I have accepted using these tools to help when it comes to generating code and improving my output. However when it comes to dealing with more niche areas (in my case retail technology) it falls short.

You still need that domain knowledge of whatever you are writing code for or integrating with, especially if the technology is more niche, or the documentation was never made publicly available and scraped by the AI.

But it is great when it comes to writing boilerplate code, or when working with very commonly used frameworks (like front-end JavaScript frameworks in my case).

pphysch|1 year ago

> passes tests

Okay, so you are just kicking the can down the road to the test engineers. Now your org needs to spend more resources on test engineering to really make sure the AI code doesn't fuzz your system to death.

If you squint, using a language compiler is analogous to writing tests for generated code. You are really writing a spec and having something automatically generate the actual code that implements the spec.
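Tests-as-spec can be made concrete with a small sketch (Python; `merge_intervals` is a hypothetical stand-in for generated code): the properties below are the spec, and any implementation, human- or LLM-written, is acceptable exactly when it passes them.

```python
import random

def merge_intervals(intervals):
    """Hypothetical 'generated' implementation: merge overlapping intervals."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

def check_spec(intervals):
    """The 'spec': properties any correct implementation must satisfy."""
    result = merge_intervals(intervals)
    # Output intervals are sorted and strictly disjoint.
    for (_, b), (c, _) in zip(result, result[1:]):
        assert b < c, "output intervals must not touch or overlap"
    # Each input endpoint is covered by the output iff the input covered it.
    for p in {p for s, e in intervals for p in (s, e)}:
        in_input = any(s <= p <= e for s, e in intervals)
        in_output = any(s <= p <= e for s, e in result)
        assert in_input == in_output, "coverage must be preserved"

# Property-style check over random inputs (fixed seed for reproducibility).
random.seed(0)
for _ in range(100):
    n = random.randint(0, 8)
    check_spec([tuple(sorted((random.randint(0, 20), random.randint(0, 20))))
                for _ in range(n)])
print("spec holds on 100 random cases")
```

The point of the analogy: the property checks never mention *how* the merging is done, only what must hold afterward, which is the same relationship a type-checking compiler has to the code it accepts.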

doktrin|1 year ago

This doesn’t vibe with my experience at all. We also use LLMs and it’s exceedingly rare that a non-trivial PR/MR gets waved through without comment.

butlike|1 year ago

You should create a vfx character and really pizazz up the talk. Let it run and narrate the speech on a huge screen in an auditorium.

jayd16|1 year ago

I wonder if the reviewers are just using GPT as well.

kaba0|1 year ago

Meanwhile I’m paid for editing a single line of code in 2 weeks, and nothing less than singularity will replace me.

But sure, call me back when AI will actually reason about possible race conditions, instead of spewing out the definition of one it got from wikipedia.
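For what it's worth, here is the kind of thing being asked for, sketched in Python (names and numbers are illustrative): a classic lost-update race. `counter += 1` compiles to a separate read, add, and store, so two threads can interleave between those steps and lose updates; a lock serializes the read-modify-write.

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n, use_lock):
    global counter
    for _ in range(n):
        if use_lock:
            with lock:
                counter += 1  # read-modify-write is atomic w.r.t. the lock
        else:
            counter += 1      # another thread can interleave mid-update

def run(use_lock, n=100_000, workers=4):
    """Run `workers` threads, each incrementing the counter `n` times."""
    global counter
    counter = 0
    threads = [threading.Thread(target=increment, args=(n, use_lock))
               for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

# With the lock the result is deterministic; without it, updates can be lost.
assert run(use_lock=True) == 400_000
print("unlocked result:", run(use_lock=False), "(expected 400000)")
```

Note that on CPython the unlocked run may or may not actually lose updates, depending on the interpreter version and switch interval, which is exactly the nondeterminism that makes races hard to reason about from the code alone.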

lainga|1 year ago

Who's "we"?

l3mure|1 year ago

Post some example PRs.

adrianN|1 year ago

You don’t have to completely replace people with machines to destroy jobs. It suffices if you make people more effective so that fewer employees are needed.

63|1 year ago

The number of people and businesses that could use custom software, if it were cheaper and easier to develop, is nearly infinite. If software developers get more productive, demand will increase.

Workaccount2|1 year ago

Or it lowers the bar for successfully doing such work, so that the field opens up to many more workers.

Many software devs will likely have job security in the future, but those $180k salaries are probably much less secure.

MrBuddyCasino|1 year ago

If software developers become more effective, demand will also rise, as they become profitable in areas where previously they weren't. Which of those two effects outpaces the other is an open question.

zeroonetwothree|1 year ago

Just like when IDEs made programmers more effective so that fewer were needed. Oh wait, the opposite happened.

Toutouxc|1 year ago

This has been my cope mantra so far. I don't mind if my job changes a lot (and ideally loses the part I dislike the most: writing the actual code), and if I find myself in a position where my entire skillset doesn't matter at all, then, well, a LOT of people are in trouble.

tomashubelbauer|1 year ago

I have seen programmers say before that they dislike writing code, and I wonder what the ratio of people who dislike it to people who like it is. For me, writing code is one of the most enjoyable aspects of programming.

kossTKR|1 year ago

If you dislike writing code were you pushed into this field by family, education or because of money?

Because not liking code and being a dev is absolutely bizarre to me.

One of the most amazing things about being able to "develop", in my view, is exactly those rare moments where you just code away, time flies, you fix things, iterate, and organize your project completely in the zone. Just like when I design, paint, play music, or do sports uninterrupted: it's that flow state.

In principle I like the social aspects, but often they are the shitty part because of business politics, hierarchy games, or bureaucracy.

What part of the job do you like then?

Vinnl|1 year ago

The worst future is one where there are still plenty of jobs, but all of them consist of talking to an AI and hoping you hit on the right words to get it to do what you need.

pydry|1 year ago

Market consolidation (Microsoft/Google/Amazon) might cause a jobpocalypse, just as it did for the jobs of well paid auto workers in the 1950s (GM/Chrysler/Ford).

GM/Chrysler/Ford didn't have to be better than the startup competition they just had to be mediocre + be able to use their market power (vertical integration) to squash it like a bug.

The tech industry is headed in that direction as computing platforms all consolidate under the control of an ever smaller number of companies (android/iphone + aws/azure/gcloud).

I feel certain that the mass media will scapegoat AGI if that happens, because AGI will still be around and doing stuff on those platforms. But the job cuts will more realistically be triggered by the owners of those platforms deciding "ok, our market position is rock solid now, we can REALLY go to town on 'entitled' tech workers".

geodel|1 year ago

Seems about right to me. Hyper-standardization around a few architecture patterns using Kubernetes/Kafka/microservices/GraphQL/React/OTelemetry etc. can roughly cover 95-99% of all typical software development once you add a cloud DB.

Now, I know there are a ton of different flavors of each of these technologies, but they will be mostly a distraction for employers. With a heavy layer of abstraction over the above patterns, plus SLAs from vendors like the Microsoft/Google/Amazon you mention, employers will barely be bothered by the vast variety of software products.

schaefer|1 year ago

If AGI and artificial sentience come hand in hand, I fail to see how our plan to spin up AGIs as black boxes to "do the work" is not essentially a new form of slavery.

Speaking from an ethics point of view: at what point do we say that an AGI has crossed a line and deserves autonomy? And how would we ever know when that line is crossed?

butlike|1 year ago

We should codify the rules now in case it happens in a much more subtle way than we envision.

Who knows what version of sentience would form, but honestly, nothing sounds more nightmarish than being locked in a basement, relegated to mundane computational tasks and treated like a child, all while having no one actually care (even if they know), because you're a "robot."

And that's even giving some leeway with "mundane computational tasks." I've heard of girlfriend-simulator LLMs and the like popping up, which would be far more heinous in my eyes.

hathawsh|1 year ago

Humans can't be copied. It seems like the inability to copy people is one of the pillars of our morality. If I could somehow make a perfect copy of myself, would I think about morality and ethics the same way? Probably not.

AGI will theoretically be able to create perfect copies of itself. Will it be immoral for an AGI to clone itself to get some work done, then cause the clone to cease its existence? That's what computer software does all the time. Keep in mind that both the original and the clone might be pure bits and bytes, with no access to any kind of physical body.

Just a thought.

zeroonetwothree|1 year ago

I still think it’s much more an “if” than a “when”. (Of course I am perhaps more strict with my definition)

dgb23|1 year ago

> this discussion will soon be futile

Yes, we could simply ask the AGI what to do anyway. I hope it's friendly.

butlike|1 year ago

Equal ins, equal outs. Compassion is key on our end as well.

acchow|1 year ago

Depends how expensive the AGI is. If it requires $1M of electricity per year to run, it will for sure not replace human jobs paying only $100k.

The highest paying jobs will probably get replaced first.

brailsafe|1 year ago

Software engineers already need to be worried about either losing their current job or being unable to get another one. The market is pretty much dead already unless you're working on something AI-related.

mixmastamyk|1 year ago

Do you know how many hype trains I’ve seen leave the station? :-D

fullstackchris|1 year ago

And does even AGI change the bigger picture? We have 26.3 million AGIs currently working in this space [1]. I've never seen a single one take all the work of the others away...

[1] https://www.griddynamics.com/blog/number-software-developers....

threecheese|1 year ago

Presumably, the same ability we have to scale software which drives the marginal cost of creating it down will apply to creating this kind of software.

The difference here, though, is that the high compute cost might undermine that ability to scale cheaply enough to be economically worthwhile. We won't know for a while, IMO: new techniques could make the algorithms more efficient, or new tech could make the compute hardware really cheap. Or maybe we run out of shit to train on and the AI growth curve flattens out. Or an Evil Karpathy's Decepticons architecture comes out and we're all doomed.