9dev|1 day ago
Given all that, I just cannot ignore AI as a development tool. There is no good justification I can give the rest of the company for why we would not incorporate AI tools into our workflows, and this also means I cannot leave it up to individual developers on whether they want to use AI or not.
This pains me a lot: on the one hand, it feels irresponsible to the junior developers and their education to let them outsource thinking; on the other hand, we're not a charity but a company that needs to make money. Also, many of us (me included) got into this career for the joy of creating. Nobody anticipated this could stop being part of the deal, but here we are.
ajshahH|1 day ago
Is there definitive proof of long-term productivity gains with no detriment to defects, future velocity, etc.?
If not, I'd say you're irresponsible at best to put this much trust in a tool that's been around for a few months (at the current level). Absolutely encourage experimentation, but there's a trillion-dollar marketing hype machine in overdrive right now. Your job is to remind people of that.
wrs|1 day ago
I’m sure you’re smarter than that, but a lot of leaders aren’t. And that’s based on the past, when they had an established playbook they could choose to follow, not the situation we’re in now where you have to make it up as you go.
throwaway346434|1 day ago
Consider what a job with no joy means for the ongoing mental health of your staff, when the main interaction they have all day is with an AI model they have to boss around, with little training on norms. Depression, frustration, nonchalance, isolation, and corner-cutting are the likely responses.
So at the same time as you introduce the new tooling, introduce the quality controls you would expect for someone utterly checked out of the process, and the HR policies and preventive measures to keep your team from speedrunning Godwin's law because they don't deal with people enough to remember that social niceties matter.
Examples off the top of my head of ways to do this:
- Increased socialisation in the design process. Mandatory fun sucks, but a whiteboard party and collaboration will bring some creativity and shared ownership.
- Budget for AI-minimal or AI-free periods, where the intent is to do a chunk of work "the hard way", and have people share what they experienced or learnt.
- Make people test each other's work (manual testing) or collaborate; otherwise you will end up with a dysfunctional team that reaches for "yell in all caps to make sure the prompt sticks" as the way people talk to each other and deal with conflict.
The way to justify this to the management above you is the cost of staff retention: advertise, interview, hire, pay market rates, equip, train, then six months later securely offboard, recover hardware, and run the exit interview. That means you get maybe four months of productivity out of each person, and pay two months of salary in early-job mistakes, late-job not caring, or HR debacles. Do you, or your next level up, want to spend 30% more time on this process? Or would you rather focus on generating revenue with a team that works well together and is on board for the long term?
The answer most of the time is "we want to make money, not spend it". So do the math on what staff replacement costs are and then argue for building in enough slack to the process that it costs about half of that to maintain it/train the staff/etc.
Your company is now making a "50% efficiency gain" in the HR funnel, year over year, all by simply... not turning the dial up to 10 on forced AI usage.
Framed like that, sounds a lot better doesn't it?
9dev|1 day ago
Having said that: The dichotomy expressed in the threads here is a bit too extreme for my taste. It's not like working with AI is pure Yes-clicking review dread; there is joy to be found in materialising your ideas out of thin air, instead of the Lego-like puzzle solving experience many developers are used to.
And as mentioned in TFA, there's risk in both using it too little and too much. This also applies to employees, of course: if I shielded junior developers from AI tools, they'd end up in their next job utterly unprepared for what may be required of them as the world keeps spinning.
> Framed like that, sounds a lot better doesn't it?
Sure does, but that's not the situation I'm in. I'm trying to find the local maximum between keeping my company afloat in a world where AI has kicked the PMF out from under our feet to the other end of the playing field, and ensuring my team stays happy, curious, and engaged. And I'm not the only one in this spot, I suppose.
lstodd|1 day ago
And it is. You are going to end up with a wreck of a product and not a single person you can call upon to fix it. It is your choice and you will pay for it.
slopinthebag|1 day ago
The concern as well is that by forcing AI onto developers, they eventually throw their hands up, say "well, they don't care about code quality anymore, so neither should I", and start shipping absolute vibeslop.
9dev|1 day ago
It's not that the team isn't functioning, it's that it's a pretty diverse team in terms of experience, which means things just used to take a while to finish.
> The concern as well is that by forcing AI onto developers, they eventually throw their hands up, say "well, they don't care about code quality anymore, so neither should I", and start shipping absolute vibeslop.
This is, IMHO, avoidable by emphasising code reviews and automated tooling; my general policy is still that everyone is responsible for what they push, period. So absolute vibeslop isn't what I'm seeing; rather, an efficiency miscalculation about which parts should be written by humans and which by the AI.
nz|1 day ago
I know, from second-hand experience, that long before coding LLMs became a thing, engineers would ship slop when it became clear that their superiors cared about deadlines über alles (i.e. not shipping slop would be the same as quitting, but without the paycheck; slop code is often a form of quiet quitting).
Most people would _prefer_ to be able to "program" their entire business from a spreadsheet. LLMs have enabled them to get involved, and they cannot understand why engineers reject this "help" (it is for the same reason that a pilot would reject a copilot that thinks he knows how to fly because he played a flight simulator or read Jonathan Livingston Seagull; flight simulators are used in training too, but they are not a substitute for actual piloting experience). This refusal and resistance feeds into the mistrust and resentment. We live in a world where managers and administrators do not understand what they are managing and administrating, nor do they think that this is part of their job description. In the worst cases, they believe their job is to extract compliance from their subordinates.
There is a _lot_ of alpha in being part of a company, where authorities understand how the internals of the business (including software and IT!) _actually_ function. (One engineer told me that clueless yet demanding managers are, for all intents and purposes, unwitting saboteurs, and that the best a company can do about this is get him a job interview at a competitor). In some sense, the economy is just a machine for transferring wealth from those who do not know something essential, to those who do know something essential. This can veer uncomfortably close to exploitation. If we want to avoid crossing that line, we need to cultivate an economy where a lack of understanding is not seen as an _opportunity for profit_, but rather _as an opportunity for illumination_.