top | item 41991531

abadpoli | 1 year ago

This presumes that it will be real humans that have to “take care” of the code later.

A lot of the people that are hawking AI, especially in management, are chasing a future where there are no humans, because AI writes the code and maintains the code, no pesky expensive humans needed. And AI won’t object to things like bad code style or low quality code.

tivert | 1 year ago

Well, that will work great if you let the AI decide whether the code is working or not.

User: This is calculating the result wrong.

AI: CLOSED WONTFIX: WORKING AS DESIGNED.

DeathArrow | 1 year ago

>AI writes the code

AI will never write proper code unless guided by someone who knows how to code properly and how to translate business needs into code.

TheNewsIsHere | 1 year ago

> [...] business needs into code.

I think this is where we lose a lot of developers. My experience has been that this is a skill set that isn't as common as you'd hope, and it requires some experience outside of developing software as its own act. In other words, this doesn't seem to be a skill that comes naturally to developers who haven't had (or availed themselves of) the opportunity to do the business-analyst and requirements-gathering style of work that goes into translating business needs into software outcomes.

Many developers are isolated (or isolate themselves) from the business side of the business. That makes it very difficult for them to translate those needs themselves. They may be unaware of, and not understand, for example, why you'd want to design an accounting feature in a SaaS application in a certain way to meet a financial accounting need.

On the flip side, non-technical management tends to underestimate and undervalue technical expertise either by ego or by naïveté. One of my grandmothers used to wonder how I could “make any money by playing on the computer all day,” when what she saw as play was actually work. Not video games, mind you. She saw computers as a means to an end, in her case entertainment or socializing. Highly skilled clients of mine, like physicians, while curious, are often bewildered that there are sometimes technical or policy limitations that don’t match their expectations and make their request untenable.

When we talk about something like an LLM, it simply doesn't possess the ability to reason, which is precisely what that kind of work requires.

emptiestplace | 1 year ago

Are you familiar with the idea of consciousness as an emergent property?

netdevnet | 1 year ago

You know this future isn't happening anytime soon. Certainly not in the next 100 years. Until then, humans will be taking care of it, and no one will want to work at a place maintaining some Frankensteinian codebase produced by an LLM. And even when humans are only working on 5% of the codebase, that will likely be the most critical bit, and it will have the same problems regarding staff recruitment and retention.

jkestner | 1 year ago

All you've got to do is write the unit tests and let the AI evolve the code, right??