kayo_20211030 | 19 days ago
Fortran made that distinction clear. The compiler handled the accidental complexity of converting high-level statements into machine code, but it never really obscured the boundary.
Take VB as an example from way back. For the purposes of presenting a simple data-entry dialog, it removed the accidental complexity of dealing with Windows' message loop, resource files, etc., which was painful. The essential complexity was in what the system did with the data. I suppose the AI steering that needs to happen is to direct the essential down the essential path and the accidental down the accidental path, and let a dev handle the former and the agent handle the latter (after all, it's accidental).
But that'll take judgement: deciding which camp each artifact belongs in, and how it's managed. It might be a whole field of study, but it won't be a new one.
conartist6 | 19 days ago
If you give up on doing the work necessary to understand what is and is not critically important, you are no longer competent or responsible.
At that point the roles have switched and you are the mindless drone, toiling to serve AI.
https://strangestloop.io/essays/things-that-arent-doing-the-...
kayo_20211030 | 19 days ago
I don't agree with that. If I want to add two numbers I'd like to write `a = b + c`. I do not want to write the machine code that effects the same result on whatever computer architecture I'm targeting. Precisely _how_ one adds two numbers is accidental complexity. Whether they need to be added, and what numbers should be added, is essential complexity.
Fortran removed that accidental complexity and left the essential stuff in place. There were no fuzzy lines.