top | item 38325278

doctorM | 2 years ago

For reference, I think this probably partly explains my reluctance to use A.I. to help me code.

If I ask e.g. ChatGPT to just code something for me, then the code it outputs is a black box, and there is no 'theory usage' in the parlance of the article. [Or I guess I'd have to recover the theory from the code it writes.]

I've accepted by now that I'm putting myself at a disadvantage by not using A.I. at work, however. Maybe another way to think about it is that A.I. allows us to apply our higher-level theoretical understanding when we interact with a codebase.

nuancebydefault | 2 years ago

What seems to work quite well is this: you ask the AI to build something that fulfills your requirements. Then you ask questions about the implementation until you understand it in detail. Meanwhile, you ask it to improve portions of the code based on your insights and the results of trying it out. So the black box morphs into a mental model that the machine has helped you attain.
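The loop described above can be sketched roughly in code. This is only an illustration of the workflow, not a real integration: `ask` is a hypothetical stand-in for whatever chat-model call you use, stubbed out here so the sketch is self-contained.

```python
def ask(prompt: str) -> str:
    """Hypothetical stand-in for a chat-model call; a real version
    would send `prompt` to an API and return the model's reply."""
    return f"[model response to: {prompt}]"

# 1. Ask the AI to build something that meets your requirements.
code = ask("Write a function that parses ISO 8601 dates.")

# 2. Ask questions until you understand the implementation in detail.
explanation = ask(f"Explain how this code handles time zones:\n{code}")

# 3. Feed your insights and test results back as improvement requests,
#    repeating steps 2-3 until the black box becomes a mental model.
improved = ask(f"Refactor this to reject invalid month values:\n{code}")
```

The point of the sketch is the shape of the interaction: generation, interrogation, and revision are interleaved rather than one-shot.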

ckdarby | 2 years ago

ChatGPT provides a lot of that model through prompting: you can ask it for information about the "black box" it produced.