I think you've made a fundamental mistake. Whether or not someone understands software is not determined by their job title or their desires. Nor does typing to an anthropomorphized language model expand one's understanding of software. It may supply material that helps someone learn, but learning is a change in behavior as a result of experience. You must fail at something in order to prevail. Using LLMs to work around failures, without understanding how those failures occurred and why they were possible, will not produce learning; it will instead reinforce the same behavior: asking an LLM. Same behavior, same result.