Isn't one way of solving the problem to use all the tools at your disposal? At the end of the day, isn't working code the fundamental goal?
I guess you could argue that the code needs to be efficient, stable, and secure. But if you could use "AI" to get partway there, then use smarts to finish it off, isn't that reasonable? (Devil's advocate.)
The other big question is the legality of using AI-generated code in a final commercial product.
dahart|1 year ago
Keep in mind that the amount of time you spend in a real job solving clear, easy, interview-style problems that an LLM can answer is tiny to none. Jobs are most often about juggling priorities and working with other people under changing conditions, stuff Claude and ChatGPT can't really help you with. Your personality is far more important to your job success than your GPT skills, and that's what interviewers want to see: your personality and behavior when you don't know the right answer, not ChatGPT's personality.