It's not great for blindly generating code to use, but it's great for asking for help - debugging, planning, etc. I ask it a lot of 'dumb' questions and I feel like I've learnt a lot more than I would have in the past. It's given me code that doesn't work every now and then, but I've yet to find an instance of it explaining a concept wrong.
waste_monk|1 year ago
How do you know for sure? LLM output is often plausible-sounding but incorrect - usually it's fairly obvious, but it can be subtle enough that I would not suggest using it until you've learned the old-fashioned way and can better judge whether the LLM is wrong.
ipaddr|1 year ago