alpineman | 1 month ago
Imagine if, as a dev, someone came to you and told you everything that is wrong with your tech stack because they copy-pasted some console errors into ChatGPT. There's a reason doctors need to spend almost a decade in training to parse this kind of info. If you do the above, please do it with respect for their profession.
FeteCommuniste | 1 month ago
My wife is a lawyer and sees the same thing at her job. People "writing" briefs or doing legal "research" with GPT and then insisting that their document must be right because the magic AI box produced it.
tripledry | 1 month ago
When reading news stories on topics you know well, you notice inaccuracies or poor reporting, but then immediately forget that lesson when reading the next article, on a topic you are not familiar with.
It's very similar to what happens with AI.
mr_toad | 1 month ago
“A little knowledge is a dangerous thing” is not new, it’s a quote/observation that goes back hundreds of years.
> Imagine if as a dev someone came to you and told you everything that is wrong with your tech stack because they copy pasted some console errors into ChatGPT.
You mean the PHB? They don’t need ChatGPT for that, they can cite Gartner.