Release engineer here. That's an excellent question, and we worry about it all the time. The AI "seems" authoritative, but it can't even add 1+1 sometimes :crying-emoji:. We've tried to engineer the prompts and tooling so that it will say "I don't know" when it doesn't know. But we've still seen it say some crazy things, like "Your cluster is fine" when it clearly wasn't. :tongue-sticking-out-emoji: I guess the only real answer is you have to trust but verify.
JimDabell|2 years ago
It’s difficult to take you seriously when you write like this about show-stopping bugs.
frankohn|2 years ago
It's crazy how they think a system with no feedback loop can always be accurate. Only pure mathematics can work like that; any AI-like system needs to have a feedback loop.