PeterWhittaker | 1 day ago
> Software used to be deterministic
Ah, someone fortunate enough to have never coded a heisenbug or tripped over UB of various kinds.
I've written plenty of well-structured, well-thought-out, mostly-deterministic software, then spent hours or days figuring out what oversight summoned the gremlins.
(There is one low-priority bug I've occasionally returned to over the last two to three years, in case experience and back-burner musing might yield insight. Nope. Use gcc, no bug; use clang, bug, always, regardless of optimization level, debug level, etc. Everything else, all of it far more complex, works 100% reliably; it's just that one display update that fails.)
(It occurs to me that that is a bad example, because it IS deterministic, but none of us can pinpoint the "determiner".)
grayhatter | 1 day ago
Assuming you're not tripping over some hardware defect, it sounds like you're using a gcc hack that llvm doesn't support.
For a display update, it sounds like memory ordering.
PeterWhittaker | 1 day ago
Once these bugs were fixed, things became deterministic, but to say that all software is deterministic is to assert a level of programming, build, and operational consistency that is often achievable only with great effort.
Re gcc hacks: nope. No gcc-isms anywhere in the code, all warnings enabled, no warnings produced; just one case where a field is not updated under one very specific set of circumstances. Thanks for the suggestion, but that was one of the first things we thought of. There is a slight chance it is actually a clang/llvm call-stack-depth bug, but the effort to reproduce that outweighs the impact of the bug, what with one thing and another not relevant here.
UB -> occasional non-determinism.