cm3 | 9 years ago
That said, our weather models are pretty good, but not good enough to make confident predictions that far into the future; they can, however, for the next few hours.
It's like a software company's model of code branches. The Apple/Google/whatever filesystem team works on something; it gets pushed into their level of production branch, then it percolates up to the shared production kernel branch, and after a couple more layers it hits the common branch, which is what public production binaries are made from and consists of kernel, userland, and foobar modules all merged together. Not all software shops operate this way, but it's what a project can demand once it reaches a certain size. The Linux kernel works this way too, to name a successful non-commercial project.

You can argue this doesn't prevent regressions, and that's true, but it's hard to deny there would be more regressions (i.e. false reporting) with unfiltered (i.e. unvetted) reporting.
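The promotion scheme described above can be sketched as a toy model (this is not any vendor's real system; the branch names and the `Branch` class are hypothetical, purely for illustration): a change lands on a team branch and then percolates up through successively broader branches until it reaches the common branch that releases are built from.

```python
# Toy model of staged branch promotion. Hypothetical branch names;
# real systems (git, Linux subsystem trees) are far richer than this.

class Branch:
    def __init__(self, name):
        self.name = name
        self.commits = []          # ordered list of change descriptions

    def commit(self, change):
        self.commits.append(change)

    def merge_from(self, other):
        # pull in any commits the lower-level branch has that we lack
        for c in other.commits:
            if c not in self.commits:
                self.commits.append(c)

# Hierarchy mirroring the comment: team -> shared kernel -> common
team_fs = Branch("team/filesystem")
prod_kernel = Branch("prod/kernel")
common = Branch("common")          # public binaries are built from here

team_fs.commit("fs: fix rename race")
prod_kernel.merge_from(team_fs)    # first promotion step
common.merge_from(prod_kernel)     # change reaches the release branch

print(common.commits)              # → ['fs: fix rename race']
```

Each `merge_from` is a vetting point: a change that fails review at any level simply never propagates higher, which is where the filtering effect comes from.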
robbrown451 | 9 years ago
I still don't believe the term "100% certain" is meaningful. Maybe if they were to put a label on certain facts: "This fact is considered by our editorial board to be 93% certain." And maybe have a chart, so that figure can change over time.
I think there are better ways, and that there should be accountability. I'm just not in favor of black-and-white terms for concepts that, to me, are purely shades of gray.