potkin | 1 year ago
I guess on HN we're all relatively pro-science. But the world would be a better place if we recognised that our scientific knowledge in some areas is poorer than we like to pretend.
I started feeling that way when I worked alongside some "Evidence-Based Medicine" advocates. Years later I've landed in data science, and the standards of statistical analysis and understanding I see, especially in the biological sciences, have only made me more sceptical.
Way back in 2007 the BMJ as part of its Clinical Evidence project published its systematic research into standards of evidence in support of common medical treatments. Some 2500 treatments were evaluated to determine whether they are supported by sufficient reliable evidence.
• 13% were found to be beneficial.
• 23% were likely to be beneficial.
• 8% were as likely to be harmful as beneficial.
• 6% were unlikely to be beneficial.
• 4% were likely to be harmful or ineffective.
• 46% were of unknown efficacy or harm.
It's quite hard to find the original Clinical Evidence project resources (might be a job for the wayback machine) but you can find it referenced all over, e.g. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2071976/ .
In the 1970s the US Office of Technology Assessment conducted a similar evaluation of medical treatments' efficacy and found that only 10% to 20% of medical treatments had evidence of efficacy. I would love to see more recent research in this vein.
There are clearly many complications and caveats around all this. Not all "common treatments" are easily studied -- the gold standard of the RCT is not always feasible or ethical. And of course absence of evidence is not evidence of absence, etc... but it sometimes feels like we should be a bit more humble about even our best science.