temphn | 12 years ago
Over the last decades, numerous researchers have painstakingly collected, analyzed, dated, and calibrated many data series that allow us to reconstruct climate before the age of direct measurements. Such data come, for example, from sediment cores drilled in the deep sea, from corals, ice cores, and other sources. Shaun Marcott and colleagues have for the first time assembled 73 such data sets from around the world into a global temperature reconstruction for the Holocene, published in Science. Or, strictly speaking, many such reconstructions: they tried about twenty different averaging methods and also carried out 1,000 Monte Carlo simulations, adding random errors to the dating of the individual data series, to demonstrate the robustness of their results.
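The Monte Carlo step can be sketched in a few lines. This is a minimal illustration, not the paper's actual method: the proxy series are made up, and the 150-year dating uncertainty and 100-year output grid are assumed values chosen only for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in proxies: each is a set of (age, temperature
# anomaly) points on its own irregular time grid, in years before present.
n_proxies = 5
proxies = []
for _ in range(n_proxies):
    ages = np.sort(rng.uniform(0, 11000, size=40))
    temps = 0.5 * np.sin(ages / 2000.0) + rng.normal(0, 0.2, size=40)
    proxies.append((ages, temps))

grid = np.arange(0, 11001, 100)   # common output grid (100-yr steps)
date_sigma = 150.0                # assumed dating uncertainty in years

n_mc = 1000
stacks = np.empty((n_mc, grid.size))
for i in range(n_mc):
    rows = []
    for ages, temps in proxies:
        # Perturb each series' age model, then resample onto the grid.
        jittered = np.sort(ages + rng.normal(0, date_sigma, size=ages.size))
        rows.append(np.interp(grid, jittered, temps))
    stacks[i] = np.mean(rows, axis=0)   # one simple global stack per draw

mean = stacks.mean(axis=0)
sigma = stacks.std(axis=0)   # spread attributable to dating errors alone
```

The spread of the 1,000 stacks (`sigma`) then shows how sensitive the reconstruction is to chronological uncertainty in the underlying records.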
Marcott et al.'s graphic implies that their model reflects historical temperatures so accurately that it can measure the average temperature of the entire globe continuously back to 8,000 years ago, to within a small fraction of 1 degree Celsius (i.e. the 1-sigma error bars). That is simply an extraordinary claim, given:
1) The "divergence problem". The lack of correlation between model inputs like tree rings and the instrumental record over the last few decades is acknowledged by all; climate scientists generally attribute it to anthropogenic factors, arguably begging the question.
http://www.skepticalscience.com/Tree-ring-proxies-divergence...
2) Serious climate model prediction failures over the past 10-year period, as acknowledged in Nature:
http://www.nature.com/news/climate-change-the-forecast-for-2...
In other words, key model inputs used in climate reconstructions do not strongly correlate with the instrumental record over the last 30-40 years ("the divergence problem") and climate models have so far had a poor track record over the last 15 years, with average temperatures winding up below the envelope of model predictions. These predictive failures in the datasets we can check bode ill for the prospect of hindcasting global average temperatures to within 1 degree more than 8000 years ago.
asgard1024 | 12 years ago
I also think you misrepresent the second article. The question is what you are trying to predict, and on what time scale. I believe they are talking about more precise, shorter-term models. Just as we can predict winter without predicting the weather, we can predict warming due to human forcing without predicting the specific details.
In the end, however, it's completely irrelevant to AGW whether or not there was a higher temperature in the past. The theory of AGW doesn't stand on that argument alone (nor on any other single argument, for that matter).
saalweachter | 12 years ago
One is that this is based on other people's work, as the GP mentioned. Updates to those works will affect this one; the hope is that those 73 underlying datasets aren't systematically biased and any errors will cancel out. Another is that those error bars are probably 95% confidence. So we fully expect that about 550 out of the 11,000 years in this chart will fall outside of that range. A final bit is that we're only reconstructing averages here; it's a lot easier to guess the average number of shoes owned by 1000 people than the exact number of shoes owned by 1.
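The averages-versus-individuals point is easy to demonstrate numerically. The sketch below uses invented shoe-ownership counts for 1,000 people (the distribution is an arbitrary choice for illustration) and compares the spread you face when guessing one person's count with the standard error of the group average, which shrinks by a factor of sqrt(N):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical shoe-ownership counts for a population of 1000 people.
shoes = rng.poisson(3, size=1000)

# Guessing one person's count: you face the full population spread.
individual_spread = shoes.std()

# Guessing the average over all 1000: the uncertainty shrinks by sqrt(N).
standard_error = shoes.std() / np.sqrt(shoes.size)

print(individual_spread, standard_error)
```

With N = 1000, the average can be pinned down roughly 30 times more tightly than any individual value, which is why reconstructing a global mean is a far easier target than reconstructing any single location's temperature.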
These extra caveats don't invalidate the data or render it useless, they just qualify it. You shouldn't look at the graph and think, "Here is the exact temperature for the last 11,000 years", you should think, "Given our current best understanding of the available data, the average global temperature for 10,450 out of the last 11,000 years probably fell into this 0.4 degree C range."