by Dr J. Lehr & T. Ciccone, Jan 5, 2021 in ClimateChangeDispatch
The accuracy and integrity of weather and climate measurements have always been a concern. However, errors and omissions were not as consequential in the past as they are now.
A hundred or even fifty years ago, our major weather concerns were mostly local. Today, a traveler flying from NYC to LAX needs more detailed and reliable weather information, such as whether it is snowing in St. Louis, where the flight has a layover.
Or consider the farmer in Nebraska who needs the spring wheat production forecast for Ukraine: in today's global markets, he needs the best possible information to estimate how many acres of winter wheat he should plant.
We especially need better and more reliable information to decide what actions we should take to prepare for climate changes.
While scientists, engineers, and software programmers know the importance and the need for this data accuracy, the general public is not aware of how challenging these tasks can be.
When looking at long-term climate data, we may have to use multiple proxies (indirect measures that we hope vary directly with weather), which adds extra layers of complexity, cost, and sources of error.
One of the most commonly used proxies is ancient temperature and CO2 levels inferred from ice-core samples. For the last few hundred years, tree-ring data has also been a primary source of annual temperatures.
by Steve McIntyre, June 30, 2014 in ClimateAudit
Nearly all of the text of this article on an interesting ice core proxy series (James Ross Island) from the Antarctic Peninsula was written in June 2014, but not finished at the time for reasons that I don’t recall. This proxy was one of 16 proxy series in the Kaufman 12K 60-90S reconstruction.
I originally drafted the article because it seemed to me that the then-new James Ross Island isotope series exemplified many features of a “good” proxy according to ex ante criteria that I had loosely formulated from time to time in critiquing “bad” proxies, but never really codified (in large part because it’s not easy to codify criteria except through handling data).
Although this series is in the Kaufman 60-90S reconstruction, its appearance is quite different from the final 60-90S reconstruction: indeed, it has a strongly negative correlation (-0.61) to Kaufman’s final CPS reconstruction. I’ll discuss that in a different article.
Following are mostly my 2014 notes, with some minor updating for context.
A principle I’ve articulated with increasing clarity over the years (though it is present in my early work as well) is that one needs to work outward from proxies that are “good” according to some ex ante criteria, rather than place hope in a complicated multivariate algorithm applied to inconsistent and noisy data, not all of which are “proxies” for the quantity being reconstructed. This is based on principles that I’ve observed geophysicists and geologists use to combine “good” (high-resolution) data with lower-quality data.
by P. Homewood, August, 1, 2019 in NotaLotofPeopleKnowThat
Some ancient history
Fifteen to twenty years ago, Michael Mann and colleagues wrote a few papers claiming that current warming was unprecedented over the last 600 to 2,000 years. Other climate scientists described Mann’s work variously as crap, pathetic, and sloppy. These papers caught the interest of Stephen McIntyre, which led to the creation of his Climate Audit blog and the publication of papers pointing out the flaws in these hockey-stick reconstructions. In particular, McIntyre and his co-author Ross McKitrick showed that the method used by Mann and colleagues shifted the data in such a way that any data set showing an upward trend in the 20th century received a stronger weighting in the final reconstruction. With this method, generation of a hockey-stick shape in the temperature reconstruction was virtually guaranteed, which M&M demonstrated by feeding random numbers into the method.
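The short-centering effect M&M described can be sketched in a few lines. This is a hypothetical illustration, not Mann's actual code: the series count, AR(1) coefficient, and 80-"year" calibration window are all invented for the example. The key point is that subtracting the mean of only the calibration window (instead of the full-period mean) before extracting the first principal component rewards any series whose late-period mean happens to drift away from its long-term mean.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_series, cal = 600, 50, 80  # hypothetical sizes; calibration = last 80 "years"

# AR(1) red noise as stand-in "proxy" series with no real climate signal
x = np.zeros((n_years, n_series))
for t in range(1, n_years):
    x[t] = 0.9 * x[t - 1] + rng.standard_normal(n_series)

def pc1(data, center_window):
    """First principal component after subtracting the mean over center_window only."""
    centered = data - data[center_window].mean(axis=0)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    return u[:, 0] * s[0], vt[0]  # PC1 time series, loadings per input series

# Short-centering (mean over calibration window only) vs. full-period centering
pc_short, w_short = pc1(x, slice(n_years - cal, n_years))
pc_full, w_full = pc1(x, slice(0, n_years))

# Short-centering weights each series by how far its calibration-period mean
# departs from its long-term mean, so PC1 tends toward a hockey-stick shape
# even though the inputs are pure noise.
offset = np.abs(x[-cal:].mean(axis=0) - x.mean(axis=0))
```

Comparing `np.abs(w_short)` against `offset` for a given run shows the loading pattern tracking the calibration-window offsets, which is the mechanism behind the random-numbers demonstration.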
by Alex Barral et al., 2017 (U. Lyon-CNRS)
Comparing the fluctuations of atmospheric CO2 traced from these estimates with curves of temperature change revealed sharp drops in atmospheric CO2 (200-300 ppm), coupled with sharp rises in global mean surface temperature (5-8°C), on the scale of a few million years.
by Prof. Quansheng Ge, August 8, 2017 in ClimateChangeDispatch
Prof. Quansheng Ge and his group from the Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, collected a large number of proxies and reconstructed a 2,000-year temperature series in China with a 10-year resolution, enabling them to quantitatively reveal the characteristics of temperature change in China over the Common Era.
See also here
by Javier, July 11, 2017 in ClimatEtc.
In our attempt to better understand the nature of our planet’s abrupt climate changes, I have already reviewed the glacial-interglacial cycle and the Dansgaard-Oeschger cycles that take place during glacial periods. I now start reviewing the millennial climate cycles that abruptly impact the slowly changing Holocene climate. The most significant and regular one is the ~2400-year Bray cycle.
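How a multi-millennial periodicity like the ~2400-year Bray cycle shows up in a record can be illustrated with a simple periodogram on synthetic data. This is not Javier's analysis; the 10-year sampling step, 12,000-year span, and noise level are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 10                      # sample spacing in years (assumed resolution)
t = np.arange(0, 12000, dt)  # a Holocene-length synthetic record

# A ~2400-year cosine buried in white noise stands in for a cyclic climate signal
signal = np.cos(2 * np.pi * t / 2400) + 0.5 * rng.standard_normal(t.size)

freqs = np.fft.rfftfreq(t.size, d=dt)                  # cycles per year
power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2

peak = freqs[1:][np.argmax(power[1:])]                 # skip the zero-frequency bin
print(f"dominant period ≈ {1 / peak:.0f} years")       # recovers the 2400-year cycle
```

Real proxy records are unevenly sampled and much noisier, so actual cycle detection typically uses methods suited to irregular series, but the principle of looking for a dominant spectral peak is the same.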