by Dr J. Lehr & T. Ciccone, Jan 5, 2021 in ClimateChangeDispatch
The accuracy and integrity of weather and climate measurements have always been a concern. However, errors and omissions were not as consequential in the past as they are now.
A hundred or even fifty years ago, our major weather concerns were mostly local. Today, a traveler flying from NYC to LAX needs more detailed and reliable information, such as whether it is snowing in St. Louis, where the flight has a layover.
Or consider the farmer in Nebraska who watches the spring wheat production forecast in Ukraine: he needs the best possible information to estimate how many acres of winter wheat to plant for today’s global markets.
We especially need better and more reliable information to decide what actions we should take in preparing for climate change.
While scientists, engineers, and software programmers understand the importance of data accuracy, the general public is largely unaware of how challenging achieving it can be.
When looking at long-term climate data, we may have to use multiple proxies (indirect measures that we hope vary directly with climate), which add an extra layer of complexity, cost, and potential error.
Among the most commonly used proxies are ancient temperature and CO2 levels derived from ice core samples. For the last few hundred years, tree-ring data has also been a primary source of annual temperatures, as sketched below.
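To see where that extra layer of error comes from, here is a minimal sketch in Python of how a tree-ring record might be calibrated against instrumental temperatures and then used to reconstruct earlier values. All numbers, coefficients, and ring widths here are synthetic and purely hypothetical; real reconstructions use far more sophisticated statistics, but the basic point stands: every reconstructed value inherits the calibration's residual error on top of the measurement error in the proxy itself.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Synthetic "instrumental era" overlap period ---
# Hypothetical measured temperatures (deg C) and ring widths (mm) that
# track them imperfectly: width = a + b * temperature + noise.
true_temp = 12.0 + rng.normal(0.0, 1.0, size=100)
ring_width = 0.8 + 0.15 * true_temp + rng.normal(0.0, 0.1, size=100)

# --- Calibration: least-squares fit of temperature on ring width ---
slope, intercept = np.polyfit(ring_width, true_temp, deg=1)
predicted = slope * ring_width + intercept
residual_sd = np.std(true_temp - predicted, ddof=2)  # calibration error

print(f"calibration: T = {slope:.2f} * width + {intercept:.2f}")
print(f"residual std. error: {residual_sd:.2f} deg C")

# --- Reconstruction: apply the fit to pre-instrumental ring widths ---
# Each reconstructed temperature carries the residual error above, in
# addition to any error in measuring the widths -- the "extra layer."
old_widths = np.array([2.45, 2.60, 2.38, 2.71])  # hypothetical samples
reconstructed = slope * old_widths + intercept
for w, t in zip(old_widths, reconstructed):
    print(f"width {w:.2f} mm -> {t:.1f} +/- {residual_sd:.1f} deg C")
```

Even in this idealized toy setup, the reconstructed temperatures come with an uncertainty band; with real proxies, dating errors, site effects, and non-climate influences on growth widen that band further.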