Keyword archive: Mathematics

Durable Original Measurement Uncertainty

by Kip Hansen, October 14, 2017 in WUWT


Temperature and water level (mean sea level, MSL) are two hot-topic measurements being widely bandied about, and vast sums of money are being invested in research to determine whether, on a global scale, these physical quantities — Global Average Temperature and Global Mean Sea Level — are changing, and if so, at what magnitude and at what rate. The global averages of these ever-changing, continuous variables are said to be calculated to extremely precise levels — hundredths of a degree for temperature and millimeters for sea level — and minute changes on those scales are claimed to be significant and important.

Statistical link between external climate forcings and modes of ocean variability

by Abdul Malik et al., July 31, 2017, Climate Dynamics, Springer


In this study we investigate the statistical link between external climate forcings and modes of ocean variability on inter-annual (3-year) to centennial (100-year) timescales using a de-trended semi-partial cross-correlation analysis technique. To investigate this link we employ observations (AD 1854–1999), climate proxies (AD 1600–1999), and coupled Atmosphere-Ocean-Chemistry Climate Model simulations with SOCOL-MPIOM (AD 1600–1999). We find robust statistical evidence that the Atlantic multi-decadal oscillation (AMO) has an intrinsic positive correlation with solar activity in all datasets employed. The strength of the relationship between AMO and solar activity is modulated by volcanic eruptions and complex interactions among modes of ocean variability.
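The abstract's core tool — a semi-partial correlation between de-trended series, where a third variable is regressed out of one series only — can be illustrated with a minimal sketch. All data here are synthetic and the helper names (`detrend`, `semi_partial_corr`) are my own; the paper's actual methodology is far more elaborate.

```python
import numpy as np

def detrend(x):
    """Remove a least-squares linear trend from a series."""
    t = np.arange(len(x))
    slope, intercept = np.polyfit(t, x, 1)
    return x - (slope * t + intercept)

def semi_partial_corr(x, y, z):
    """Correlation of x with the part of y not explained by z.
    z is regressed out of y only (semi-partial), not out of x."""
    zc = z - z.mean()
    beta = np.dot(zc, y - y.mean()) / np.dot(zc, zc)
    y_resid = (y - y.mean()) - beta * zc
    return np.corrcoef(x, y_resid)[0, 1]

# Synthetic stand-ins: a solar-activity proxy, a volcanic-forcing
# series, and an AMO-like index built from both plus noise.
rng = np.random.default_rng(0)
n = 400
solar = np.sin(np.linspace(0, 40, n)) + 0.3 * rng.standard_normal(n)
volcanic = rng.standard_normal(n)
amo = 0.5 * solar + 0.4 * volcanic + 0.3 * rng.standard_normal(n)

# Link between solar and AMO after removing the volcanic influence from AMO.
r = semi_partial_corr(detrend(solar), detrend(amo), detrend(volcanic))
print(r)
```

Because the synthetic AMO series was built with a genuine solar component, the semi-partial correlation stays strongly positive after the volcanic term is removed.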

On climate change, the uncertainties multiply — literally.

by Michael Bernstam, July 3, 2017 in GWPF


The following four stipulations must each be highly probable: 

1. Global warming will accumulate at 0.12 degrees Celsius or higher per decade.

2. It is anthropogenic, due largely to carbon dioxide emissions.

3. The net effect is harmful to human well-being in the long run.

4. Preventive measures are efficient, that is, feasible at costs not exceeding the benefits.

But even if the probability of each of these stipulations is as high as 85 percent, their compound probability is as low as 50 percent. This makes a decision to act or not to act on climate change equivalent to flipping a coin.
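The arithmetic behind the coin-flip claim is simply the multiplication rule for independent probabilities, sketched here with the article's own 85 percent figure:

```python
# Compound probability of four independent stipulations,
# each assumed to hold with probability 0.85 (the article's figure).
p_each = 0.85
n_stipulations = 4
p_all = p_each ** n_stipulations
print(round(p_all, 3))  # 0.522 — roughly a coin flip
```

Note the implicit assumption that the four stipulations are statistically independent; if they are correlated, the compound probability differs.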

The Laws of Averages: Part 2, A Beam of Darkness

by Kip Hansen, June 19, 2017 in WUWT


Both the word and the concept "average" are subject to a great deal of confusion and misunderstanding among the general public, and both have seen an overwhelming amount of "loose usage" even in scientific circles, not excluding peer-reviewed journal articles and scientific press releases. For that reason I gave a refresher on averages in Part 1 of this series. If your maths or science background is near the great American average, I suggest you take a quick look at the primer in Part 1 before reading here.

The Meaning and Utility of Averages as it Applies to Climate

by Clyde Spencer, April 23, 2017


By convention, climate is usually defined as the average of meteorological parameters over a period of 30 years. How can we use the available temperature data, intended for weather monitoring and forecasting, to characterize climate? The approach currently used is to calculate the arithmetic mean for an arbitrary base period and subtract it from modern temperatures (either individual temperatures or averages) to determine what is called an anomaly. However, just what does it mean to collect all the temperature data and calculate the mean?
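The anomaly calculation described above reduces to a one-line subtraction. A minimal sketch with invented numbers (the values and series lengths are illustrative only, standing in for a real 30-year base period):

```python
import numpy as np

# Hypothetical temperatures in °C; base_period stands in for a
# 30-year climatological base period (e.g. 1961-1990 monthly data).
base_period = np.array([14.1, 14.3, 13.9, 14.0, 14.2])
modern = np.array([14.6, 14.8, 14.5])

baseline = base_period.mean()   # arithmetic mean of the base period
anomalies = modern - baseline   # modern minus baseline = anomaly
print(anomalies)
```

Everything downstream — trend fitting, "warmest year" rankings — operates on these anomalies rather than on absolute temperatures, which is exactly why the author asks what the underlying mean actually represents.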

Are Claimed Global Record-Temperatures Valid?

by Clyde Spencer, April 12, 2017


In summary, there are numerous data-handling practices, generally ignored by climatologists, that seriously compromise the veracity of the claims of record average temperatures, and are reflective of poor science. The statistical significance of temperature differences with 3 or even 2 significant figures to the right of the decimal point is highly questionable. One is not justified in using the approach of calculating the Standard Error of the Mean to improve precision, by removing random errors, because there is no fixed, single value that random errors cluster about. The global average is a hypothetical construct that doesn't exist in Nature. Instead, temperatures are changing, creating variable, systematic-like errors. Real scientists are concerned about the magnitude and origin of the inevitable errors in their measurements.
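The author's point about the Standard Error of the Mean can be illustrated with a minimal sketch (synthetic data, my own variable names): the SEM formula shrinks as 1/√n regardless of whether the measurements cluster about a fixed value or track a drifting quantity, so a tiny SEM by itself does not certify precision.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Case 1: repeated measurements of ONE fixed value with random error.
# Here SEM legitimately describes uncertainty about that value.
fixed = 15.0 + rng.normal(0.0, 0.5, n)
sem_fixed = fixed.std(ddof=1) / np.sqrt(n)

# Case 2: a drifting (trending) quantity plus the same random error.
# There is no single value the readings cluster about.
drifting = 15.0 + np.linspace(0.0, 1.0, n) + rng.normal(0.0, 0.5, n)
sem_drifting = drifting.std(ddof=1) / np.sqrt(n)

# Both SEMs come out tiny (~0.005 °C), yet the second series has
# drifted by a full 1 °C that the SEM says nothing about.
print(sem_fixed, sem_drifting)
```

This is the "systematic-like error" the excerpt describes: the formula happily returns millikelvin-scale numbers even when the underlying quantity is not a fixed value at all.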

Also: Perspective Needed; Time to Identify Variations in Natural Climate Data that Exceed the Claimed Human CO2 Warming Effect

The Logarithmic Effect of Carbon Dioxide

by David Archibald, March 8, 2010


The greenhouse gases keep the Earth 30 °C warmer than it would otherwise be without them in the atmosphere, so instead of the average surface temperature being -15 °C, it is 15 °C. Carbon dioxide contributes 10% of the effect, so that is 3 °C. The pre-industrial level of carbon dioxide in the atmosphere was 280 ppm. So roughly, if the heating effect were a linear relationship, each 100 ppm would contribute 1 °C. With the atmospheric concentration rising by 2 ppm annually, it would go up by 100 ppm every 50 years and we would all fry as per the IPCC predictions.

But the relationship isn't linear, it is logarithmic. In 2006, Willis Eschenbach posted a graph on Climate Audit showing the logarithmic heating effect of carbon dioxide relative to atmospheric concentration.
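The logarithmic relationship is commonly written as the simplified forcing expression ΔF = 5.35 · ln(C/C₀) W/m² (Myhre et al. 1998). A minimal sketch, assuming that formula — which is not necessarily the exact curve Eschenbach plotted:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing in W/m²:
    dF = 5.35 * ln(C / C0), with C0 the pre-industrial 280 ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each DOUBLING adds the same forcing, so equal ppm increments
# matter less and less as the concentration rises.
first_doubling = co2_forcing(560.0)                      # 280 -> 560 ppm
second_doubling = co2_forcing(1120.0) - co2_forcing(560.0)  # 560 -> 1120 ppm
print(round(first_doubling, 2))   # ≈ 3.71 W/m²
print(round(second_doubling, 2))  # ≈ 3.71 W/m² — same again
```

This is why the linear "1 °C per 100 ppm" extrapolation in the previous excerpt overstates the effect at higher concentrations: the next 100 ppm always produces less forcing than the last.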