Keyword archive: Mathematics

On climate change, the uncertainties multiply, literally.

by Michael Bernstam, July 3, 2017 in GWPF


For a decision to act on climate change to be justified, the following four stipulations must each be highly probable:

1. Global warming will accumulate at 0.12 degrees Celsius or higher per decade.

2. It is anthropogenic, due largely to carbon dioxide emissions.

3. The net effect is harmful to human well-being in the long run.

4. Preventive measures are efficient, that is, feasible at costs not exceeding the benefits.

But even if the probability of each of these stipulations is as high as 85 percent, their compound probability is as low as 50 percent. This makes a decision to act or not to act on climate change equivalent to flipping a coin.
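A quick check of the arithmetic behind that claim: treating the four stipulations as independent (an assumption implicit in the argument), the compound probability is the product of the individual probabilities:

$$0.85^4 = 0.85 \times 0.85 \times 0.85 \times 0.85 \approx 0.52$$

That is barely better odds than a coin flip; any positive correlation between the stipulations would raise the compound figure.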

The Laws of Averages: Part 2, A Beam of Darkness

by Kip Hansen, June 19, 2017 in WUWT


Both the word and the concept “average” are subject to a great deal of confusion and misunderstanding among the general public, and both have seen an overwhelming amount of “loose usage” even in scientific circles, not excluding peer-reviewed journal articles and scientific press releases. For that reason, I gave a refresher on averages in Part 1 of this series. If your maths or science background is near the great American average, I suggest you take a quick look at the primer in Part 1 before reading here.

The Meaning and Utility of Averages as it Applies to Climate

by Clyde Spencer, April 23, 2017


By convention, climate is usually defined as the average of meteorological parameters over a period of 30 years. How can we use the available temperature data, intended for weather monitoring and forecasting, to characterize climate? The approach currently used is to calculate the arithmetic mean for an arbitrary base period and subtract it from modern temperatures (either individual readings or averages) to determine what is called an anomaly. However, just what does it mean to collect all the temperature data and calculate the mean?
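A minimal sketch of that anomaly calculation in Python; the station values, the 1961–1990 base period, and the function names are illustrative assumptions, not the procedure of any particular dataset:

```python
def baseline_mean(temps_by_year, base_start, base_end):
    """Arithmetic mean over the chosen base period (e.g. 30 years)."""
    base = [t for y, t in temps_by_year.items() if base_start <= y <= base_end]
    return sum(base) / len(base)

def anomalies(temps_by_year, base_start=1961, base_end=1990):
    """Anomaly = observed temperature minus the base-period mean."""
    ref = baseline_mean(temps_by_year, base_start, base_end)
    return {y: round(t - ref, 2) for y, t in temps_by_year.items()}

# Hypothetical annual mean temperatures (degrees C) for one station
temps = {1961: 14.1, 1975: 14.3, 1990: 14.2, 2016: 14.9}
print(anomalies(temps))  # {1961: -0.1, 1975: 0.1, 1990: 0.0, 2016: 0.7}
```

The choice of base period shifts every anomaly by a constant but does not change the differences between years, which is why the base period can be, as the excerpt says, arbitrary.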

Are Claimed Global Record-Temperatures Valid?

by Clyde Spencer, April 12, 2017


In summary, there are numerous data-handling practices, generally ignored by climatologists, that seriously compromise the veracity of claims of record average temperatures and reflect poor science. The statistical significance of temperature differences quoted to two or three decimal places is highly questionable. One is not justified in using the Standard Error of the Mean to improve precision by averaging away random errors, because there is no fixed, single value that the random errors cluster about. The global average is a hypothetical construct that doesn’t exist in Nature. Instead, temperatures are changing, creating variable, systematic-like errors. Real scientists are concerned about the magnitude and origin of the inevitable errors in their measurements.
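For reference, the Standard Error of the Mean invoked in that argument is

$$\mathrm{SEM} = \frac{\sigma}{\sqrt{n}}$$

where $\sigma$ is the standard deviation of the $n$ measurements. The $\sqrt{n}$ gain in precision holds only when the measurements are independent draws clustered around a single fixed true value; the author's objection is precisely that a changing temperature field offers no such fixed value.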

Also: Perspective Needed; Time to Identify Variations in Natural Climate Data that Exceed the Claimed Human CO2 Warming Effect

The Logarithmic Effect of Carbon Dioxide

by David Archibald, March 8, 2010


The greenhouse gases keep the Earth 30°C warmer than it would otherwise be without them in the atmosphere, so instead of the average surface temperature being -15°C, it is 15°C. Carbon dioxide contributes 10% of the effect, so that is 3°C. The pre-industrial level of carbon dioxide in the atmosphere was 280 ppm. So roughly, if the heating effect were a linear relationship, each 100 ppm would contribute 1°C. With the atmospheric concentration rising by 2 ppm annually, it would go up by 100 ppm every 50 years and we would all fry as per the IPCC predictions.

But the relationship isn’t linear, it is logarithmic. In 2006, Willis Eschenbach posted a graph on Climate Audit showing the logarithmic heating effect of carbon dioxide relative to atmospheric concentration.
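A short sketch of that logarithmic behaviour, using the commonly cited Myhre et al. (1998) approximation for CO2 forcing, ΔF = 5.35 ln(C/C₀) W/m². The sensitivity factor of 0.8°C per W/m² is an illustrative assumption, not a value taken from the post or from Eschenbach's graph:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Myhre et al. (1998) approximation: dF = 5.35 * ln(C/C0), in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Warming from each successive 40 ppm step, assuming an illustrative
# sensitivity of 0.8 degrees C of warming per W/m^2 of forcing.
SENSITIVITY = 0.8
for c in range(280, 561, 40):
    step = co2_forcing(c + 40) - co2_forcing(c)
    print(f"{c} -> {c + 40} ppm: +{SENSITIVITY * step:.2f} C")
```

Each equal increment of CO2 adds less warming than the one before (about +0.57°C for 280 to 320 ppm, falling to roughly +0.45°C for 360 to 400 ppm under these assumptions), which is the diminishing effect the graph illustrated.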