Die kalte Sonne reports on a new aerosol study by Liu et al.
The results are a major blow to modelers who assume a high climate sensitivity to greenhouse gases.
IPCC scientists have a favorite wild card they often use to explain serious model discrepancies: aerosols. Mysterious cooling events in the past are often explained away by aerosols from major volcanic eruptions, for example. They act to filter out sunlight.
According to IPCC climate models, the mean global temperature should have risen by 1.5°C since 1850 due to the higher CO2 concentrations. But best estimates show that it has instead risen by only 1.1°C. So what about the missing 0.4°C?
Naturally, the missing 0.4°C of warming since 1850 gets explained by the higher 20th century aerosol levels in the atmosphere – due to the burning of fossil fuels. Man-made air pollution over the late 19th century and the entire 20th century is said to have dimmed the earth, and this is supposed to explain the 0.4°C of missing warming.
Surprise: global aerosol emissions have been flat over past 250 years
But results from a new study by Liu et al., appearing in the journal Science Advances, now tell us that the forcing by aerosols must have been overestimated by climate modelers. IPCC modelers insisted that 20th century aerosol concentrations were higher than in pre-industrial times, and that this is what kept the climate from warming by the full 1.5°C.
According to the scientists led by Liu, however, atmospheric aerosol levels in pre-industrial times were just as high as they have been recently; they were in fact more or less constant over the past 250 years. No change means it could not have been aerosols putting the brakes on the temperature rise.
That’s a real embarrassment for the IPCC modelers. It means CO2 climate sensitivity has been overestimated.
In Part 1, we introduced the concepts of climate sensitivity to CO2, often called ECS or TCR. The IPCC prefers a TCR of about 1.8°C/2xCO2 (IPCC, 2013, p. 818). TCR is the short-term, century-scale response of surface temperature to a doubling of CO2; we abbreviate the units as “°C/2xCO2.” In these posts we review lower estimates of climate sensitivity, estimates below 1°C/2xCO2. In parallel, we also review estimates of the surface air temperature sensitivity (SATS) to radiative forcing (RF); the units of SATS are °C per W/m2 (Watts per square meter). The IPCC estimates this value to be ~0.49°C per W/m2.
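As a quick consistency check (not part of the original post), the IPCC's ~0.49°C per W/m2 and its ~1.8°C/2xCO2 TCR are linked through the widely used simplified CO2 forcing formula ΔF = 5.35·ln(C/C0) W/m2 (Myhre et al., 1998), which I assume here for illustration. A minimal Python sketch:

```python
import math

# Assumptions: the simplified log forcing formula for CO2 (Myhre et al., 1998)
# and the IPCC SATS value quoted in the text (~0.49 °C per W/m2).
SATS = 0.49  # surface air temperature sensitivity, °C per W/m2

def co2_forcing(c_ppm, c0_ppm):
    """Radiative forcing (W/m2) from a CO2 change, simplified log formula."""
    return 5.35 * math.log(c_ppm / c0_ppm)

dF_2x = co2_forcing(560.0, 280.0)  # forcing from doubling CO2, ~3.7 W/m2
tcr_implied = SATS * dF_2x         # sensitivity per W/m2 times forcing per doubling

print(f"Forcing for 2xCO2: {dF_2x:.2f} W/m2")
print(f"Implied TCR: {tcr_implied:.2f} °C/2xCO2")
```

Multiplying the per-W/m2 sensitivity by the ~3.7 W/m2 forcing of a doubling recovers roughly the IPCC's 1.8°C/2xCO2, which is why the two numbers are interchangeable ways of stating the same sensitivity.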
The previous post discussed two modern climate sensitivity estimates, by Richard Lindzen and Willie Soon, that range below 1°C/2xCO2. Next, we review climate sensitivity estimates by Sherwood Idso, Reginald Newell and their colleagues.
Many comments on part 1 tried to discredit the “ECS” or “TCR” estimates made by Lindzen and Soon, completely missing their point and my point. ECS and TCR are artificial climate model constructs, with little meaning outside the confines of computer modeling. TCR is a little more realistic, since we might be able to observe or measure something close to it over the next century. But ECS, the “Equilibrium Climate Sensitivity,” is a totally abstract and unworldly number that could never be measured. It answers the question: if CO2 doubled suddenly, and nothing else changed for several hundred years while the oceans came into equilibrium with the new surface air temperature, what would the final surface temperature be? Air temperature would never be close to equilibrium for several hundred years; even 70 to 100 years (TCR) is a stretch.
Climate models are not the real world. The numbers that come out of them, like ECS or TCR, can be useful for showing the likely direction of temperature movement in response to changes in parameters or different model scenarios, but the numbers themselves are meaningless unless the models have previously been validated against the real world. With the possible exception of the Russian INM-CM4 model, no other IPCC model has successfully predicted future global surface temperatures. Ron Clutz discusses INM-CM4 here.
Model calculations are not observations. ECS and TCR are not real numbers; real numbers are based on observations. Thus, the model-extracted values of ECS and TCR are not information in themselves; they can be used to detect the direction of change in climate forcing, provided the climate model is an accurate reflection of that portion of the real world. The direction of movement of ECS and TCR, when model parameters or data tables change, is the information, not the computed value. I’m often amazed, as a former petrophysical modeler of 42 years, how often otherwise intelligent people confuse unvalidated model calculations with observations.
A new climate study has dismissed utterly implausible high-end climate models. But the new study also seeks to raise the low end of the range of estimated climate sensitivity into the discomfort zone.
The treatment of cloud feedback is interesting. The study acknowledges large cloud feedback uncertainties, mentions the Lindzen et al. (2001) “iris effect,” and admits GCMs cannot be trusted to reproduce the observed cloud response, yet it still attempts to derive a cloud feedback factor from satellite observations and to mix this observational cloud factor with model predictions.
Even worse than we thought ™. Despite a recent sanity-test study which demonstrated that high-end climate models hindcast impossible Eocene temperatures, climate scientists are pushing ahead anyway with their new, even more extreme climate projections.
Within the last few years, over 50 papers have been added to our compilation of scientific studies that find the climate’s sensitivity to doubled CO2 (280 ppm to 560 ppm) ranges from <0 to 1°C. When no quantification is provided, words like “negligible” are used to describe CO2’s effect on the climate. The list has now reached 106 scientific papers.