Keyword archive: Model(s)

Climate: which temperature are we talking about?

by S. Furfari and H. Masson, July 26, 2019 in ScienceClimateEnergie


Was it the rise in temperature during the period 1980-2000 that triggered the strong interest in the climate change issue? But which temperatures are we actually talking about, and how reliable are the corresponding data?

1/ Measurement errors

Temperatures have been recorded with thermometers for at most about 250 years, and by electronic sensors or satellites for only a few decades. For older data, one relies on “proxies” (tree rings, stomata, or other geological evidence requiring time and amplitude calibration, historical chronicles, almanacs, etc.). Each method carries its own experimental error: about 0.1°C for a thermometer, much more for proxies. Switching from one method to another (for example from thermometer to electronic sensor, or from electronic sensor to satellite data) requires some calibration and adjustment of the data, which is not always perfectly documented in the records. Also, as shown further in this paper, the length of the measurement window is of paramount importance for drawing conclusions on a possible trend observed in climate data. Some compromise is required between the accuracy of the data and their representativeness.

2/ Time averaging errors

If one considers only “reliable” measurements made using thermometers, one needs to define daily, weekly, monthly and annually averaged temperatures. But before electronic sensors allowed quasi-continuous recording of the data, these measurements were made manually, at a few fixed times each day. The daily averaging algorithm used changes from country to country and over time, in a way not perfectly documented in the data, which induces some errors (Limburg, 2014). Also, the temperature follows seasonal cycles, linked to solar activity and the local exposure to it (the angle of incidence of the solar radiation), which means that when averaging monthly data, one compares temperatures (from the beginning and the end of the month) corresponding to different points on the seasonal cycle. Finally, as any experienced gardener knows, the cycles of the Moon also have some detectable effect on the temperature (a 14-day cycle is apparent in local temperature data, corresponding to the second harmonic of the lunar month; Frank, 2010). There are about 13 lunar cycles of 28 days in one solar year of 365 days, but the solar year is divided into 12 months, which induces some biases and fake trends (Masson, 2018).
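The point about inconsistent daily-averaging algorithms is easy to illustrate. In the toy sketch below, the diurnal temperature curve and the three averaging conventions are illustrative assumptions (not real station data or the documented practice of any particular country); the same 24 readings yield three different “daily means”:

```python
import math

# Toy diurnal cycle (degrees C): an asymmetric day built from two
# harmonics, purely illustrative -- not real station data.
hours = list(range(24))
temps = [10 + 5 * math.sin(2 * math.pi * (h - 9) / 24)
            + 1.5 * math.cos(4 * math.pi * (h - 9) / 24) for h in hours]

# Convention 1: mean of all 24 hourly readings (modern electronic sensors).
mean_24 = sum(temps) / 24

# Convention 2: (Tmin + Tmax) / 2, common with min/max thermometers.
mean_minmax = (min(temps) + max(temps)) / 2

# Convention 3: a "Mannheim hours" style average, (T07 + T14 + 2*T21) / 4,
# of the kind historically used in several European countries.
mean_mannheim = (temps[7] + temps[14] + 2 * temps[21]) / 4

print(round(mean_24, 2), round(mean_minmax, 2), round(mean_mannheim, 2))
```

Because the toy day is asymmetric, the three conventions disagree by more than a full degree, far larger than the 0.1°C instrumental error, so a change of convention in a station record can create a spurious step or trend.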

3/ Spatial averaging

Figs. 12, 13 and 14: Linear regression line over a single period of a sinusoid.
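The effect shown in these figures can be reproduced numerically. The sketch below (self-contained; the 60-year period and the window lengths are arbitrary illustrative choices) fits an ordinary least-squares line to a pure sinusoid, first over a window on the rising limb of the cycle, then over six full periods:

```python
import math

def ols_slope(ts, ys):
    """Ordinary least-squares slope of ys regressed on ts."""
    n = len(ts)
    tbar = sum(ts) / n
    ybar = sum(ys) / n
    num = sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
    den = sum((t - tbar) ** 2 for t in ts)
    return num / den

period = 60.0  # years; an arbitrary multidecadal cycle

def cycle(t):
    return math.sin(2 * math.pi * t / period)

# A 15-year "measurement window" on the rising limb of the cycle:
ts_short = [0.25 * k for k in range(61)]      # 0 .. 15 years
slope_short = ols_slope(ts_short, [cycle(t) for t in ts_short])

# A window spanning six full periods (360 years):
ts_long = [0.25 * k for k in range(1441)]     # 0 .. 360 years
slope_long = ols_slope(ts_long, [cycle(t) for t in ts_long])

print(round(slope_short, 4), round(slope_long, 6))
```

Over the short window the pure cycle masquerades as a steady trend of roughly 0.07 units per year, while over six periods the fitted slope almost vanishes, which is the rationale for requiring the measurement window to be several times longer than the longest cycle in the data.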

 

Conclusions

 

  1. IPCC projections result from mathematical models which need to be calibrated using data from the past. The accuracy of the calibration data is of paramount importance, as the climate system is highly non-linear, and this is also the case for the (Navier-Stokes) equations and (Runge-Kutta) integration algorithms used in the IPCC computer models. Consequently, the system, and also the way the IPCC represents it, is highly sensitive to tiny changes in the value of parameters or initial conditions (the calibration data in the present case), which must therefore be known with high accuracy. This is not the case, casting serious doubt on any conclusion that could be drawn from model projections.

  2. Most of the mainstream climate-related data used by the IPCC are in fact generated from meteorological data collected at land weather stations. This has two consequences: (i) the spatial coverage of the data is highly questionable, as the temperature over the oceans, representing 70% of the Earth’s surface, is mostly neglected or “guesstimated” by interpolation; (ii) the number and location of these land weather stations have changed considerably over time, inducing biases and fake trends.

  3. The key indicator used by the IPCC is the global temperature anomaly, obtained by spatially averaging, as well as possible, local anomalies. A local anomaly is the difference between the present local temperature and the average local temperature calculated over a fixed previous reference period of 30 years, updated every 30 years (1930-1960, 1960-1990, etc.). The concept of local anomaly is highly questionable, due to the presence of poly-cyclic components in the temperature data, which induce considerable biases and false trends when the “measurement window” is shorter than at least 6 times the longest period detectable in the data; this is unfortunately the case with temperature data.

  4. Linear trend lines applied to (poly-)cyclic data whose period is similar to the length of the time window considered open the door to all kinds of fake conclusions, if not to manipulations aimed at pushing one political agenda or another.

  5. Consequently, it is highly recommended to abandon the concept of global temperature anomaly and to focus on unbiased local meteorological data to detect a possible change in the local climate, which is a physically meaningful concept, and which is, after all, what really matters for local people, agriculture, industry, services, business, health and welfare in general.

CO2 Is So Powerful It Can Cause Global Warming To Pause For Decades

by Joanne Nova, July 24, 2019 in ClimateChangeDispatch


It’s all so obvious. If researchers start with models that don’t work, they can find anything they look for — even abject nonsense which is the complete opposite of what the models predicted.

Holy Simulation! Let’s take this reasoning and run with it — in the unlikely event we actually get relentlessly rising temperatures, that will imply that the climate sensitivity of CO2 is lower. Can’t see that press release coming…

Nature has sunk so low these days it’s competing with The Onion.

The big problem bugging believers was that global warming paused, which no model predicted, and which remains unexplained still, despite moving goalposts, searching in data that doesn’t exist, and using error bars 17 times larger than the signal.

The immutable problem is that energy shalt not be created nor destroyed, so The Pause still matters even years after it stopped pausing.

The empty space still shows the models don’t understand the climate — CO2 was supposed to be heating the world, all day, every day.

Quadrillions of Joules have to go somewhere, they can’t just vanish, but models don’t know where they went. If we can’t explain the pause, we can’t explain the cause, and the models can’t predict anything.

In studies like these, the broken model is not a bug, it’s a mandatory requirement — if these models actually worked, it wouldn’t be as easy to produce any and every conclusion that an unskeptical scientist could hope to “be surprised” by.

The true value of this study, if any, is in 100 years time when some psychology Ph.D. student will be able to complete an extra paragraph on the 6th-dimensional flexibility of human rationalization and confirmation bias.

Busted climate models can literally prove anything. The more busted they are, the better.

Is the growth of CO2 in the atmosphere exclusively anthropogenic? (3/3)

by J.C. Maurin, July 19, 2019 in ScienceClimatEnergie


The Bomb Effect and the IPCC Models

Climate predictions are generated by computer models. Their designers believe they can describe the average state of the atmosphere in 2100, taking as the main input the future CO2 level, which would thus constitute the “control knob” of the climate.

There are two stages of modelling: one starts by predicting the CO2 level in 2100 with models selected by the IPCC (these IPCC “IRF” models are the subject of this article).
This prediction then serves as the input to the second stage, namely the “radiative exchange” or “greenhouse effect” models, which are not treated here (but see this link).
The present article (which follows two previous ones, here and here) compares the theoretical impulse response of these “IRF” models with the observed impulse response of 14CO2 (the Bomb Effect).
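For readers who want to see what such a comparison looks like, here is a minimal numerical sketch. The multi-exponential impulse response below uses the Bern-type coefficients commonly quoted from IPCC AR4 (an assumption on our part, not necessarily the exact model variant the article tests), set against a single-exponential decay with the roughly 16.5-year e-time often cited for bomb-test 14CO2:

```python
import math

def irf_bern(t):
    """Bern-type impulse response: a0 plus three exponentials
    (coefficients as commonly quoted from IPCC AR4; illustrative)."""
    return (0.217
            + 0.259 * math.exp(-t / 172.9)
            + 0.338 * math.exp(-t / 18.51)
            + 0.186 * math.exp(-t / 1.186))

def irf_bomb(t, tau=16.5):
    """Single-exponential decay with the e-time observed for bomb 14CO2."""
    return math.exp(-t / tau)

# Fraction of an initial CO2 pulse still airborne after a century:
print(round(irf_bern(100), 3), round(irf_bomb(100), 4))
```

The model response never falls below its constant term of about 0.22, whereas the observed 14CO2 pulse decays to almost nothing within a century; this is precisely the kind of divergence the article examines.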

What Humans Contribute to Atmospheric CO2: Comparison of Carbon Cycle Models with Observations

by Herman Harde, April 3, 2019 in Earth Sciences


Abstract: The Intergovernmental Panel on Climate Change assumes that the increasing atmospheric CO2 concentration over recent years was almost exclusively determined by anthropogenic emissions, and this increase is made responsible for the rising temperature over the Industrial Era. Due to the far-reaching consequences of this assertion, in this contribution we critically scrutinize different carbon cycle models and compare them with observations. We further contrast them with an alternative concept, which also includes temperature-dependent natural emission and absorption, with an uptake rate scaling proportionally with the CO2 concentration. We show that this approach is in agreement with all observations, and that under this premise it is not really human activities that are responsible for the observed CO2 increase and the expected temperature rise in the atmosphere; just the opposite: the temperature itself dominantly controls the CO2 increase. Therefore, not CO2 but primarily natural impacts are responsible for any observed climate changes.

Keywords: Carbon Cycle, Atmospheric CO2 Concentration, CO2 Residence Time, Anthropogenic Emissions, Fossil Fuel Combustion, Land Use Change, Climate Change

 

Human CO2 Emissions Have Little Effect on Atmospheric CO2

by Edwin X Berry, June 2019 in JAtmOceanSciences


Abstract
The United Nations Intergovernmental Panel on Climate Change (IPCC) agrees human CO2 is only 5 percent and natural CO2 is 95 percent of the CO2 inflow into the atmosphere. The ratio of human to natural CO2 in the atmosphere must equal the ratio of the inflows. Yet the IPCC claims human CO2 has caused all the rise in atmospheric CO2 above 280 ppm, which is now 130 ppm, or 32 percent of today’s atmospheric CO2. To cause the human 5 percent to become 32 percent in the atmosphere, the IPCC model treats human and natural CO2 differently, which is impossible because the molecules are identical. IPCC’s Bern model artificially traps human CO2 in the atmosphere while it lets natural CO2 flow freely out of the atmosphere. By contrast, a simple Physics Model treats all CO2 molecules the same, as it should, and shows how CO2 flows through the atmosphere and produces a balance level where outflow equals inflow. Thereafter, if inflow is constant, the level remains constant. The Physics Model has only one hypothesis: that outflow is proportional to level. The Physics Model exactly replicates the 14C data from 1970 to 2014 with only two physical parameters: balance level and e-time. The 14C data trace how CO2 flows out of the atmosphere. The Physics Model shows the 14CO2 e-time is a constant 16.5 years. Other data show the e-time for 12CO2 is about 4 to 5 years. The IPCC claims human CO2 reduces ocean buffer capacity. But that would increase e-time. The constant e-time proves the IPCC’s claim is false. The IPCC argues that the human-caused reduction of 14C and 13C in the atmosphere proves human CO2 causes all the increase in atmospheric CO2. However, the numbers show these isotope data support the Physics Model and reject the IPCC model. The Physics Model shows how inflows of human and natural CO2 into the atmosphere set balance levels proportional to their inflows. Each balance level remains constant if its inflow remains constant. Continued constant CO2 emissions do not add more CO2 to the atmosphere. No CO2 accumulates in the atmosphere. Present human CO2 inflow produces a balance level of about 18 ppm. Present natural CO2 inflow produces a balance level of about 392 ppm. Human CO2 is insignificant to the increase of CO2 in the atmosphere. Increased natural CO2 inflow has increased the level of CO2 in the atmosphere.
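A minimal numerical sketch of the “Physics Model” described in this abstract is a one-line differential equation, dL/dt = inflow − L/τ, whose balance level is inflow × τ. The τ = 4-year e-time and the inflow values below are back-computed from the balance levels quoted in the abstract (18 = 4.5 × 4, 392 = 98 × 4); they are our illustrative assumptions, not Berry’s stated inputs:

```python
def balance_level(inflow, tau=4.0, years=200.0, dt=0.01, level0=0.0):
    """Euler-integrate dL/dt = inflow - L/tau and return the final
    level (ppm); it converges to the balance level inflow * tau."""
    level = level0
    for _ in range(int(years / dt)):
        level += (inflow - level / tau) * dt
    return level

human = balance_level(inflow=4.5)     # approaches 4.5 * 4 = 18 ppm
natural = balance_level(inflow=98.0)  # approaches 98 * 4 = 392 ppm
print(round(human, 1), round(natural, 1))
```

With these numbers the combined balance level is about 410 ppm, of which only 18 ppm is attributed to the human inflow, reproducing the proportions claimed in the abstract.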

PUTTING CLIMATE CHANGE CLAIMS TO THE TEST

by John Christy, June 18, 2019 in GWPF


This is a full transcript of a talk given by Dr John Christy to the GWPF on Wednesday 8th May.

When I grew up in the world of science, science was understood as a method of finding information. You would make a claim or a hypothesis, and then test that claim against independent data. If it failed, you rejected your claim and you went back and started over again. What I’ve found today is that if someone makes a claim about the climate, and someone like me falsifies that claim, rather than rejecting it, that person tends to just yell louder that their claim is right. They don’t look at what the contrary information might say.

OK, so what are we talking about? We’re talking about how the climate responds to the emission of additional greenhouse gases caused by our combustion of fossil fuels. In terms of scale, and this is important, we want to know what the impact is on the climate, of an extra half a unit of forcing amongst total forcings that sum to over 100 units. So we’re trying to figure out what that signal is of an extra 0.5 of a unit.

Here is the most complicated chart I have tonight, and I hope it makes sense:

 

Why Climate Models Can’t Predict The Future (And Never Have)

by Jay Lehr, June 11, 2019 in ClimateChangeDispatch


SEE ALSO: Climate Models Of Incompetence

Consider the following: we do not know all the variables that control our climate, but we are quite sure they likely number in the hundreds.

Just take a quick look at ten obviously important factors for which we have limited understanding:

1- Changes in seasonal solar irradiation;

2- Energy flows between ocean and atmosphere;

3- Energy flow between air and land;

4- The balance between Earth’s water, water vapor, and ice;

5- The impacts of clouds;

6- Understanding the planet’s ice;

7- Mass changes between ice sheets, sea level and glaciers;

8- The ability to factor in hurricanes and tornadoes;

9- The impact of vegetation on temperature;

10- Tectonic movement on ocean bottoms.

Yet, today’s modelers believe they can tell you the planet’s climate decades or even a century in the future and want you to manage your economy accordingly.

Dr. Willie Soon of the Harvard-Smithsonian Center for Astrophysics once calculated that if we could know all the variables affecting climate and plugged them into the world’s largest computer, it would take 40 years for the computer to reach an answer.

Climatologist: Climate Models Are Predicting Too Much Warming

by Dr. Benny Peiser, May 23, 2019 in GWPF


A leading climatologist has said that the computer simulations that are used to predict global warming are failing on a key measure of the climate today and cannot be trusted.

Speaking to a meeting in the Palace of Westminster in London, Professor John Christy of the University of Alabama in Huntsville told MPs and peers that almost all climate models have predicted rapid warming at high altitudes in the tropics.

A paper outlining Dr. Christy’s key findings is published today by the Global Warming Policy Foundation.

Circular reasoning with climate models

by Dr. Wojick, March 1, 2018 in CFact


Climate models play a central role in the attribution of global warming or climate change to human causes. The standard argument takes the following form: “We can get the model to do X, using human causes, but not without them, so human causes must be the cause of X.” A little digging reveals that this is actually a circular argument, because the models are set up in such a way that human causes are the only way to get change.

The finding that humans are the cause of global warming and climate change is actually the assumption going in. This is circular reasoning personified: you conclude what you first assumed.

This circularity can be clearly seen in what many consider the most authoritative scientific report on climate change going, although it is actually just the most popular alarmist report. We are talking about the Summary for Policymakers (SPM) of the latest assessment report (AR5) of the heavily politicized UN Intergovernmental Panel on Climate Change (IPCC). Their 29-page AR5 SPM is available here.


Global-scale multidecadal variability missing in state-of-the-art climate models

by S. Kravtsov et al., 2018, in Nature


Reliability of future global warming projections depends on how well climate models reproduce the observed climate change over the twentieth century. In this regard, deviations of the model-simulated climate change from observations, such as a recent “pause” in global warming, have received considerable attention. Such decadal mismatches between model-simulated and observed climate trends are common throughout the twentieth century, and their causes are still poorly understood. Here we show that the discrepancies between the observed and simulated climate variability on decadal and longer timescales have a coherent structure suggestive of a pronounced Global Multidecadal Oscillation. Surface temperature anomalies associated with this variability originate in the North Atlantic and spread out to the Pacific and Southern oceans and Antarctica, with the Arctic following suit in about 25–35 years. While climate models exhibit various levels of decadal climate variability and some regional similarities to observations, none of the model simulations considered match the observed signal in terms of its magnitude, spatial patterns and their sequential time development. These results highlight a substantial degree of uncertainty in our interpretation of the observed climate change using the current generation of climate models.
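A toy illustration of the kind of signal the paper describes (synthetic numbers, not the paper's data): a 65-year oscillation riding on a linear trend survives a multi-year smoothing and shows up as a persistent swing around any straight-line fit:

```python
import math

# Toy series: linear trend plus a 65-year oscillation (no noise),
# standing in for an observed multidecadal signal.
series = [0.008 * i + 0.15 * math.sin(2 * math.pi * i / 65)
          for i in range(120)]

# A 21-point centered moving average suppresses year-to-year wiggles
# but keeps both the trend and the multidecadal swing.
half = 10
smooth = [sum(series[i - half:i + half + 1]) / 21
          for i in range(half, len(series) - half)]

# Deviation of the smoothed curve from a straight line joining its
# endpoints: nonzero because of the oscillation riding on the trend.
n = len(smooth)
straight = [smooth[0] + (smooth[-1] - smooth[0]) * k / (n - 1)
            for k in range(n)]
max_dev = max(abs(s - l) for s, l in zip(smooth, straight))
print(round(max_dev, 3))
```

The smoothed series departs from the straight line by close to two-tenths of a unit at its extreme, comparable to the oscillation's amplitude; a model ensemble lacking this oscillation would show systematic decadal mismatches of exactly this size.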

Reassessing the RCPs

by Kevin Murphy, January 28, 2019 in Judith Curry’s ClimateEtc.


A response to: “Is RCP8.5 an impossible scenario?”. This post demonstrates that RCP8.5 is so highly improbable that it should be dismissed from consideration, and thereby draws into question the validity of RCP8.5-based assertions such as those made in the Fourth National Climate Assessment from the U.S. Global Change Research Program.

Analyses of future climate change since the IPCC’s 5th Assessment Report (AR5) have been based on representative concentration pathways (RCPs) that detail how a range of future climate forcings might evolve.

Several years ago, a set of RCPs was requested by the climate modeling research community to span the range of net forcing from 2.6 W/m2 to 8.5 W/m2 (in year 2100, relative to 1750) so that the physics within the models could be fully exercised. Four were developed and designated RCP2.6, RCP4.5, RCP6.0 and RCP8.5. They have been used in ongoing research and as the basis for impact analyses and future climate projections.

Figure 2. History and forecasts of CO2 concentration. RCP8.5 is defined by 936 ppm in 2100.
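For orientation, the CO2 portion of that end-of-century forcing can be checked with the widely used simplified expression F = 5.35·ln(C/C0) W/m² (Myhre et al., 1998). The 278 ppm pre-industrial reference below is our assumption:

```python
import math

def co2_forcing(c_ppm, c0_ppm=278.0):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al., 1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# RCP8.5 end-of-century CO2 concentration from the figure caption:
f_rcp85_co2 = co2_forcing(936.0)
print(round(f_rcp85_co2, 2))
```

This gives roughly 6.5 W/m² from CO2 alone; the remainder of the 8.5 W/m² net forcing that defines the scenario comes from other forcing agents.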

Simplest climate model yet – a bathtub

by Charles the moderator, January 18, 2019 in WUWT


Climate change: How could artificial photosynthesis contribute to limiting global warming?

Scientists calculate areas needed for forestation and artificial photosynthesis.

After several years during which global emissions at least stagnated, they rose again somewhat in 2017 and 2018. Germany has also clearly missed its climate targets. In order to keep global warming below 2 degrees Celsius, only about 1100 gigatonnes of CO2 may be released into the atmosphere by 2050 [1]. And in order to limit global warming to 1.5 degrees, only just under 400 gigatonnes of CO2 may be emitted worldwide. By 2050, emissions will even have to fall to zero. Currently, however, 42 gigatonnes of CO2 are added every year.

Almost all the various scenarios require “negative emissions”
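The arithmetic behind those budget figures is the bathtub model at its simplest: divide the remaining budgets quoted above by the current annual emissions.

```python
# Back-of-envelope check of the budget figures in the excerpt:
budget_2C = 1100.0   # Gt CO2 remaining for staying below 2 degrees C
budget_15C = 400.0   # Gt CO2 remaining for staying below 1.5 degrees C
annual = 42.0        # Gt CO2 currently emitted per year

years_2C = budget_2C / annual    # years left at constant emissions
years_15C = budget_15C / annual
print(round(years_2C, 1), round(years_15C, 1))
```

At constant emissions, the 1.5 °C budget is exhausted in under a decade and the 2 °C budget in about 26 years, which is why nearly all such scenarios resort to “negative emissions”.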

Regional Models: 3-10°C Warming In The Next 80 Years. Observations: No Warming In The Last 40-100 Years.

by K. Richard, January 14, 2019 in NoTricksZone


There are large regions of the globe where observations indicate there has been no warming (or even cooling) during the last decades to a century. Climate models rooted in the assumption that fossil fuel emissions drive dangerous warming dismiss these modeling failures and project temperature increases of 3°C to 10°C by 2100 for these same regions anyway.

Image Source: Partridge et al., 2018

Recent evolutions of atmospheric CO2 (3/4)

by J.C. Maurin, November 12, 2018 in ScienceClimatEnergie


The IPCC (GIEC in French) was created in 1988 by UNEP (United Nations Environment Programme) and the WMO (World Meteorological Organization). The principles governing the work of the IPCC (1) state: “The role of the IPCC is to assess … the risks of human-induced climate change.” The IPCC respects its own founding principle: it attributes the entirety of the rise in the CO2 level since 1958 to an anthropogenic cause. We shall examine here the IPCC’s anthropogenic model and confront it with contemporary measurements, and then with a mixed model. This article follows the two previous ones published on the SCE site in September (1/4) and October 2018 (2/4).

C.   The IPCC anthropogenic model

C.1   Model constraints (Fig. 1)

Paragraph A (article 1/4) showed that in 1980 the atmospheric CO2 level was 338 ppm and the δ13C was −7.6‰. In 2010 the atmospheric CO2 level was 388 ppm and the δ13C was −8.3‰. There is an annual modulation of this level, very marked in the Northern Hemisphere.
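These two data points allow a simple isotopic mass balance for the CO2 added between 1980 and 2010. The calculation below is our illustration of the kind of δ13C argument the article develops, not a quotation from it:

```python
# Mixing sketch: the isotopic signature delta_add of the CO2 added
# between 1980 and 2010 follows from the mass balance
#   C2 * d2 = C1 * d1 + (C2 - C1) * delta_add.
c1, d1 = 338.0, -7.6   # ppm and permil in 1980
c2, d2 = 388.0, -8.3   # ppm and permil in 2010

delta_add = (c2 * d2 - c1 * d1) / (c2 - c1)
print(round(delta_add, 1))
```

The inferred signature of the added CO2, about −13‰, is far less negative than the roughly −28‰ typical of fossil-fuel CO2, which is the sort of observation that motivates a mixed (anthropogenic plus natural) model.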

 

Exclusive interview: Henri Masson, University of Antwerp, calls the IPCC models a “statistical aberration”

by Henri Masson, March 10, 2012, in Contrepoints


“I have been building models for 40 years,” Henri Masson states from the outset. A chemical engineer by training (Université Libre de Bruxelles), a doctor in applied sciences, an emeritus professor at the University of Antwerp, and a globe-trotting expert (notably for the World Bank and the UN), the man is, moreover, endowed with a serious talent for popularization. When Contrepoints asked him to analyze the IPCC’s predictive models, the Belgian was categorical: “If my students presented me with such models, I would not hesitate to fail them!”

Contrepoints: What confidence can we place in the IPCC models, which predict, among other things, global warming due to human CO2 emissions?

Recent evolutions of atmospheric CO2 (3/4)

by J.C. Maurin, November 12, 2018 in ScienceClimatEnergie



C.4.  Conclusions

  • A model that describes a fixed world in equilibrium, a model in which man is central, a model that manages to reproduce some observations but not all of them, a model unanimously supported by political and moral authorities, and finally a model that posits a priori an intangible principle… is exactly the type of model that Ptolemy (6) developed for the solar system. That model was once the object of a >97% consensus.

  • The present atmosphere contains about 20 ppm of anthropogenic CO2, corresponding to 20/400, i.e. 5% of atmospheric CO2. In one century, humans have thus modified the composition of the atmosphere by 20 ppm, i.e. 0.002%: on this subject too, it seems that we are not at the center of the world.

  • The recent evolutions of atmospheric CO2 cannot have an exclusively anthropogenic cause: the δ13C observations forbid it. The causes are both anthropogenic and natural. The purely anthropogenic IPCC model must therefore be rejected.

At IPCC talks Trump Administration emphasizes scientific “uncertainty” and “value of fossil fuels”… MAGA!

by David Middleton, October 4, 2018 in WUWT


95% of the model runs predicted more warming than the RSS data since 1988… And this is the Mears-ized RSS data, the version in which the measurements were adjusted to erase the pause and more closely match the surface data.

Their “small discrepancy” would be abject failure in the oil & gas industry.

The observed warming has been less than that expected in a strong mitigation scenario (RCP4.5).

Output of 38 RCP4.5 models vs observations. The graph is originally from Carbon Brief. I updated it with HadCRUT4, shifted to a 1970-2000 baseline, to demonstrate the post-El Niño divergence.

The ‘Trick’ of Anomalous Temperature Anomalies

by Kip Hansen, September 25, 2018 in WUWT


It seems that every time we turn around, we are presented with a new Science Fact: that such-and-so metric — Sea Level Rise, Global Average Surface Temperature, Ocean Heat Content, Polar Bear populations, Puffin populations — has changed dramatically (“It’s unprecedented!”), and these statements are often backed by a graph illustrating the sharp rise (or, in other cases, sharp fall) as the anomaly of the metric from some baseline. In most cases, the anomaly is actually very small, and the change is magnified by cranking up the y-axis to make this very small change appear to be a steep rise (or fall).
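The construction being criticized is easy to reproduce (the numbers below are invented for illustration): compute anomalies against a 30-year baseline and compare their range with the absolute scale of the data.

```python
# Sketch of the anomaly construction: subtract a 30-year baseline
# mean, then note how small the result is compared with the
# absolute values. Invented numbers, for illustration only.
temps = [14.0 + 0.01 * i for i in range(50)]   # 50 years, +0.01 C/yr

baseline = sum(temps[:30]) / 30                # first 30 years as reference
anoms = [t - baseline for t in temps]

full_range = max(anoms) - min(anoms)           # what fills the y-axis
relative = full_range / max(temps)             # versus the absolute scale
print(round(baseline, 3), round(full_range, 2), round(relative, 3))
```

The half-degree anomaly spread that fills the whole y-axis of such a chart is barely 3% of the absolute value; plotted on an absolute scale, the same series would look essentially flat.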

A Test of the Tropical 200- to 300-hPa Warming Rate in Climate Models

by R. McKitrick and J. Christy, July 6, 2018 in AGU100


Abstract
Overall climate sensitivity to CO2 doubling in a general circulation model results from a complex system of parameterizations in combination with the underlying model structure. We refer to this as the model’s major hypothesis, and we assume it to be testable. We explain four criteria that a valid test should meet: measurability, specificity, independence, and uniqueness. We argue that temperature change in the tropical 200- to 300-hPa layer meets these criteria. Comparing modeled to observed trends over the past 60 years using a persistence-robust variance estimator shows that all models warm more rapidly than observations, and in the majority of individual cases the discrepancy is statistically significant. We argue that this provides informative evidence against the major hypothesis in most current climate models.

Weather and Climate in the Real World

by Tim Ball, August 18, 2018 in WUWT


All the trillions of dollars spent on AGW have not improved forecasting one bit. Instead, it diverted money that could have helped those large, primary sectors of society and economy that need better and more appropriate information. It is time to close all government weather offices or at least reduce their function to data collection determined by the end users.

Why does climate sensitivity increase over time in models? A look at two possibilities

by A. Zaragoza Comendador, August 16, 2018 in WUWT


Note: if the terms used in this article seem confusing, check out the previous one.

Introduction

It’s well known that climate models show increasing sensitivity over time: for a given forcing, the true long-term temperature increase (ECS) is higher than what you’d estimate if you simply extrapolated from the past (ECS_hist). In other words, the ECS-to-ECS_hist ratio is above 1. This article tries to work out why climate models behave like that; that is to say, the variable I’m trying to explain is the ECS-to-ECS_hist ratio.

Now, there are probably too many hyphens and underscores in the text. So it will be more readable if I clarify that, every time I talk simply about ‘correlation’, I mean the correlation of thing X with the ECS-to-ECS_hist ratio. If another kind of correlation is mentioned, I’ll say so explicitly.

The Major Change in the Global Warming Groupthink Between 1990 and 1995

by Tim Ball, August 12, 2018 in WUWT


Somebody said economists try to predict the tide by measuring one wave. This puts them in the same league as climate scientists trying to predict the climate by measuring one variable, CO2. It is no surprise that an amalgam of the two, climate and economics, produces even worse results, but that is what happened early in the anthropogenic global warming (AGW) deception.

(…)

A Global Warming Hiatus in Northeast China

by Sun X. et al., 2018 in CO2Science


Paper Reviewed
Sun, X., Ren, G., Ren, Y., Fang, Y., Liu, Y., Xue, X. and Zhang, P. 2018. A remarkable climate warming hiatus over northeast China since 1998. Theoretical and Applied Climatology 133: 579-594.

A prominent feature of all climate model projections is their prediction that temperatures should be rising in response to ever-increasing concentrations of greenhouse gases. However, for the past two decades global surface air temperatures have not warmed to the degree predicted by the models, and this lack of warming has been a conundrum for the climate alarmist movement.

108 Graphs From 89 New Papers Invalidate Claims Of Unprecedented Global-Scale Modern Warmth

by K. Richard, August 2, 2018 in NoTricksZone


During 2017, there were 150 graphs from 122 scientific papers published in peer-reviewed journals indicating modern temperatures are not unprecedented, unusual, or hockey-stick-shaped — nor do they fall outside the range of natural variability. We are a little over halfway through 2018 and already 108 graphs from 89 scientific papers undermine claims that modern-era warming is climatically unusual.

For the sake of brevity, just 13 (15%) of the 89 new papers are displayed below.

The rest of the non-hockey-stick scientific papers and graphs published thus far in 2018 can be viewed by clicking the link below.