A new study indicates nearly all the Northern Hemisphere and Tropical warming in the last 40 years occurred by the late 1990s.
CO2 has risen by about 50 ppm since 1998 (367 to 418 ppm).
Interestingly, upper-air measurements of temperature from balloon-borne sensor radiosonde data, shown below in the image from a new study (Madonna et al., 2022), suggest there was more warming from the early 1980s to late 1990s – when CO2 only rose about 25 ppm (341 to 367 ppm) – than there has been this century.
Radiosonde measurements appear to depict mostly flat temperature trends since 1998 in both the Northern Hemisphere (25°N to 70°N) and the tropics (25°S to 25°N).
According to Dieng et al., 2017, global sea surface temperatures (SST) cooled slightly (-0.006°C/decade) from 2003 to 2013. This reduced the overall 1950-2014 warming rate to 0.059°C per decade.
Sea and land surface temperatures, ocean heat content, Earth’s energy imbalance and net radiative forcing over the recent years.
The NCAR/HadCRUT4 global SST record from buoys and ARGO floats also shows only modest warming in the last three decades. The natural 2015-’16 Super El Niño event is mostly responsible for the overall increasing rate.
This is the fourth in a series of articles on the IPCC’s AR6 WG1 report. –CCD ed.
Margaret Thatcher helped create the United Nations Intergovernmental Panel on Climate Change (IPCC) in 1988. As an Oxford-trained chemist, she understood scientific principles and was concerned that we “… do not live at the expense of future generations.”
By 2002, the Iron Lady turned against global warming extremism by stating in her book Statecraft: Strategies for a Changing World, “What is far more apparent is that the usual suspects on the left have been exaggerating the dangers and simplifying solutions in order to press their agenda…” [bold, links added]
Thatcher’s comments on exaggeration and simplification were a prescient critique of the IPCC report Climate Change 2021: The Physical Science Basis.
The IPCC uses computer simulations to predict climate dangers and test solutions. An important step in the computer simulation of a real-world physical process is making sure the simulator can replicate the known history of that physical process.
If a computer model can accurately replicate a significant history of a known process – called hindcasting – it lends credibility to the claim that the correct equations are being used and that the model will be able to predict future events.
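The hindcast check described above amounts to comparing a model's simulated history with the observed record and scoring the mismatch. Below is a minimal illustrative sketch of that scoring, not the IPCC's actual validation procedure; the anomaly series are made up.

```python
import numpy as np

def hindcast_skill(model, observed):
    """Root-mean-square error between a model hindcast and the observed record.
    Smaller is better; zero means the model reproduces history exactly."""
    model = np.asarray(model, float)
    observed = np.asarray(observed, float)
    return float(np.sqrt(np.mean((model - observed) ** 2)))

# Illustrative (made-up) annual temperature anomalies, in degrees C:
observed = [0.10, 0.12, 0.18, 0.15, 0.22, 0.25]
hindcast = [0.11, 0.14, 0.16, 0.17, 0.21, 0.27]
print(round(hindcast_skill(hindcast, observed), 3))  # 0.017
```

In practice one would score many variables and regions, not a single global series, but the principle is the same.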
Does anyone wonder where all the global warming destruction is? After all, the media are unrelenting in telling us how much climate change caused by man is affecting us. Yet no existential threat has emerged. There’s something off with the story.
The climate alarmists have based their predictions of doom on computer models that have been projecting global temperature increases, the likes of which, they tell us, are unsustainable. We must cut our carbon dioxide emissions, even if (actually, especially if) it hurts developed world economies.
This is the narrative we’re bombarded with on a daily basis. And it’s wrong.
Those models that have been used to fuel the fright are, without a doubt, unreliable. According to a recent story published in Nature magazine written by a group of climate modelers, “a subset of the newest generation of models are ‘too hot’ and project climate warming in response to carbon dioxide emissions that might be larger than that supported by other evidence.”
The authors, though, are careful to preserve the narrative, warning that “whereas unduly hot outcomes might be unlikely, this does not mean that global warming is not a serious threat.” They can’t help themselves.
While the modelers in the Nature article point specifically to problems with “a subset of the newest generation of models,” it’s obvious that the older models are no better. Last fall we covered a ScienceDaily report which noted that some researchers had concluded “a possible flaw in climate models” had been exposed, as the models failed to reproduce an observed event.
“When the history of climate modeling comes to be written in some distant future,” economist Robert L. Bradley Jr. wrote some months ago for the American Institute for Economic Research, “the major story may well be how the easy, computable answer turned out to be the wrong one, resulting in overestimated warming and false scares from the enhanced (man-made) greenhouse effect.”
Yes, I do know that acceleration, technically, means any change in velocity. But, in everyday English, we use acceleration to mean an increase in velocity – speeding up – and deceleration to mean a decrease in velocity – slowing down. I mention acceleration and deceleration because one of the major talking points of IPCC-reported findings about sea level rise, the incessant media mantra, is that “Sea Level Rise is Accelerating”.
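To make “accelerating” concrete: the usual first test is to fit a quadratic to a sea-level series and inspect the second-order coefficient. A minimal sketch with synthetic data (none of the numbers below come from actual tide gauges or altimetry):

```python
import numpy as np

def acceleration(years, sea_level_mm):
    """Fit h(t) = a + b*t + (c/2)*t^2 and return c, the acceleration in mm/yr^2.
    c > 0 means the rise is speeding up; c near 0 means a steady (linear) rise."""
    t = np.asarray(years, float) - np.mean(years)  # center time to reduce collinearity
    coeffs = np.polyfit(t, np.asarray(sea_level_mm, float), 2)  # [c/2, b, a]
    return 2.0 * coeffs[0]

years = np.arange(1993, 2023)
steady = 3.0 * (years - 1993)                  # a perfectly steady 3 mm/yr rise
print(abs(acceleration(years, steady)) < 1e-6)  # True: no acceleration
```

Whether real records show a statistically significant positive c depends on the dataset, the time span, and the error model, which is precisely what the argument is about.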
Is sea level rising? Yes, of course it is. It has been rising since about 1750-1775, coinciding with the end of the Little Ice Age. This is widely accepted as shown below:
There are a few things to note at first glance. The ice floe continued to decrease in thickness into November. Its thickness then started to increase, but it is currently still less than 2 meters. Also, the snow depth has gradually been increasing and (apart from some data glitches!) is now ~38 cm. Finally, for the moment at least, the ice surface temperature has been slowly warming since mid-February and is now ~-11°C.
Every year, tropical hurricanes affect North and Central American wildlife and people. The ability to forecast hurricanes is essential in order to minimize the risks and vulnerabilities in North and Central America. Machine learning is a relatively new tool that has been applied to make predictions about different phenomena. We present an original framework utilizing machine learning with the purpose of developing models that give insights into the complex relationship between the land–atmosphere–ocean system and tropical hurricanes. We study the activity variations in each Atlantic hurricane category as tabulated and classified by NOAA from 1950 to 2021. By applying wavelet analysis, we find that category 2–4 hurricanes formed during the positive phase of the quasi-quinquennial oscillation. In addition, our wavelet analyses show that super Atlantic hurricanes of category 5 strength were formed only during the positive phase of the decadal oscillation. The patterns obtained for each Atlantic hurricane category clustered historical hurricane records into seasons of high and null tropical hurricane activity. Using the observational patterns obtained by wavelet analysis, we created a long-term probabilistic Bayesian machine learning forecast for each of the Atlantic hurricane categories. Our results imply that if all such natural activity patterns and tendencies for Atlantic hurricanes continue and persist, the next groups of hurricanes over the Atlantic basin will begin between 2023 ± 1 and 2025 ± 1, 2023 ± 1 and 2025 ± 1, 2025 ± 1 and 2028 ± 1, and 2026 ± 2 and 2031 ± 3, for hurricane strength categories 2 to 5, respectively.
Our results further point out that the category 5 super hurricanes of the Atlantic develop in five very well-defined geographic areas with hot deep waters: (I) the east coast of the United States, (II) the northeast of Mexico, (III) the Caribbean Sea, (IV) the Central American coast, and (V) the north of the Greater Antilles.
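The study's wavelet methodology is far more elaborate than anything shown here, but the core idea of locating a dominant cycle (such as the quasi-quinquennial one) with a Morlet wavelet can be sketched as follows. All data are synthetic and the implementation is a bare-bones illustration, not the authors' code.

```python
import numpy as np

def morlet_power(signal, periods, dt=1.0, w0=6.0):
    """Crude continuous-wavelet (Morlet) power of a 1-D series.
    Returns an array of shape (len(periods), len(signal))."""
    x = np.asarray(signal, float)
    x = x - x.mean()
    n = len(x)
    power = np.empty((len(periods), n))
    for i, p in enumerate(periods):
        # scale whose Fourier period equals p (standard Morlet relation)
        s = p * (w0 + np.sqrt(2 + w0 ** 2)) / (4 * np.pi)
        t = (np.arange(n) - n // 2) * dt
        wavelet = np.exp(1j * w0 * t / s) * np.exp(-t ** 2 / (2 * s ** 2)) / np.sqrt(s)
        power[i] = np.abs(np.convolve(x, np.conj(wavelet[::-1]), mode="same")) ** 2
    return power

# Synthetic "hurricane activity" series with a built-in ~5-year cycle:
years = np.arange(1950, 2022)
activity = 3 + 2 * np.sin(2 * np.pi * years / 5.0)
periods = np.arange(2, 12)
dominant = periods[np.argmax(morlet_power(activity, periods).mean(axis=1))]
print(dominant)  # 5
```

Real hurricane counts are far noisier than this sine wave, and significance testing of wavelet peaks against red noise is an essential step the sketch omits.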
This week, I’m grateful for the opportunity to participate in a conference in Brussels on “science advice under pressure,” organized by the European Commission’s Science Advisory Mechanism (it is streaming online if you’d like to join in today and tomorrow). I am on a panel today with Anne Glover (former science advisor to the European Commission), Matthew Flinders (University of Sheffield) and Lara Pivodic (Vrije Universiteit Brussels). Our moderator has asked us to begin today’s conversation by answering the following question:
What are your experiences (either personal or among colleagues) of coming under pressure and facing hostility as a result of being a prominent science advisor giving advice in public?
As I have considered this question, my first response was: Have a seat, grab a cup of coffee, and how much time do you have?
While sea ice has been declining in the Greenland Sea (east of the island), the Chukchi Sea (eastern Siberia) shows a very different trend in sea ice extent over the past year. Such deviations have occurred repeatedly since the year 2000.
Overall, the 2021 extent was very close to the 1991-2020 mean, well above the lowest value in 2012, and also above what was recorded in the year 2020.
The general view in society is that human emissions of CO₂ are the all-determining cause of the increased concentration in the atmosphere. Most scientists and even many climate skeptics do not question this. There is some debate about how long this extra CO₂ will stay in the atmosphere, but that’s about it. That’s remarkable, as several scientists have published extensively on the flaws and inconsistencies of this narrative. By looking at the significant increase in the CO₂-flows from and to land and sea it’s in fact easy to see that the CO₂-rise is largely due to natural causes.
The idea that human CO₂ is the all-determining cause of the increased concentration is based on the assumption that the natural inflows and outflows are always and exactly in equilibrium with each other. Based on this perfect equilibrium thinking, human emissions, even though they are relatively small, cause a perturbation year after year. In the so-called global carbon budget, about 10 PgC of CO₂ is added every year, while the absorption flux has only increased by 6 PgC/yr (1 Petagram = 1 Gigaton = 1 billion tons). The concentration therefore continues to rise indefinitely as long as people emit CO₂.
To support this idea it is also assumed that human emissions accumulate in the atmosphere. Where you would expect a single residence time for a reservoir with in- and outflows, the IPCC models calculate with a short residence time of about 4 years for natural CO₂ and a long one for human CO₂: “The removal of all the human-emitted CO2 from the atmosphere by natural processes will take a few hundred thousand years (high confidence)”.
Several scientists, including Murray Salby and Hermann Harde, have published extensively on the flaws and inconsistencies of this narrative. They also showed that it is very illogical to think that a slight increase in the up-flux cannot be compensated by a larger down-flux. It’s like increasing the heat energy flow in a house by 5% and expecting that the temperature will keep on rising forever.
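The compensation argument above can be illustrated with a one-box model in which the outflow grows with the excess concentration: the concentration then approaches a finite equilibrium rather than rising indefinitely. The adjustment time and emission rate below are arbitrary illustrative values, not numbers taken from any of the cited papers.

```python
def one_box(e_ppm_per_yr, tau=50.0, c0=280.0, years=1000, dt=1.0):
    """Single-reservoir sketch: dC/dt = E - (C - c0)/tau.
    Because the outflow scales with the excess (C - c0), the concentration
    approaches the finite equilibrium c0 + E*tau instead of rising forever."""
    c = c0
    for _ in range(int(years / dt)):
        c += dt * (e_ppm_per_yr - (c - c0) / tau)
    return c

# Constant emissions of 2 ppm/yr with an (illustrative) 50-yr adjustment
# time equilibrate near 280 + 2*50 = 380 ppm:
print(round(one_box(2.0), 1))  # 380.0
```

Which adjustment time actually applies, and whether a single time constant is adequate at all, is exactly the point in dispute between the parties described in this section.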
Despite this, belief in the IPCC’s model for the increase in concentration is persistent. In this article we will focus on one of the strangest assumptions: the idea that the in- and outflows are stable and in perfect equilibrium. Although they are about 20 times larger than anthropogenic fluxes and have different drivers for up and down, natural flows are not included in the material balance used in the models.
It is in fact easy to see that the increase in the CO₂ concentration is for the most part the result of natural changes, based on the following unmistakable observations.
Fluxes to and from land and sea have increased significantly since 1750.
The increase in these fluxes is natural, i.e. not due to human emissions.
The growth of the natural fluxes can only take place at a higher concentration in the atmosphere.
A new study published in Geophysical Research Letters highlights the abysmal model performance manifested in the latest Intergovernmental Panel on Climate Change report (AR6). The 38 CMIP6 general circulation models (GCMs) fail to adequately simulate even the most recent (1980-2021) warming patterns over 60 to 81% of the Earth’s surface.
Dr. Scafetta places particular emphasis on the poor performance of the highly uncertain equilibrium climate sensitivity (ECS) estimates (somewhere between 1.83 and 5.67°C) and their data-model agreement relative to 1980-2021 global warming patterns.
The worst-performing ECS estimates are the ones projecting 3-4.5°C and 4.5-6°C warming in response to doubled CO2 concentrations (to 560 ppm) plus feedbacks, as the 1980-2021 temperature trends are nowhere close to aligning with these trajectories.
Instead, the global warming projected for 2050 (~2°C relative to 1750), associated with the lowest ECS estimates and implied by the warming observed over the last 40+ years, is characterized as “unalarming” even under the most extreme greenhouse gas emissions growth rate (no mitigation efforts undertaken).
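For reference, the equilibrium warming implied by a given ECS follows the standard logarithmic CO2 forcing relation, ΔT = ECS × log2(C/C0). A quick sketch of what the quoted ECS range implies (equilibrium response only, ignoring ocean lag and non-CO2 forcings):

```python
import math

def equilibrium_warming(ecs_c, co2_ppm, baseline_ppm=280.0):
    """Equilibrium warming (deg C) for a given ECS via dT = ECS * log2(C/C0).
    Transient (realized) warming would be smaller because of ocean thermal inertia."""
    return ecs_c * math.log2(co2_ppm / baseline_ppm)

# At a doubling (560 ppm) the warming equals the ECS by definition;
# at today's ~418 ppm the implied equilibrium warming spans roughly 1.1-3.3 C
# across the 1.83-5.67 C ECS range quoted above:
for ecs in (1.83, 5.67):
    print(ecs, round(equilibrium_warming(ecs, 418.0), 2))
```

This is why the choice of ECS dominates the projections: the forcing term is the same for every model, and the sensitivity multiplies it.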
In addition to the conclusion that “no model group succeeds reproducing observed surface warming patterns,” poor modeling of heat transfer physics, ocean and atmospheric circulation patterns, polar sea ice processes…is also evident in the latest IPCC report.
“Accurately reproducing regional temperature differences over the past 40+ years is beyond the capability of climate model simulations, and even fails for major ocean basins and continents.”
The fundamental modeling failures in simulating responses to sharply rising greenhouse gas emissions over the last 40+ years “calls into question model-based attribution of climate responses to anthropogenic forcing.”
A 2021 study appearing in Nature Communications by Buentgen et al. reports the results of a double-blind experiment in which 15 different groups yielded 15 different Northern Hemisphere summer temperature reconstructions. Each group used the same network of regional tree-ring width datasets.
What’s fascinating is that all groups, though using the same data network, came up with a different result. When it comes to deriving temperatures from tree rings, much depends on individual approach and interpretation. Sure, we can follow the science, but whose results?
The 15 groups (referred to as R1–R15) were challenged with the same task of developing the most reliable NH summer temperature reconstruction for the Common Era from nine high-elevation/high-latitude TRW datasets (Fig. 1).

[Cropped from Figure 1, Buentgen et al.]
The claimed warming rate during the “hiatus” (1998-2001 to 2012-’13) ranged from -0.07°C to +0.17°C per decade.
In late 2012, the IPCC had an ongoing dilemma about what to do about the uncooperative global temperatures. The HadCRUT3 data set that government bureaucrats had been using since the first report in 1990 actually showed that global mean surface temperatures had been declining since 1998. This was not going to further the we-must-act-on-global-warming-now narrative, of course.
Enter Phil Jones, the global temperature data set overseer at East Anglia’s Climate Research Unit (CRUTEM). He’s the scientist who famously admitted that when the temperature data doesn’t exist, they are “mostly made up.”
Jones’s CRU and the Met Office (Hadley) then jointly constructed the newer HadCRUT4 version to help advance the narrative. This version changed the data just in time for the 5th IPCC assessment (AR5, 2013). The 1998-2001 temperatures were allowed to stay the same, but an additional 0.1 to 0.2°C was tacked on to anomalies from 2002 onwards. The effect was to transform the 1998-2012 slight cooling in HadCRUT3 into a 0.04°C per decade warming in HadCRUT4.
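The per-decade trend figures quoted in this section come from ordinary least-squares fits to the anomaly series. A minimal sketch with made-up data:

```python
import numpy as np

def trend_per_decade(years, anomalies_c):
    """Ordinary least-squares trend of a temperature-anomaly series, in deg C/decade."""
    slope_per_year = np.polyfit(np.asarray(years, float),
                                np.asarray(anomalies_c, float), 1)[0]
    return 10.0 * slope_per_year

# Made-up anomalies rising 0.004 C/yr, i.e. 0.04 C/decade:
years = np.arange(1998, 2013)
anoms = 0.004 * (years - 1998)
print(round(trend_per_decade(years, anoms), 3))  # 0.04
```

With real monthly data the fitted slope also needs an uncertainty estimate (autocorrelation-adjusted), which is why the same period can honestly be reported as anywhere from slight cooling to slight warming.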
The Climate Feedback website critiques my CO2Coalition article “Attributing global warming to humans.” Their fact check is here. Like most “fact checks” these days, it is a thinly disguised opinion piece. The statement that they claim is incorrect is:
“There is no evidence, other than models, that human CO2 emissions drive climate change and abundant evidence that the Sun, coupled with natural climate cycles, drives most, if not all, of recent climate changes, as described in Connolly, et al., 2021.” [emphasis added]
They cleverly leave out the last phrase: “as described in Connolly, et al., 2021,” and then immediately assert “Solar irradiance has had a negligible impact on Earth’s climate since the industrial era.” This is followed by no evidence other than an appeal to the mythical “consensus.”
Later in the article, they say Connolly, et al. uses simple linear regression to establish a link between solar irradiance and surface temperature. Connolly, et al. does not state that the Sun controls the climate or that humans do; it simply shows that, using available evidence, solar variability (actually TSI, or Total Solar Irradiance, variability) could account for anywhere from 0 to 100% of the warming since the Little Ice Age (the so-called “pre-industrial” era). One of the main points of Connolly, et al. is that the IPCC and the so-called “consensus” are ignoring two critical areas of current research. First, they ignore the uncertainty in our estimate of surface warming since the Little Ice Age, and second, they ignore the considerable uncertainty in long-term trends in solar variability, both recently and since the Little Ice Age. As they state in the paper, the amount of 20th-century warming that can be simulated as due to solar variability depends upon the surface temperature dataset and the solar TSI model used. There are many versions of both. Suffice it to say that while the exact influences of human activities and solar variability on climate change are both unknown, no one can claim solar influence is negligible. The correct answer is we don’t know.
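A simple linear regression of the kind attributed to Connolly, et al. can be sketched as below. The point made above is that the fitted slope and the share of variance explained depend heavily on which TSI reconstruction and which temperature dataset are fed in; the data here are synthetic placeholders, not any published reconstruction.

```python
import numpy as np

def solar_fit(tsi_wm2, temp_anom_c):
    """OLS fit T = a*TSI + b. Returns slope a (deg C per W/m^2), intercept b,
    and R^2, the share of temperature variance the fit explains."""
    tsi = np.asarray(tsi_wm2, float)
    temp = np.asarray(temp_anom_c, float)
    a, b = np.polyfit(tsi, temp, 1)
    resid = temp - (a * tsi + b)
    r2 = 1.0 - resid.var() / temp.var()
    return a, b, r2

# Synthetic illustration: a temperature series constructed to depend entirely on TSI.
tsi = np.linspace(1360.0, 1362.0, 50)
temp = 0.5 * (tsi - 1361.0)
a, b, r2 = solar_fit(tsi, temp)
print(round(a, 3), round(r2, 3))
```

Swapping in a different (equally plausible) TSI series changes a and R^2, which is the crux of the 0-to-100% attribution range discussed above.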
It was supposed to be a groundbreaking forecast, the early prediction of the weather phenomena El Niño and La Niña. Both affect the weather in very different ways.
It would have been so nice to know a year in advance what conditions would prevail at a later date. On November 4, 2019, the Potsdam Institute for Climate Impact Research published what was hailed as groundbreaking news. Thanks to its new algorithms and a lot of computing power, it was now possible to predict an El Niño or a La Niña long in advance. The hit rate was supposed to be 80%.
Unfortunately, one year later exactly the opposite of what was predicted in fact happened, the German Klimaschau reported. Science is settled? Well, maybe not.
Since then, things have been quiet about these PIK long-term forecasts. The US agency NOAA is much more cautious, both in terms of lead time and the probability of occurrence. Perhaps they don’t have algorithms as good and the computing power that Potsdam has? In any case, on Twitter, US meteorologist Ryan Maue sees a good chance that a third La Niña will follow the two recent consecutive ones.
The last time this happened was 22 years ago at the turn of the millennium.
Guest post by Christopher Essex, Emeritus Professor of Mathematics and Physics, University of Western Ontario.
It is well known that daytime winter temperatures on Earth can fall well below -4°F (-20°C) in some places, even in midlatitudes, despite warming worries. Sometimes the surface can even drop below -40°F (-40°C), which is comparable to the surface of Mars. What is not so well known is that such cold winter days are colder than they would be with no atmosphere at all!
How can that be if the atmosphere is like a blanket, according to the standard greenhouse analogy? If the greenhouse analogy fails, what is climate?
Climate computer models in the 1960s could not account for this non-greenhouse-like picture. Modern computer models are better than those old models, but the climate implications of an atmosphere that cools as well as warms have not been embraced. Will computer models be able to predict climate once they are? The meteorological program for climate has been underway for more than 40 years. How did it do?
Feynman, Experiment and Climate Models

“Model” is used in a peculiar manner in the climate field. In other fields, models are usually formulated so that they can be found false in the face of evidence. From fundamental physics (the Standard Model) to star formation, a model is meant to be put to the test, no matter how meritorious.
The average sea ice cover at the end of March is the metric used to compare ‘winter’ ice to previous years or decades, not the single-day date of ‘most’ ice. This year, March ended with 14.6 mkm2 of sea ice, most of which (but not all) is critical polar bear habitat. Ice charts showing this are below.
But note that ice over Hudson Bay, which is an almost-enclosed sea used by thousands of polar bears at this time of year, tends to continue to thicken from March into May: these two charts for 2020 show medium green becoming dark green, indicating ice >1.2 m thick, even as some areas of open water appear.
If you can afford a Tesla, you probably find it hard to imagine that there are some 3.5 billion people on Earth who have no reasonably reliable access to electricity.
Even less obvious may be the way rich countries’ pursuit of carbon neutrality at almost any cost limits economic opportunities for the world’s poor and poses serious geopolitical risks to the West. [bold, links added]
Anyone on an investment committee has likely spent untold amounts of time discussing ways to mitigate the impact of climate change, but they’ve likely never heard anyone state one simple and incontrovertible fact: The widespread exploration and production of fossil fuels that started in Titusville, Pa., not quite 170 years ago has done more to benefit the lives of ordinary people than any other technological advance in history.
Before fossil fuels, people relied on burning biomass, such as timber or manure, which was a far dirtier and much less efficient source of energy.
Fossil fuels let people heat their homes in the winter, reducing the risk of death from exposure. Fossil-fuel-based fertilizers greatly increased crop yields, reducing starvation and malnutrition.
Before the advent of the automobile, the ability for many people to venture far from their hometown was an unfathomable dream.
Oil- and coal-burning transportation opened up access to education, commerce, professional opportunities, and vital services such as medicine.
There has been, and remains, a strong correlation between the use of fossil fuels and life expectancy.
America’s Executive and Legislative Branches are full of ignorant politicians who need help from a 5th-grader. By the 5th grade, students have already learned that all animals and fungi consume oxygen (O2) and release carbon dioxide (CO2); conversely, all plants consume CO2 and expel O2. This is the Circle of Life; without it, our planet would be only a rock in this solar system.
In 2019 and 2020, America became energy-independent — for the first time since the mid-1950s. This meant greater amounts of crude oil and petroleum products were exported than imported. Our economy was growing beautifully, and unemployment rates were the lowest in more than 50 years. On 20 January 2021, this was abruptly reversed with the stroke of Biden’s pen (executive order 13807 revoked, plus EO13990, EO14008 and EO14030) — which incomprehensibly made America energy-dependent once again, and has also caused this unwanted inflation.
Biden’s “climate plan” includes goals to transition from fossil fuels to “clean energy,” cut emissions from electric power to zero by 2035, and reach “net-zero CO2 emissions” by 2050. However, “clean energy” (solar and wind) is unreliable and does not even provide 10% of America’s energy needs. Biden’s entire house-of-cards is based on becoming “carbon-neutral” — because CO2 is viewed as “the cause of global warming,” which is claimed to be the “greatest existential threat to mankind.”
How silly is this? CO2 — along with O2, nitrogen (N2) and water vapor (H2O) — is necessary for all Life on Earth. Geological studies indicate that CO2 levels have been as high as ~10,000-15,000 parts-per-million (ppm). This was during the Cambrian Period (~541 to 485 million years ago), long before mammals existed; at that time, plant life flourished.
Ice-core data (during the past 800,000 years) have shown cycles of CO2 ranging between ~150-180 ppm during Glacial Periods, and ~280-310 ppm during Inter-Glacial Periods. Earth ascended from its last Glacial Period ~11,500 years ago.
Warming and cooling oceans are the likely reason for these CO2 oscillations. Atmospheric CO2 has risen from ~280 ppm in 1850 (end of the Little Ice Age) to ~410 ppm today. Thus, current levels of ~410 ppm (i.e., 100 ppm more than 310 ppm) most likely reflect the burning of fossil fuels. However, rising CO2 levels in this last century have substantially improved crop growth.
The new Pause has lengthened by another month. On the UAH satellite monthly global mean lower-troposphere temperature dataset, seven and a half years have passed since there was any trend in global warming at all. As always, if anyone has seen this surely not uninteresting fact mentioned in the Marxstream news media, let us know in comments. One of the best-kept secrets in what passes for “journalism” these days is that global temperature has not been rising steadily (or, since October 2014, at all). It has been rising in occasional spurts in response to natural events such as the great Pacific shift of 1976 and the subsequent strong El Niño events, rather than at the somewhat steadier rate that one might expect if our continuing – and continuous – sins of emission were the primary culprit.
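The “Pause” length is computed by finding the longest period, ending at the latest month, over which the least-squares trend is zero or negative. A rough sketch of that calculation on a toy series (not the actual UAH data):

```python
import numpy as np

def pause_length(anomalies):
    """Length, in months, of the longest period ending at the latest month
    whose OLS trend is <= 0; returns 0 if every trailing segment trends upward."""
    x = np.asarray(anomalies, float)
    n = len(x)
    for start in range(n - 1):
        seg = x[start:]
        slope = np.polyfit(np.arange(len(seg)), seg, 1)[0]
        if slope <= 0:
            # earliest qualifying start gives the longest trailing stretch
            return n - start
    return 0

# Toy series: rises for 24 months, then drifts slightly down for 90 months.
series = np.concatenate([np.linspace(0.0, 0.3, 24),
                         np.linspace(0.31, 0.25, 90)])
print(pause_length(series))
```

Note that this definition is sensitive to the start and end points and ignores trend uncertainty, which is why critics and defenders of the metric talk past each other.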
The media were full of disaster headlines this week over an observed breakup of a minor and little-known Antarctic ice shelf. Yahoo News and the New York Times, among others, lamented the “unprecedented” event. The problem is, we don’t really have any knowledge of previous events, making the present-day claims false by omission.
During that period the Antarctic sea ice will in fact refreeze just like it does every year. You can be almost certain that if the Glenzer-Conger ice shelf forms again from the fragments and new ice, we won’t see MSM headlines about it because it goes against the “climate change” narrative.
The West Antarctic Ice Sheet overlies the West Antarctic Rift System about which, due to the comprehensive ice cover, we have only limited and sporadic knowledge of volcanic activity and its extent. Improving our understanding of subglacial volcanic activity across the province is important both for helping to constrain how volcanism and rifting may have influenced ice-sheet growth and decay over previous glacial cycles, and in light of concerns over whether enhanced geothermal heat fluxes and subglacial melting may contribute to instability of the West Antarctic Ice Sheet. Here, we use ice-sheet bed-elevation data to locate individual conical edifices protruding upwards into the ice across West Antarctica, and we propose that these edifices represent subglacial volcanoes. We used aeromagnetic, aerogravity, satellite imagery and databases of confirmed volcanoes to support this interpretation. The overall result presented here constitutes a first inventory of West Antarctica’s subglacial volcanism. We identified 138 volcanoes, 91 of which have not previously been identified, and which are widely distributed throughout the deep basins of West Antarctica, but are especially concentrated and orientated along the >3000 km central axis of the West Antarctic Rift System.
Geology, a more than fascinating … and diverse science