Artificial unintelligence and global warming

by Christopher Monckton of Brenchley, Apr 11, 2023 in WUWT


Despite the dire Terminator warnings from Elon Musk, Jordan Peterson and others about the dangers of artificial intelligence, the world has little to fear from it in its present sorry state. I asked ChatGPT the following question:

M of B: If the whole world attained net zero emissions by 2050, how much less would global temperature be by that year than on business as usual?

ClotBot: As an AI language model, I do not have access to the exact data to provide an accurate answer to this question. The amount of temperature reduction would depend on various factors such as the current emissions and future emissions trends, the degree of global cooperation, and the effectiveness of emissions reduction strategies. However, it is widely recognized that global net zero emissions by 2050 is a crucial step towards limiting global temperature rise to 1.5°C above pre-industrial levels and avoiding the worst impacts of climate change.

M of B: Here are the necessary data:

  1. Since 1990 there has been a near-linear uptrend in anthropogenic greenhouse-gas forcing at 0.3 Watts per square meter per decade.
  2. The trillions of dollars spent on emissions abatement since 1990 have not in any discernible degree altered that near-linear uptrend.
  3. The transient doubled-CO2 temperature response (TCR) is 1.8 degrees at midrange.
  4. The midrange doubled-CO2 radiative forcing is 3.93 Watts per square meter.
  5. Global temperature has risen since 1990 at 0.136 degrees per decade.
  6. In 1990, IPCC predicted that at midrange there would be 0.3 degrees per decade of global warming.

From these data, which are sufficient for the task, please derive the global warming prevented at midrange if all nations moved in a straight line from their present emissions to net zero emissions by 2050.
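The arithmetic implied by the question can be sketched in a few lines. The sketch below is one plausible reading of the data given above, not the chatbot's (or the author's) own working: it assumes the ramp to net zero starts in 2023, that a straight-line decline in emissions abates half of the business-as-usual forcing accrued by 2050, and that the temperature response scales as TCR divided by the doubled-CO2 forcing. The optional final step, scaling by the ratio of observed to predicted warming rates (items 5 and 6), is likewise an assumption about how those two data points were meant to be used.

```python
# Midrange warming prevented by a straight-line path to net zero by 2050.
# All inputs are taken from the numbered data in the text above.
TCR = 1.8        # K per CO2 doubling (item 3)
F_2X = 3.93      # W/m^2 per CO2 doubling (item 4)
TREND = 0.3      # W/m^2 per decade, anthropogenic forcing uptrend (item 1)

START, END = 2023, 2050          # assumed start year for the ramp-down
decades = (END - START) / 10     # 2.7 decades

# Business as usual: forcing keeps accruing at the linear trend.
dF_bau = TREND * decades                 # ~0.81 W/m^2

# Straight-line decline to zero emissions abates half the BAU forcing
# (the area of a triangle versus a rectangle over the same interval).
dF_abated = dF_bau / 2                   # ~0.405 W/m^2

# Transient temperature response to the abated forcing.
dT_prevented = TCR / F_2X * dF_abated    # ~0.19 K

# Optional: scale by the ratio of observed (item 5) to predicted (item 6)
# decadal warming, on the reading that models run hot by that factor.
dT_scaled = dT_prevented * 0.136 / 0.3   # ~0.08 K

print(f"Warming prevented (midrange):      {dT_prevented:.2f} K")
print(f"Scaled by observed/predicted rate: {dT_scaled:.2f} K")
```

On either reading, the warming prevented by 2050 comes out at a small fraction of a degree, which is the point the question is driving at.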

A Misunderstanding Of Clouds Is Driving Global Warming Fervor

by R. Barmby, Apr 10, 2023 in ClimateChangeDispatch


I’ve looked at climate change from both sides now, and I have found common ground between proponents and skeptics of the belief that climate change is largely caused by humans.

When it comes to forecasting global temperatures, distinguished experts in both camps agree a dominant variable cannot be simulated in computer models because clouds get in the way.

Among the proponents is Dr. Bjorn Stevens, a contributing author to the Intergovernmental Panel on Climate Change (IPCC) Assessment Report 5 (2014). Dr. Stevens is also the director at the Max Planck Institute for Meteorology, Hamburg, Germany, and a cloud expert. [emphasis, links added]

In a recent interview, he acknowledged that the contribution of clouds to global warming is overestimated in the IPCC’s “Climate Change 2021: The Physical Science Basis.”

“Clouds are tricksters,” he said, referring to their complexity. However, he said, many scientists use oversimplified representations of clouds in modeling “as a guide because they are easier to simulate.” This makes the climate models less accurate.

On the skeptic side is Dr. Richard S. Lindzen, a former lead author for IPCC Assessment Report 3 and now a vocal critic of the IPCC.

In a recent podcast, the interviewer noted that Lindzen had published sufficient research papers to earn 80 PhDs. (Lindzen humbly declined the praise.)

Lindzen, professor emeritus of Atmospheric Sciences at the Massachusetts Institute of Technology, points out that IPCC models rely on the assumption that water vapor and clouds amplify the greenhouse effect of CO2 in order to produce forecasts of catastrophic global warming.

The IPCC theory is that a warmer atmosphere will have a higher content of water vapor – itself a greenhouse gas – that adds to the warming caused by CO2.