NYT Falsely Blames Climate Change For Hurricane Erin Despite No Evidence

by L. Lueken, Aug 28, 2025 in ClimateChangeDispatch


Rapid intensification of Hurricane Erin isn’t unusual, and attribution studies don’t prove climate change caused it.

The New York Times (NYT) published an article titled “How Climate Change Affects Hurricanes Like Erin,” in which it relies on rapid attribution analysis to claim that climate change is making rapidly intensifying hurricanes more likely, implying that the storm was worsened by global warming. This is false. [emphasis, links added]

Attribution studies are generally not grounded in solid scientific evidence and therefore cannot be proven. Moreover, there is little evidence that rapid intensification is becoming more common.

At the outset, the NYT claimed that Hurricane Erin’s effects, such as they are, “are made worse by global warming,” even though the storm stayed offshore. The storm intensified quickly from a Category 1 to a Category 5 hurricane, and the NYT claimed that “[a]s the planet warms, scientists say that rapidly intensifying hurricanes are becoming ever more likely.”

First, it is important to note that just because a storm is among the most rapidly intensifying on record, it does not mean that there were not similar storms that went unrecorded.

As mentioned in a previous Climate Realism post about Hurricane Erin, hurricane measurement technology is far more advanced today than it was even a few decades ago.

Before the widespread use of Hurricane Hunter flights beginning in the 1970s, when offshore storms were first closely monitored and directly measured throughout their lifespans, other rapidly intensifying storms would not have made the record.

So there is real uncertainty in the historical record.

Beyond that, attribution researchers and the NYT would like to blame hurricane intensification entirely on warm sea surface temperatures, but rapid intensification occurs only when a variety of factors line up just right.

Similar claims were made two years ago concerning Hurricane Otis. That storm also intensified rapidly over a single day, turning into a Category 5 before hitting the west coast of Mexico.

Otis did not intensify under expected conditions; thunderstorm bursts that forecasters were unable to predict are now believed to have been responsible for its rapid intensification.

Just as some scientists say more intense storms are more likely with warming, other scientists say that they will become less likely to form or less likely to strike land.

The NYT neglected to mention these perspectives, focusing its story on the scarier opinions that support the narrative that climate change is responsible for worsening extreme weather events.

In fact, as Climate at A Glance: Hurricanes details, there is no data suggesting hurricanes are becoming more frequent or more intense.

Study Finds Extreme Weather Database Exaggerates Global Disaster Trends

by Climate Discussion Nexus, Aug 28, 2025 in ClimateChangeDispatch 


Disasters don’t count if you don’t count them.

[Image: city flood aftermath]
According to its publishers, the EM-DAT dataset (which stands for Emergency Events Database, so it’s not even a true acronym) lists “data on the occurrence and impacts of over 26,000 mass disasters worldwide from 1900 to the present day.” [emphasis, links added]

Which makes it perfect for studying long-term trends. And what’s even better, for the climate change crowd anyway, is that, as the authors of a 2024 study noted, “There are very strong upward trends in the number of reported disasters.”

But as the same authors noted in the very next sentence, “However, we show that these trends are strongly biased by progressively improving reporting.” Simply put, before 2000, reporting of small disasters that caused fewer than 100 deaths was hit-and-miss.

So, historically, the record of giant disasters that killed hundreds of people or more is reasonably complete, but not the record of small ones.

And the authors of the recent study argue that once they adjust for the effect of underreporting, the trends in disaster-related mortality go away.

The paper, “Incompleteness of natural disaster data and its implications on the interpretation of trends,” by a group of scientists in Sweden, began by noting that they are not the first to point out the problem.

The weird thing is that many authors who have pointed out this massive flaw have then gone ahead and used the data anyway, as though the flaw did not exist, or at least as though they had not noticed it:

“Various authors (Field et al., 2012; Gall et al., 2009; Hoeppe, 2016; Pielke, 2021) have noted that there are reporting deficiencies in the EM-DAT data that may affect trends then proceeded to present trend analyses based on it without correction. Even the EM-DAT operators themselves discourage using EM-DAT data from before 2000 for trend analysis (Guha-Sapir (2023)). Yet recently, Jones et al. (2022) investigated the 20 most cited empirical studies utilising EM-DAT as the primary or secondary data source, and found that with only one exception the mention of data incompleteness was limited to a couple of sentences, if mentioned at all.”

Having made that point, their study then digs into the records and shows that in the post-2000 period, there is a steady pattern relating the frequency of events to the number of fatalities (F) per event.

It follows what statisticians call “power-law behaviour,” in which the more extreme an outcome, the rarer it is, not in a straight-line fashion but in an inverse power relationship, so that extreme outcomes, [like] large numbers of fatalities in a disaster, are far rarer than small ones, tracing a straight line on a logarithmic plot. (For instance, in boating accidents, there are tens of thousands of individuals falling out and drowning for every Titanic.)

Hydrological, meteorological, and geophysical disasters all follow power-law behaviour in recent decades. In earlier decades, the relationship appears to break down, but because low-fatality disasters are missing from the data, not because the relationship did not hold then.
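The frequency-fatality relationship described above can be sketched numerically. This is a minimal illustration with an assumed tail exponent (`ALPHA = 1.5`) and an invented event total, not the study’s actual fit:

```python
import math

# Hypothetical sketch (not the study's data): under a power law, the expected
# number of disasters with at least F fatalities falls off as F ** -alpha.
ALPHA = 1.5  # assumed tail exponent, chosen only for illustration

def expected_count(fatalities, total_events=1_000_000, f_min=1, alpha=ALPHA):
    """Expected number of events with >= `fatalities` deaths (Pareto-style tail)."""
    return total_events * (fatalities / f_min) ** -alpha

small = expected_count(10)     # events with at least 10 deaths
large = expected_count(1000)   # events with at least 1000 deaths

# On a log-log plot the two counts lie on a straight line of slope -alpha,
# which is the signature visible in the post-2000 record.
slope = (math.log(large) - math.log(small)) / (math.log(1000) - math.log(10))
print(round(small), round(large), round(slope, 2))  # -> 31623 32 -1.5
```

Under-reporting of small disasters before 2000 would show up as counts at the low-fatality end falling below this straight line, which is exactly the deficiency the authors identify in the early EM-DAT record.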