A new study indicates nearly all the Northern Hemisphere and Tropical warming in the last 40 years occurred by the late 1990s.
CO2 has risen by about 50 ppm since 1998 (367 to 418 ppm).
Interestingly, upper-air measurements of temperature from balloon-borne sensor radiosonde data, shown below in the image from a new study (Madonna et al., 2022), suggest there was more warming from the early 1980s to late 1990s – when CO2 only rose about 25 ppm (341 to 367 ppm) – than there has been this century.
Radiosonde measurements appear to depict mostly flat temperature trends since 1998 in both the Northern Hemisphere (25°N to 70°N) and the tropics (25°S to 25°N).
According to Dieng et al., 2017, global sea surface temperatures (SST) cooled slightly (-0.006°C/decade) from 2003 to 2013. This reduced the overall 1950-2014 warming rate to 0.059°C per decade.
The Dieng et al., 2017 paper is titled “Sea and land surface temperatures, ocean heat content, Earth’s energy imbalance and net radiative forcing over the recent years.”
The NCAR/HadCRUT4 global SST record from buoys and ARGO floats also shows only modest warming in the last three decades. The natural 2015-’16 Super El Niño event is mostly responsible for the overall increasing rate.
This is the fourth in a series of articles on the IPCC’s AR6 WG1 report. –CCD ed.
Margaret Thatcher helped create the United Nations Intergovernmental Panel on Climate Change (IPCC) in 1988. As an Oxford-trained chemist, she understood scientific principles and was concerned that we “… do not live at the expense of future generations.”
By 2002, the Iron Lady turned against global warming extremism by stating in her book Statecraft: Strategies for a Changing World, “What is far more apparent is that the usual suspects on the left have been exaggerating the dangers and simplifying solutions in order to press their agenda…” [bold, links added]
Thatcher’s comments about exaggeration and simplification were a prescient critique of the IPCC report Climate Change 2021: The Physical Science Basis.
The IPCC uses computer simulations to predict climate dangers and test solutions. An important step in the computer simulation of a real-world physical process is making sure the simulator can replicate the known history of that physical process.
If a computer model can accurately replicate a significant history of a known process, a procedure called hindcasting, it lends credibility to the idea that the correct equations are being used and that the model will be able to predict future events.
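A hedged sketch of what hindcast validation looks like in practice: calibrate a model on the early part of a record, then score it on the withheld later part. Everything below is synthetic and illustrative; the data, the linear model, and the year-2000 cut-off are assumptions, not anything taken from the IPCC’s models.

```python
import numpy as np

# Toy "process": a noisy linear warming trend (synthetic data, for
# illustration only -- not a real climate record).
rng = np.random.default_rng(0)
years = np.arange(1950, 2021)
true_trend = 0.015                      # degrees C per year (assumed)
temps = true_trend * (years - 1950) + rng.normal(0, 0.05, years.size)

# Hindcast test: calibrate the model on the early period only...
train = years < 2000
coeffs = np.polyfit(years[train], temps[train], deg=1)

# ...then check how well it replicates the held-out later period.
predicted = np.polyval(coeffs, years[~train])
rmse = np.sqrt(np.mean((predicted - temps[~train]) ** 2))
print(f"hold-out RMSE: {rmse:.3f} C")
```

A small hold-out error lends the model credibility for forecasting; a large one flags that the fitted equations do not capture the process.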
Does anyone wonder where all the global warming destruction is? After all, the media are unrelenting in telling us how much climate change caused by man is affecting us. Yet no existential threat has emerged. There’s something off with the story.
The climate alarmists have based their predictions of doom on computer models that have been projecting global temperature increases, the likes of which, they tell us, are unsustainable. We must cut our carbon dioxide emissions, even if (actually, especially if) it hurts developed world economies.
This is the narrative we’re bombarded with on a daily basis. And it’s wrong.
Those models that have been used to fuel the fright are, without a doubt, unreliable. According to a recent story published in Nature magazine written by a group of climate modelers, “a subset of the newest generation of models are ‘too hot’ and project climate warming in response to carbon dioxide emissions that might be larger than that supported by other evidence.”
The authors, though, are careful to preserve the narrative, warning that “whereas unduly hot outcomes might be unlikely, this does not mean that global warming is not a serious threat.” They can’t help themselves.
While the modelers in the Nature article point specifically to problems with “a subset of the newest generation of models,” it’s obvious that the older models are no better. Last fall we covered a ScienceDaily report which noted that some researchers had concluded “a possible flaw in climate models” had been exposed, as the models failed to reproduce an observed event.
“When the history of climate modeling comes to be written in some distant future,” economist Robert L. Bradley Jr. wrote some months ago for the American Institute for Economic Research, “the major story may well be how the easy, computable answer turned out to be the wrong one, resulting in overestimated warming and false scares from the enhanced (man-made) greenhouse effect.”
Permafrost developed from Termination Ia (Bölling interstadial, 14.5 cal ka BP) in Northern Iceland in response to deglaciation. Permafrost persisted or even re-extended during the Preboreal cooling events (at 11.2, 10.3 and 9.3 cal ka BP), synchronous with pulsed glacial advances. It disappeared below 1000 masl during the Thermal Optimum (8-5 cal ka BP). The present-day re-extension was controlled by the cooling related to the Little Ice Age, and particularly the Maunder solar Minimum. Continuous permafrost is stable above 1000 masl, but is today melting between 900 and 800 masl. Discontinuous permafrost is vanishing today with the recent climate warming (from 1970), especially in palsa bogs and on valley slopes with thermokarstic mass wasting.
The recession of the Hornbreen-Hambergbreen glaciers (Hornsund, Svalbard) will lead to the formation of a strait between the Greenland and Barents Seas within a few decades. We provide evidence for the earlier existence of this strait, in the Early–Middle Holocene and presumably since 1.3 ka cal. BP until glacier advance 0.7 ± 0.3 ka or earlier. Radiocarbon dating of mollusc shells from the ground moraines in the Hornbreen forefield indicates the existence of the marine environment at the contemporary glacierized head of Hornsund since 10.9 ka cal. BP or earlier due to glacier retreat. The gap in the radiocarbon dates between 3.9 and 1.3 ka cal. BP and the published results of 10Be exposure dating on Treskelen suggest the strait’s closure after glacier advance in the Neoglacial. Subsequent re-opening occurred around 1.3 ka cal. BP, but according to 10Be dates from Treskelen, the strait has again been closed since ca. 0.7 ± 0.3 ka or earlier. The oldest known surge of Hornbreen occurred around 1900. Analysis of Landsat satellite images, morphometric indicators characterizing the glacier frontal zones and previous studies indicate one surge of Hambergbreen (1957–1968) and five re-advances of Hornbreen in the 20th century (after 1936, between 1958 and 1962, in 1986–1990, 1998–1999, 2011). While the warmer Holocene intervals might be a benchmark for the effects of future climate change, glacier dynamics in post-Little Ice Age climate warming seems to be an analogue of glacier retreats and re-advances in the earlier periods of the Holocene.
Yes, I do know that acceleration, technically, means just a change in velocity. But, in everyday English, we use acceleration to mean an increase in velocity (speeding up) and deceleration to mean a decrease in velocity (slowing down). I mention acceleration and deceleration because one of the major talking points of IPCC-reported findings about sea level rise, the incessant media mantra, is that “Sea Level Rise is Accelerating”.
Is sea level rising? Yes, of course it is. It has been rising since about 1750-1775, coinciding with the end of the Little Ice Age. This is widely accepted as shown below:
There are a few things to note at first glance. The ice floe continued to decrease in thickness into November. Its thickness then started to increase, but is currently still less than 2 meters. Also, the snow depth has gradually been increasing and (apart from some data glitches!) is now ~38 cm. Finally, for the moment at least, the ice surface temperature has been slowly warming since mid-February and is now ~-11°C.
New studies on the Atlantic current system assess the threshold between natural fluctuations and a climate change-driven evolution
25 April, 2022/Kiel, Germany. With a new publication in the scientific journal Nature Climate Change, researchers from Kiel once again contribute to the understanding of changes in the Atlantic Meridional Overturning Circulation (AMOC) – also known as the “Gulf Stream System”. It is important both for the global climate and for climate events in Europe. The authors focus on the question of whether human-induced climate change is already slowing down this oceanic circulation. According to the new study, natural variations are still dominant. Improved observation systems could help detect human influences on the current system at an early stage.
Is the Atlantic Meridional Overturning Circulation (AMOC) slowing down? Is this system of ocean currents, which is so important for our climate, likely to come to a halt in the future? Are the observed variations a natural phenomenon or are they already caused by human-induced climate change? Researchers from various scientific disciplines use a wide range of methods to better understand the gigantic oceanic circulation.
“The AMOC provides Europe with a mild climate and determines seasonal rainfall patterns in many countries around the Atlantic. If it weakens over the long term, this will also affect our weather and climate. Other consequences could be a faster rise in sea levels at some coasts or a reduction in the ocean’s ability to take up carbon dioxide and mitigate climate change”, Professor Dr. Mojib Latif, Head of the Research Unit: Marine Meteorology at GEOMAR Helmholtz Centre for Ocean Research Kiel, explains. “We depend on the AMOC in many ways – but so far, we can only guess how it will develop, and whether and how strongly we humans ourselves will push it towards a tipping point where an unstoppable collapse will take its course.”
Using observational data, statistical analyses and model calculations, a team led by Professor Latif has therefore examined changes in the current system over the past one hundred years in greater detail. The results have now been published in the scientific journal Nature Climate Change. According to the researchers, part of the North Atlantic is cooling – a striking contrast to the majority of ocean regions. All evaluations indicate that since the beginning of the 20th century, natural fluctuations have been the primary reason for this cooling. Nonetheless, the studies indicate that the AMOC has started to slow down in recent decades.
A video of Indian historian and journalist Vijay Prashad pointing out US double standards on climate change vis-à-vis developing nations is being widely shared on social media to highlight the “colonial mindset” of the West.
In the video, recorded during Prashad’s participation in a panel discussion at the United Nations Climate Change Conference (COP26), the historian pointed out that the United States makes up 4 to 5% of the world’s population but was still using 25% of global resources.
“You (United States) love lecturing us because you have a colonial mentality. Then there are colonial structures and institutions that lend us money, which is our money. The IMF comes to our societies, you give us our money back as debt and lecture us on how we should live,” Prashad said at the summit held in Glasgow, Scotland, from October 31 to November 13.
Prashad further said that climate change talks at similar summits could not succeed due to this “colonial mentality”. He also talked about the United States attacking China over its coal production and emissions targets.
Every year, tropical hurricanes affect North and Central American wildlife and people. The ability to forecast hurricanes is essential in order to minimize the risks and vulnerabilities in North and Central America. Machine learning is a relatively new tool that has been applied to make predictions about different phenomena. We present an original framework utilizing machine learning with the purpose of developing models that give insights into the complex relationship between the land–atmosphere–ocean system and tropical hurricanes. We study the activity variations in each Atlantic hurricane category as tabulated and classified by NOAA from 1950 to 2021. By applying wavelet analysis, we find that category 2–4 hurricanes formed during the positive phase of the quasi-quinquennial oscillation. In addition, our wavelet analyses show that super Atlantic hurricanes of category 5 strength were formed only during the positive phase of the decadal oscillation. The patterns obtained for each Atlantic hurricane category clustered historical hurricane records into seasons of high and null tropical hurricane activity. Using the observational patterns obtained by wavelet analysis, we created a long-term probabilistic Bayesian machine learning forecast for each of the Atlantic hurricane categories. Our results imply that if all such natural activity patterns and tendencies for Atlantic hurricanes continue and persist, the next groups of hurricanes over the Atlantic basin will begin between 2023 ± 1 and 2025 ± 1, 2023 ± 1 and 2025 ± 1, 2025 ± 1 and 2028 ± 1, and 2026 ± 2 and 2031 ± 3, for hurricane strength categories 2 to 5, respectively.
Our results further point out that the super hurricanes of the Atlantic, of category 5, develop in five geographic areas with hot deep waters that are very well defined: (I) the east coast of the United States, (II) the northeast of Mexico, (III) the Caribbean Sea, (IV) the Central American coast, and (V) the north of the Greater Antilles.
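The paper’s wavelet and Bayesian machinery is not reproduced here, but the core idea of hunting for a dominant oscillation period in an annual activity series can be sketched with a plain power spectrum. The series below is synthetic, with a ~5-year (quasi-quinquennial) cycle built in by assumption; it merely illustrates the kind of periodicity detection the authors describe.

```python
import numpy as np

# Synthetic annual "hurricane activity" series with a built-in ~5-year
# oscillation, standing in for the NOAA counts (an assumption, not data).
years = np.arange(1950, 2022)
signal = np.sin(2 * np.pi * years / 5.0)          # 5-year cycle (assumed)
rng = np.random.default_rng(1)
series = signal + rng.normal(0, 0.3, years.size)

# Power spectrum of the demeaned series; the dominant peak should sit
# near a period of 5 years.
series = series - series.mean()
power = np.abs(np.fft.rfft(series)) ** 2
freqs = np.fft.rfftfreq(series.size, d=1.0)       # cycles per year
dominant_period = 1.0 / freqs[np.argmax(power[1:]) + 1]
print(f"dominant period: {dominant_period:.1f} years")
```

A wavelet analysis goes further than this by also locating *when* each oscillation is in its positive phase, which is what the authors use to time the forecast windows.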
This week, I’m grateful for the opportunity to participate in a conference in Brussels on “science advice under pressure,” organized by the European Commission’s Science Advisory Mechanism (it is streaming online if you’d like to join in today and tomorrow). I am on a panel today with Anne Glover (former science advisor to the European Commission), Matthew Flinders (University of Sheffield) and Lara Pivodic (Vrije Universiteit Brussels). Our moderator has asked us to begin today’s conversation by answering the following question:
What are your experiences (either personal or among colleagues) of coming under pressure and facing hostility as a result of being a prominent science advisor giving advice in public?
As I have considered this question, my first response was: Have a seat, grab a cup of coffee, and how much time do you have?
While sea ice has been declining in the Greenland Sea (east of the island), the Chukchi Sea (eastern Siberia) shows a very different trend in sea ice extent over the past year. Such deviations have occurred repeatedly since the year 2000.
Overall, the 2021 extent was very close to the 1991-2020 mean, well above the lowest value in 2012, and also above what was recorded in 2020.
Beijing [China], April 27 (ANI): Despite pledges to reach net-zero emissions by 2060, China is still leading the world in building new coal plants, a major annual survey showed.
Global Energy Monitor’s (GEM) eighth annual survey of the world’s coal plant pipeline on Tuesday reveals that China, the world’s top greenhouse gas polluter, continued to lead all countries in the domestic development of new coal plants, commissioning more new coal capacity in 2021 than the rest of the world combined, reported Straits Times.
Adding to the worries, China’s coal consumption is not expected to peak until 2025. China has the world’s largest fleet of coal power plants, and this building of new plants poses a critical climate risk.
Lauri Myllyvirta, lead analyst for the research organisation Centre for Research on Energy and Clean Air, which contributed to GEM’s report, said: “The power industry’s plan, which appears to have [Chinese] government backing, at least for now, is for coal power capacity to increase until 2030. So new plants are adding more capacity, not just replacing retirements. Last year saw retirements, in fact, slow down.”
By the end of 2021, a total of 176GW of coal capacity was under construction in 20 countries, which is slightly less than in 2020. China represented more than half (52 per cent) of that capacity, and countries in South Asia and Southeast Asia made up a total of 37 per cent.
Flora Champenois from GEM said, “The coal plant pipeline is shrinking, but there is simply no carbon budget left to be building new coal plants. We need to stop, now.”
The report also found that total coal power capacity under development declined 13 per cent last year. Moreover, with the ongoing war in Ukraine, some countries are delaying coal plant retirements as they scramble for energy supplies to keep the lights on.
According to the analysts, this move might temporarily set back global efforts to phase out coal, though soaring coal prices might also prompt some nations to speed up investment in green energy.
Myllyvirta told Straits Times, “The world is experiencing a fossil fuel price shock, leading to overall reduced fossil fuel demand and accelerated plans for clean energy development.
“European countries, in particular, have announced very ambitious plans for clean energy and energy efficiency as a part of the effort to end reliance on fossil fuel imports from Russia. These factors will lower coal demand in the coming years,” he added. (ANI)
The general view in society is that human emissions of CO₂ are the all-determining cause of the increased concentration in the atmosphere. Most scientists and even many climate skeptics do not question this. There is some debate about how long this extra CO₂ will stay in the atmosphere, but that’s about it. That’s remarkable, as several scientists have published extensively on the flaws and inconsistencies of this narrative. By looking at the significant increase in the CO₂ flows from and to land and sea, it’s in fact easy to see that the CO₂ rise is largely due to natural causes.
The idea that human CO₂ is the all-determining cause of the increased concentration is based on the assumption that the natural inflows and outflows are always and exactly in equilibrium with each other. Based on this perfect-equilibrium thinking, human emissions, even though they are relatively small, cause a perturbation year after year. In the so-called global carbon budget, about 10 PgC of CO₂ is added every year, while the absorption flux has only increased by 6 PgC/yr (1 Petagram = 1 Gigaton = 1 billion tons). The concentration therefore continues to rise indefinitely as long as people emit CO₂.
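A quick check of the quoted budget arithmetic, using the standard conversion of roughly 2.13 PgC per ppm of atmospheric CO₂ (the conversion factor is an outside assumption here, not a figure from the text):

```python
# Back-of-envelope check of the quoted carbon-budget numbers.
# Conversion: ~2.13 PgC of carbon corresponds to 1 ppm of CO2 in the
# atmosphere (standard figure; treated as an assumption here).
PGC_PER_PPM = 2.13

emissions = 10.0        # PgC/yr added (from the text)
extra_uptake = 6.0      # PgC/yr increase in absorption (from the text)

airborne = emissions - extra_uptake          # PgC/yr left in the air
rise_ppm = airborne / PGC_PER_PPM            # implied ppm/yr rise
print(f"{airborne:.1f} PgC/yr airborne -> ~{rise_ppm:.1f} ppm/yr")
```

The ~4 PgC/yr imbalance corresponds to a rise of roughly 1.9 ppm/yr, which is the order of the observed annual increase; the dispute in this article is over *why* the uptake flux behaves as it does, not over this arithmetic.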
To support this idea it is also assumed that human emissions accumulate in the atmosphere. Where you would expect a single residence time for a reservoir with in- and outflows, the IPCC models calculate with a small residence time of about 4 years for natural CO₂ and a large one for human CO₂: “The removal of all the human-emitted CO2 from the atmosphere by natural processes will take a few hundred thousand years (high confidence)”.
Several scientists, including Murray Salby and Hermann Harde, have published extensively on the flaws and inconsistencies of this narrative. They also showed that it is very illogical to think that a slight increase in the up-flux cannot be compensated by a larger down-flux. It’s like increasing the heat energy flow in a house by 5% and expecting that the temperature will keep on rising forever.
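The house analogy can be made concrete with a one-box model in which the outflow is proportional to the level itself: under a constant inflow, the level then plateaus at inflow × residence time instead of rising forever. The numbers below are purely illustrative, not a claim about the real carbon cycle.

```python
# One-box model of the house-heating analogy: inflow E is constant and
# outflow is proportional to the level C (removal time tau). Under this
# simple assumption the level plateaus at E * tau instead of rising
# without bound. Numbers are purely illustrative.
E = 10.0      # inflow per year
tau = 4.0     # removal time in years (the short residence time cited)

C = 0.0
dt = 0.1
for _ in range(int(200 / dt)):          # integrate 200 years
    C += dt * (E - C / tau)

print(f"level after 200 yr: {C:.2f} (equilibrium {E * tau:.1f})")
```

Whether the real atmosphere behaves like this single-reservoir sketch is exactly what the competing models disagree about; the sketch only shows that a proportional outflow caps the level.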
Despite this, belief in the IPCC’s model for the increase in concentration is persistent. In this article we will focus on one of the strangest assumptions: the idea that the in- and outflows are stable and in perfect equilibrium. Although they are about 20 times larger than anthropogenic fluxes and have different drivers for up and down, natural flows are not included in the material balance used in the models.
It is in fact easy to see that the increase in the CO₂ concentration is for the most part the result of natural changes, based on the following unmistakable observations.
1. Fluxes to and from land and sea have increased significantly since 1750.
2. The increase in these fluxes is natural, i.e. not due to human emissions.
3. The growth of the natural fluxes can only take place at a higher concentration in the atmosphere.
A new study published in Geophysical Research Letters highlights the abysmal model performance manifested in the latest Intergovernmental Panel on Climate Change report (AR6). The 38 CMIP6 general circulation models (GCMs) fail to adequately simulate even the most recent (1980-2021) warming patterns over 60 to 81% of the Earth’s surface.
Dr. Scafetta places particular emphasis on the poor performance of the highly uncertain estimates (somewhere between 1.83 and 5.67°C) of equilibrium climate sensitivity (ECS) and their data-model agreement relative to 1980-2021 global warming patterns.
The worst-performing ECS estimates are the ones projecting 3-4.5°C and 4.5-6°C warming in response to doubled CO2 concentrations (to 560 ppm) plus feedbacks, as the 1980-2021 temperature trends are nowhere close to aligning with these trajectories.
Instead, the projected global warming by 2050 (~2°C relative to 1750) associated with the lowest ECS estimates and implied by the warming observed over the last 40+ years is characterized as “unalarming” even with the most extreme greenhouse gas emissions (no mitigation efforts undertaken) growth rate.
In addition to the conclusion that “no model group succeeds reproducing observed surface warming patterns,” poor modeling of heat transfer physics, ocean and atmospheric circulation patterns, and polar sea ice processes is also evident in the latest IPCC report.
“Accurately reproducing regional temperature differences over the past 40+ years is beyond the capability of climate model simulations, and even fails for major ocean basins and continents.”
The fundamental modeling failures in simulating responses to sharply rising greenhouse gas emissions over the last 40+ years “calls into question model-based attribution of climate responses to anthropogenic forcing.”
A 2021 study appearing in Nature Communications by Buentgen et al. reports on the results of a double-blind experiment in which 15 different groups produced 15 different Northern Hemisphere summer temperature reconstructions. Each group used the same network of regional tree-ring width datasets.
What’s fascinating is that all groups, though using the same data network, came up with a different result. When it comes to deriving temperatures from tree rings, much depends on individual approach and interpretation. Sure, we can follow the science, but whose results?
The 15 groups (referred to as R1–R15) were challenged with the same task of developing the most reliable NH summer temperature reconstruction for the Common Era from nine high-elevation/high-latitude TRW datasets (Fig. 1). [Image cropped from Figure 1, Buentgen et al.]
Contents: Air temperatures · Surface: spatial pattern · Lower Troposphere: monthly · Lower Troposphere: annual means · Surface: monthly · Surface: annual means · Error, consistency and quality · Surface versus lower Troposphere · Lower Troposphere: land versus ocean · By altitude · Zonal air temperatures · Polar air temperatures
Received on 18 February 2022; revised on 20 March 2022; accepted on 22 March 2022
The “100,000-year problem” refers to an apparent unexplained change in the frequency of interglacial periods which occurred about a million years ago. Before that, interglacial periods seemed to occur about every 41,000 years, in line with the obliquity Milankovitch cycle. But after that, they seemed to occur about every 100,000 years, in line with the orbital inclination Milankovitch cycle. Examination of the data shows that there never was a 41,000-year cycle, and that there is no 100,000-year cycle; the most influential cycle is the approximately 21,000-year precession cycle, which is the major factor in the cycles of insolation at higher latitudes. Insolation at 65°N is generally regarded as the most significant of these. Inspection of the data shows that every glacial termination (start of an interglacial period) began at a time when insolation at 65°N increased from a low point in its cycle. That not every such cycle triggered a new interglacial period underlines the chaotic, non-linear nature of Earth’s climate. Until about a million years ago, this cycle occasionally “missed a beat”, making the interglacial frequency average about 41,000 years. After that, the cycle started missing more “beats”, making the interglacial frequency average about 100,000 years. There never was an actual 41,000-year or 100,000-year interglacial cycle.
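The “missed beat” arithmetic is simple to check: if terminations can only begin on a roughly 21,000-year precession beat, the average spacing is a multiple of that beat. The skip patterns below are illustrative choices, not reconstructions from ice-core data.

```python
# Missed-beat arithmetic for precession-paced glacial terminations: if a
# termination can only begin on a ~21 kyr insolation beat, the average
# interglacial spacing is a multiple of that beat. Skip patterns are
# illustrative, not reconstructed from data.
beat = 21.0                       # kyr, approximate precession period

early_skips = [2, 2, 2, 2, 2]     # "41 kyr world": every 2nd beat
late_skips = [5, 4, 5, 5, 5]      # "100 kyr world": mostly every 5th

early_avg = beat * sum(early_skips) / len(early_skips)
late_avg = beat * sum(late_skips) / len(late_skips)
print(f"early spacing ~{early_avg:.0f} kyr, late spacing ~{late_avg:.0f} kyr")
```

Skipping every other beat gives an average spacing near 42 kyr, and skipping roughly four of five gives about 100 kyr, close to the two quoted “cycle” lengths without any 41,000- or 100,000-year cycle actually existing.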
The claimed warming rate during the (1998-2001 to 2012-’13) “hiatus” ranged from -0.07°C to +0.17°C per decade.
In late 2012, the IPCC had an ongoing dilemma about what to do about the uncooperative global temperatures. The HadCRUT3 data set government bureaucrats had been using since the first report in 1990 actually showed that global mean surface temperatures had been declining since 1998. This was not going to further the we-must-act-on-global-warming-now narrative, of course.
Enter Phil Jones, the global temperature data set overseer at East Anglia’s Climate Research Unit (CRUTEM). He’s the scientist who famously admitted that when the temperature data doesn’t exist, they are “mostly made up.”
Jones’s CRU and the Met Office (Hadley) then jointly constructed the newer HadCRUT4 version to help advance the narrative. This version changed the data just in time for the 5th IPCC assessment (AR5, 2013). The 1998-2001 temperatures were allowed to stay the same, but an additional 0.1 to 0.2°C was tacked on to anomalies from 2002 onwards. The effect was to transform the 1998-2012 slight cooling in HadCRUT3 into a 0.04°C per decade warming in HadCRUT4.
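The effect of such a step adjustment on a fitted trend is easy to illustrate with synthetic anomalies. The series below is a made-up flat record, not actual HadCRUT data, and the 0.15°C step is an assumed mid-range value; the point is only the mechanism, not the exact published numbers.

```python
import numpy as np

# Synthetic stand-in for a flat 1998-2012 anomaly series (not real
# HadCRUT data) to illustrate how a step adjustment changes an OLS trend.
years = np.arange(1998, 2013)
flat = np.zeros(years.size)                 # "no warming" baseline

adjusted = flat.copy()
adjusted[years >= 2002] += 0.15             # step added from 2002 onward

trend_flat = np.polyfit(years, flat, 1)[0] * 10        # degC per decade
trend_adj = np.polyfit(years, adjusted, 1)[0] * 10
print(f"flat: {trend_flat:+.3f} C/decade, adjusted: {trend_adj:+.3f} C/decade")
```

A constant offset applied to only part of a record changes its least-squares slope, which is why the timing of an adjustment matters as much as its size.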
The Climate Feedback website critiques my CO2 Coalition article “Attributing global warming to humans.” Their fact check is here. Like most “fact checks” these days, it is a thinly disguised opinion piece. The statement that they claim is incorrect is:
“There is no evidence, other than models, that human CO2 emissions drive climate change and abundant evidence that the Sun, coupled with natural climate cycles, drives most, if not all, of recent climate changes, as described in Connolly, et al., 2021.” [emphasis added]
They cleverly leave out the last phrase: “as described in Connolly, et al., 2021,” and then immediately assert “Solar irradiance has had a negligible impact on Earth’s climate since the industrial era.” This is followed by no evidence other than an appeal to the mythical “consensus.”
Later in the article, they say Connolly et al. uses simple linear regression to establish a link between solar irradiance and surface temperature. Connolly et al. does not state that the Sun controls the climate, or that humans do; it simply shows that, using available evidence, solar variability (specifically TSI, or Total Solar Irradiance, variability) could account for anywhere from 0 to 100% of the warming since the Little Ice Age (the so-called “pre-industrial” era). One of the main points of Connolly et al. is that the IPCC and the so-called “consensus” ignore two critical areas of current research. First, they ignore the uncertainty in our estimate of surface warming since the Little Ice Age, and second, they ignore the considerable uncertainty in long-term solar variability trends, both recently and since the Little Ice Age. As the paper states, the amount of 20th-century warming that can be simulated as due to solar variability depends upon the surface temperature dataset and the solar TSI model used, and there are many versions of both. Suffice it to say, while the exact influences of human activities and of solar variability on climate change are both unknown, no one can claim the solar influence is negligible. The correct answer is we don’t know.
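For concreteness, the kind of simple linear regression attributed to Connolly et al. can be sketched on synthetic series. The TSI values, temperature response, and assumed sensitivity below are all invented for illustration; nothing here is the paper's actual data or result.

```python
import numpy as np

# Sketch of regressing temperature anomalies on TSI anomalies, the kind
# of simple linear fit discussed above. Both series are synthetic.
rng = np.random.default_rng(2)
n = 150                                        # years of data (assumed)
tsi = rng.normal(0.0, 0.5, n)                  # TSI anomaly, W/m^2
temps = 0.2 * tsi + rng.normal(0.0, 0.1, n)    # assumed sensitivity 0.2

# Ordinary least squares via numpy's least-squares solver.
A = np.column_stack([tsi, np.ones(n)])
(slope, intercept), *_ = np.linalg.lstsq(A, temps, rcond=None)

# Fraction of temperature variance "explained" by TSI (r squared).
r2 = np.corrcoef(tsi, temps)[0, 1] ** 2
print(f"slope: {slope:.2f} C per W/m^2, r^2: {r2:.2f}")
```

The fitted slope and r² depend entirely on which TSI series and which temperature dataset go into the fit, which is exactly the sensitivity to dataset choice that the paragraph above describes.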
It was supposed to be a groundbreaking forecast, the early prediction of the weather phenomena El Niño and La Niña. Both affect the weather in very different ways.
It would have been so nice to know a year in advance what conditions would prevail at a later date. On November 4, 2019, the Potsdam Institute for Climate Impact Research published what was held as groundbreaking news. Thanks to its new algorithms and a lot of computing power, it was now possible to predict an El Niño or a La Niña a long time in advance. The hit rate was supposed to be 80%.
Unfortunately, one year later exactly the opposite of what was predicted in fact happened, the German Klimaschau reported. Science is settled? Well, maybe not.
Since then, things have been quiet about these PIK long-term forecasts. The US agency NOAA is much more cautious, both in terms of the lead time and the probability of occurrence. Perhaps they don’t have algorithms as good, or the computing power, that Potsdam has? In any case, on Twitter, US meteorologist Ryan Maue sees a good chance that a third La Niña will follow the two recent consecutive ones.
The last time this happened was 22 years ago at the turn of the millennium.
Guest post by Christopher Essex, Emeritus Professor of Mathematics and Physics, University of Western Ontario.
It is well known that daytime winter temperatures on Earth can fall well below -4°F (-20°C) in some places, even in midlatitudes, despite warming worries. Sometimes the surface can even drop below -40°F (-40°C), which is comparable to the surface of Mars. What is not so well known is that such cold winter days are colder than they would be with no atmosphere at all!
How can that be if the atmosphere is like a blanket, according to the standard greenhouse analogy? If the greenhouse analogy fails, what is climate?
Climate computer models in the 1960s could not account for this non-greenhouse-like picture. Modern computer models are better than those old models, but the climate implications of an atmosphere that cools as well as warms have not been embraced. Will computer models be able to predict climate once they are? The meteorological program for climate has been underway for more than 40 years. How did it do?
Feynman, Experiment and Climate Models

“Model” is used in a peculiar manner in the climate field. In other fields, models are usually formulated so that they can be found false in the face of evidence. From fundamental physics (the Standard Model) to star formation, a model is meant to be put to the test, no matter how meritorious.
The average sea ice cover at the end of March is the metric used to compare ‘winter’ ice to previous years or decades, not the single-day date of ‘most’ ice. This year, March ended with 14.6 million km² of sea ice, most of which (but not all) is critical polar bear habitat. Ice charts showing this are below.
But note that ice over Hudson Bay, which is an almost-enclosed sea used by thousands of polar bears at this time of year, tends to continue to thicken from March into May: these two charts for 2020 show medium green becoming dark green, indicating ice >1.2 m thick, even as some areas of open water appear.
Geology: a more than fascinating science … and a diverse one