
Researchers uncover a possible explanation for the abrupt temperature spike in 2023: a reduction in low-level cloud cover diminishes Earth’s capacity to reflect solar radiation.
Rising sea levels, melting glaciers, and marine heatwaves—2023 broke numerous alarming records. Among them, the global mean temperature climbed to nearly 1.5°C above preindustrial levels, marking an unprecedented high. Researchers face a significant challenge in pinpointing the causes of this sudden spike. While factors such as human-driven greenhouse gas accumulation, the El Niño weather phenomenon, and natural events like volcanic eruptions explain much of the warming, they don’t fully account for it.
Notably, there remains an unexplained gap of about 0.2°C in the global temperature rise. A team from the Alfred Wegener Institute proposes a compelling hypothesis: the planet has become less reflective due to a decline in certain types of clouds. This reduction in reflectivity may help explain the additional warming.
“In addition to the influence of El Niño and the expected long-term warming from anthropogenic greenhouse gases, several other factors have already been discussed that could have contributed to the surprisingly high global mean temperatures since 2023,” says Dr Helge Goessling, lead author of the study from the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI): e.g. increased solar activity, large amounts of water vapor from a volcanic eruption, or fewer aerosol particles in the atmosphere. But even when all these factors are combined, there is still 0.2 degrees Celsius of warming with no readily apparent cause.

“The 0.2-degree-Celsius ‘explanation gap’ for 2023 is currently one of the most intensely discussed questions in climate research,” says Helge Goessling. In an effort to close that gap, climate modelers from the AWI and the European Centre for Medium-Range Weather Forecasts (ECMWF) took a closer look at satellite data from NASA, as well as the ECMWF’s own reanalysis data, in which a range of observational data is combined with a complex weather model. In some cases, the data goes back to 1940, permitting a detailed analysis of how the global energy budget and cloud cover at different altitudes have evolved.
“What caught our eye was that, in both the NASA and ECMWF datasets, 2023 stood out as the year with the lowest planetary albedo,” says co-author Dr Thomas Rackow from the ECMWF. Planetary albedo describes the percentage of incoming solar radiation that is reflected back into space after all interactions with the atmosphere and the surface of the Earth. “We had already observed a slight decline in recent years. The data indicates that in 2023, the planetary albedo may have been at its lowest since at least 1940.” This would worsen global warming and could explain the ‘missing’ 0.2 degrees Celsius. But what caused this near-record drop in planetary albedo?
Decline in lower-altitude clouds reduces Earth’s albedo
The albedo of the surface of the Earth has been in decline since the 1970s – due in part to the decline in Arctic snow and sea ice, which also means fewer white areas to reflect back sunlight. Since 2016, this has been exacerbated by sea-ice decline in the Antarctic. “However, our analysis of the datasets shows that the decline in surface albedo in the polar regions only accounts for roughly 15 percent of the most recent decline in planetary albedo,” Helge Goessling explains.
And albedo has also dropped markedly elsewhere. In order to calculate the potential effects of this reduced albedo, the researchers applied an established energy budget model capable of mimicking the temperature response of complex climate models. What they found: without the reduced albedo since December 2020, the mean temperature in 2023 would have been approximately 0.23 degrees Celsius lower.
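The study's energy budget emulator is not reproduced here, but the scale of the effect can be sketched with a zero-dimensional energy balance calculation. The solar constant and feedback parameter below are illustrative assumptions, not figures from the paper:

```python
# Back-of-envelope sketch (NOT the study's emulator): equilibrium
# warming implied by a small drop in planetary albedo.
S0 = 1361.0      # W/m^2, total solar irradiance (assumed value)
LAMBDA = 1.2     # W/m^2 per K, net climate feedback parameter (assumed)

def warming_from_albedo_drop(delta_albedo, s0=S0, lam=LAMBDA):
    """Equilibrium surface warming for an albedo decrease delta_albedo."""
    delta_forcing = delta_albedo * s0 / 4.0  # extra absorbed solar flux, W/m^2
    return delta_forcing / lam               # kelvin

# An albedo decline of ~0.1 percentage point (0.001 in absolute terms):
print(round(warming_from_albedo_drop(0.001), 2))  # -> 0.28
```

With these assumed numbers, an albedo drop of a few tenths of a percentage point yields warming on the order of the 0.2°C gap discussed in the article; this illustrates the plausibility of the mechanism, not the authors' actual result.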
Implications of Lower Cloud Cover
One trend appears to have significantly affected the reduced planetary albedo: the decline in low-altitude clouds in the northern mid-latitudes and the tropics. In this regard, the Atlantic particularly stands out, i.e., exactly the same region where the most unusual temperature records were observed in 2023. “It’s conspicuous that the eastern North Atlantic, which is one of the main drivers of the latest jump in global mean temperature, was characterized by a substantial decline in low-altitude clouds not just in 2023, but also – like almost all of the Atlantic – in the past ten years.” The data shows that the cloud cover at low altitudes has declined, while declining only slightly, if at all, at moderate and high altitudes.
The fact that mainly low clouds and not higher-altitude clouds are responsible for the reduced albedo has important consequences. Clouds at all altitudes reflect sunlight, producing a cooling effect. But clouds in high, cold atmospheric layers also produce a warming effect because they keep the warmth emitted from the surface in the atmosphere. “Essentially it’s the same effect as greenhouse gases,” says Helge Goessling. But lower clouds don’t have the same effect. “If there are fewer low clouds, we only lose the cooling effect, making things warmer.”
But why are there fewer low clouds? Lower concentrations of anthropogenic aerosols in the atmosphere, especially due to stricter regulations on marine fuel, are likely a contributing factor. As condensation nuclei, aerosols play an essential part in cloud formation, while also reflecting sunlight themselves. In addition, natural fluctuations and ocean feedbacks may have contributed. Yet Helge Goessling considers it unlikely that these factors alone suffice and suggests a third mechanism: global warming itself is reducing the number of low clouds.
“If a large part of the decline in albedo is indeed due to feedbacks between global warming and low clouds, as some climate models indicate, we should expect rather intense warming in the future,” he stresses. “We could see global long-term climate warming exceeding 1.5 degrees Celsius sooner than expected to date. The remaining carbon budgets connected to the limits defined in the Paris Agreement would have to be reduced accordingly, and the need to implement measures to adapt to the effects of future weather extremes would become even more urgent.”
Reference: “Recent global temperature surge intensified by record-low planetary albedo” by Helge F. Goessling, Thomas Rackow and Thomas Jung, 5 December 2024, Science.
DOI: 10.1126/science.adq7280
30 Comments
Thank you for this information.
“While factors such as human-driven greenhouse gas accumulation, the El Niño weather phenomenon, and natural events like volcanic eruptions explain much of the warming, they don’t fully account for it.”
The answer may be staring them in the face, but because of poor practices of data processing, it isn’t readily apparent. Nowhere in the article, or linked abstract, is there any mention of the precision (aka margin of error) in the measurements. When calculations are performed, there are accepted practices for the propagation of error, where the errors are always additive EXCEPT when the same thing is measured multiple times with the same instrument, under the same conditions, to obtain an average.
In climatology, many different thermometers are used to measure different moving air masses of different densities, pressures, water content, and temperature. Just as Heraclitus observed that one never steps into the same river twice, meteorologists/climatologists never measure the same air parcel even twice with the same thermometer over the course of a year. The properties of even the same air mass, let alone different air masses, change with location and time because the things that affect temperature change with the seasons; the thermometer calibration may change with temperature and electronic devices may ‘drift’ as components age; the buffering effect of varying absolute humidity is not taken into account, and lastly, the Stevenson Screens, which hold the thermometers, age and can be affected by dust, snow, and aging of the paint. Thus, it is impossible to meet the requirement that the conditions of measurement remain constant on the same object. One can do that in a temperature-controlled lab when doing something like measuring the diameter of a single ball-bearing with a micrometer, but not in the field with different observers and different measuring devices.
While climatologists rationalize improving the precision of the global average by dividing by the square root of the number of measurements rather than the number of measurements, they are typically not justified in doing so. This is important because the one-tailed test for whether a sample belongs to a particular population depends on having a good measure of the standard deviation.
While temperature averages are commonly reported to two or even three significant figures to the right of the decimal point (using thermometers that may have only been read to the nearest degree), doing so does not agree with the Empirical Rule, which predicts that one standard deviation of a normally distributed population should be roughly 1/4th of its range (±two standard deviations covering ~95% probability). Based on that, the global annual arithmetic mean should have an uncertainty of several tens of degrees, not the hundredths or thousandths commonly reported, and used to declare one year warmer than another.
Sometimes one can infer the estimated precision by the number of significant figures in the reported measurement, as long as one can trust that rounding off is done appropriately and consistently. However, after reading many, many reports, I don’t believe that one can make that assumption, even when uncertainties are provided, along with whether it represents one (68%) or two (95% probability) standard deviations.
Taking into account the uncertainty of a historical average, or even a single year, in comparison with some target year such as 2024, I doubt that a rigorous assessment would support the claim that there is a 0.2 degree C discrepancy.
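The statistical distinction at the heart of this comment can be illustrated with a toy sketch (illustrative numbers only, not real station data): averaging many readings shrinks the independent random component of error by the square root of n, but a bias shared by all readings survives the average untouched.

```python
import math

# Toy sketch of the statistics under discussion. The sigma, bias, and n
# values are assumptions for illustration, not real instrument specs.
sigma = 0.5   # assumed random uncertainty of one reading, deg C
bias = 0.2    # assumed systematic bias common to all readings, deg C
n = 10_000    # number of readings averaged

random_part = sigma / math.sqrt(n)   # shrinks as more readings are averaged
systematic_part = bias               # does not shrink at all

print(random_part)      # -> 0.005
print(systematic_part)  # -> 0.2
```

This is why the dispute over whether the sqrt(n) reduction is "justified" matters: it applies only to the independent random component of error, not to shared calibration drift or siting biases.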
“While temperature averages are commonly reported to two or even 3 significant figures to the right of the decimal point (using thermometers that may have only been read to the nearest degree),”. If that’s happening, that is an absurdity.
“Based on that, the global annual arithmetic mean should have an uncertainty of several tens of degrees.” I think those uncertainties would be detectable by our physiologies. Freezing cold; bloody cold; cold; mild; warm; hot; bloody hot; stinking hot; boiling hot. Although humidity does come into play, as does wind-chill. Nine divisions spaced across -40°C to +40°C, so the uncertainties surely can't add up to several tens of degrees, whatever statistical theorising is going on.
Point taken about the 0.2degrees.
You have to understand that global sampling is less than optimal, with most weather stations located in first-world countries, and most sea surface temperatures (which shouldn't be conflated with air temperatures because of the difference in specific heat) confined to shipping lanes. Thus, the computed global average is missing a lot of data that, were it available, would change the calculated mean. Also, since the Empirical Rule is more of a sanity check for order of magnitude, and really only depends on two numbers, it might be more appropriate to divide by 6 rather than 4. Finally, the actual probability distribution appears to be more skewed than my numbers suggest, with a long cold tail.
The uncertainty applies to the accuracy of the calculated mean, not the accuracy of the thermometer. See my other comment.
When I lived in Vermont, I frequently saw temperatures below zero Fahrenheit. However, the best I could do without access to a thermometer was to note that when I first stepped out of the house, and drew my first breath, if I could feel the hairs in my nose stiffen as the moisture on them froze, I assumed that it was about -10 deg F; if I spit, and my spittle froze before hitting the ground it was probably about at least -20 deg F; if my relatively new V8 engine couldn’t generate enough heat, without blocking the radiator, to run smoothly it was probably pushing -40 deg F or C. Physiology is very imprecise, with a range of about 10 degrees for a discernible difference. That is why thermometers were invented.
My 9 divisions spread across 80 degrees corresponds within reasonable physiological constraints to your experiences in Vermont. Interesting that your V8 engine had the same difficulty as had the blood supply to my earlobes in still air at about the same temperature in Antarctica. Not sure what that means, apart from that both your V8 engine and my earlobes needed insulation to stay warm.
I doubt “thermometers wearing out” is an issue. Thermocouples can easily measure temperature out to 20 decimal places and other than dissolving one in a strong acid they are basically indestructible.
Outside of theft or vandalism this is a non issue. The cheapest hobby grade thermocouples have a variance in the +/- .002 range and those are well under $5 a piece. Scientific or even industrial grade thermocouples have orders of magnitude less variation than that.
A small and probably negligible variable would be changes to the surrounding environment. I know at the airport where I live, they repaved under where the temperature readings are taken (I believe they went from blacktop to concrete, but I'm not 100% sure of that). Anyway, they had to introduce a fudge factor to account for this change and keep the data consistent.
I must say also I find your dismissal of climate scientists to be elitist and quite demeaning. To suggest that they don't have a complete grasp of statistics, data collection, and analysis is farcical, and leads me to believe you are more interested in pushing an agenda than in any scientifically valid critique.
Who said anything about thermometers “wearing out?” I didn’t. I have done a global search for the word “out” and did not find anyone else that did either. You are inventing ‘facts’ to support a straw man argument. If electronic circuitry drifts (which it invariably does), it can be re-calibrated, IF there is a procedure in place for routinely checking the calibration of the instruments.
Just because electronic circuitry is capable of displaying “20 decimal places” does not mean that all the displayed digits have significance or even meaning. Observing a digital voltmeter, which is essentially the translator for the thermocouple, one usually sees the displayed voltage readout flickering so rapidly for the least significant digits that it isn’t even possible to identify them.
As to electronics ‘drifting,’ which I did mention, it isn’t the platinum that ages or changes with temperature, instead, it is the resistors and capacitors, and maybe the gain in the transistors. Again, you present a straw man argument.
You said, “A small and probably negligible variable would be changes to the surrounding environment.” You are quite wrong. That is the most important variable. There are times when even the human body can feel changes (hotter or colder) in the wind blowing by, because of turbulence bringing air parcels from different locations and mixing them. And it is the sampling protocol of how and where weather stations are located that is most often criticized, along with the time constant of the ‘thermometer’ response, which creates problems in obtaining comparable averages.
I am myself a scientist. Any agenda I might have is driven primarily by concern about the poor quality of research in the field of climatology; notably, how rare it is for published papers to follow the accepted procedure for displaying significant figures in a measurement, or to demonstrate a rigorous propagation of error in a calculation. On the rare occasion that an uncertainty range (aka margin of error) is provided, it is even rarer to find the authors mentioning whether it represents one or two sigma. A little sleuthing sometimes shows that it is most commonly 1-sigma (68% probability), instead of the 2-sigma (95% probability) used in most other disciplines.
If, indeed, climatologists and others playing in the shallow end of the pool are conversant in “statistics, data collection and analysis,” as you presume, the alternative explanation for their behavior is even less flattering. However, as anecdotal evidence that you are wrong, Michael Mann (who has an academic background similar to mine, in geophysics) is most famous for the creation of the global warming hockey stick. The most compelling criticism of his work, by a statistician, is that the algorithm he created tends to produce ‘hockey sticks’ from even random noise.
The bottom line is that in science, even if a researcher has a political agenda, it is the function of peer review to present counter evidence and analyze the logic presented to ferret out reality. Consider that this forum is a form of actual peer review, not “pal review” or a couple of ‘gate keepers’ that help maintain the reputation of profitable journals by weeding out the latest design for a perpetual motion machine.
“Planetary albedo describes the percentage of incoming solar radiation that is reflected back into space after all interactions with the atmosphere and the surface of the Earth.”
That is true to a first-order approximation, when the reflector produces a diffuse reflectance. (Strictly speaking, planetary albedo is the apparent brightness of an object, like the moon, that has a rough surface.) If a laboratory-derived, hemispherical, Bidirectional Reflectance Distribution Function (BRDF) is available for the reflector (e.g. snow or clouds) then the theoretical amount of light going back into space can be calculated, or estimated from a spot measurement.
However, what happens with water is more challenging. CERES measurements have attempted to estimate the specular reflectance. Unfortunately, where the reflectance is changing most rapidly, near a glancing angle of incidence (~90 deg), the measurements are “binned.” That means the observing field of view is large compared to the rate of change, and gives an underestimate of the outgoing flux. This is further complicated by the fact that the water surface is seldom perfectly flat, having waves and ripples, and besides the surface reflectance, there is diffuse reflectance from sediment and plankton in the water.
However, most imaging satellites employ nadir viewing — straight down, or nearly so — and miss most of the outgoing specular reflectance. If there are fewer clouds, the satellites will be seeing more water and missing more of the outgoing specularly reflected light.
Oceanographers, biologists, and glaciologists are typically in over their heads when it comes to dealing with light optics. Thus, most of the journal publications present a lower-bound on out-going light as measured by satellites. Most of them are unacquainted with Fresnel’s Equation for reflectance and transmittance. But that is OK. Water only accounts for about 71% of the Earth’s surface.
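The glancing-angle behaviour described above can be checked directly with Fresnel's equations for a flat air-water interface. This is a sketch for unpolarized light with an assumed refractive index of 1.33; it ignores waves, sediment, and absorption, all of which the comment notes matter in practice:

```python
import math

N_WATER = 1.33  # refractive index of water; air taken as 1.0 (assumed)

def fresnel_unpolarized(theta_deg, n=N_WATER):
    """Unpolarized Fresnel reflectance at a flat air-water interface."""
    ti = math.radians(theta_deg)
    tt = math.asin(math.sin(ti) / n)  # refraction angle via Snell's law
    rs = (math.cos(ti) - n * math.cos(tt)) / (math.cos(ti) + n * math.cos(tt))
    rp = (n * math.cos(ti) - math.cos(tt)) / (n * math.cos(ti) + math.cos(tt))
    return 0.5 * (rs * rs + rp * rp)

# Reflectance rises from ~2% near nadir to near-total at grazing incidence:
for angle in (0, 30, 60, 80, 89):
    print(angle, round(fresnel_unpolarized(angle), 3))
```

The jump from roughly 2% reflectance at nadir to around 90% near grazing incidence is exactly why nadir-viewing instruments, and coarse angular binning near 90 degrees, can under-sample the specular term.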
https://wattsupwiththat.com/2016/09/12/why-albedo-is-the-wrong-measure-of-reflectivity-for-modeling-climate/
Very nice discussion/explanation of measurement and its limitations.
The albedo you're referencing is from Earth's #1 greenhouse gas: water vapor. So what sensors record the solar wind that enters our atmosphere as northern lights? NASA stated 2024 was solar maximum, meaning 2023 was also very active sunspot-wise. Are simple 11-year cycles not correlated to some of these “unexplained” events? If maybe, then let's look at longer cycles. I mean, the magnetic north polar wander started its current excursion around 1900!? 124-ish years ago, well before the industrial revolution. I believe we have no idea of the effects of an unwinding dynamo on climate, wind patterns, solar wind, and mass redistribution (mostly water). “Knowing is half the battle.” (GI Joe)
Observations have documented that, during the peak of the sunspot cycle, the average energy impinging on the top of the atmosphere increases, primarily because of a larger flux of UV. However, modelers largely ignore that because it is a change of only about 0.1% in the total solar energy over a nominal period of 11 years. They have bigger fish to fry.
The poles have always wandered, and have even flipped polarity. While the impact on high-energy particles hasn't been ruled out, I'm unaware of any compelling argument(s) that the effect has other than a negligible impact on surface temperatures.
It is not very clear how this mean temperature is computed. It is well known that temperature is extremely variable around the globe. What I understood is that the Sun is the single heat source taken into account, but in my opinion humanity is adding a non-negligible amount of heat directly to the air and the water. The temperature in any large city is 2-3 degrees higher than outside of it, and the surface area of urban zones has increased a lot. The same is valid for large industrial areas that are not part of the urban ones.
A mid-range temperature ([daily high + daily low]/2) is computed for every weather station. It is not even a true arithmetic mean of hourly temperatures, and thus is a poor choice for rigorous statistical analysis. However, for most of the historical record, Tmax and Tmin are the only measurements available. Depending on the purpose, additional parameters such as weekly, monthly, or annual values can then be obtained as true arithmetic means of the daily mid-range values for all the stations. However, again, there is no agreement on the protocol for handling stations that are dropped or added, or have different temperature sensors installed. Frequently, there isn't even a calibration overlap where measurements are taken from both the old and new sensors, as should be done. I'm not sure that there is any agreement on how the aggregations should be handled; it may depend on what is most easily accessible to the researcher.
In summary, the uncertainties in measurements are almost certainly larger than generally acknowledged, even when the formal uncertainty estimate is mentioned, which it frequently isn’t. It would actually make more sense to not do the mid-range calculations and present the Tmax and Tmin time-series together, and now that computers are ubiquitous, there is little excuse for not doing so. This is important because the two series behave differently, with more warming at night and in the Winter. Using the mid-range ‘average’ confounds assessing how the two series are behaving and makes it more difficult to assign causation.
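The mid-range versus true-mean distinction is easy to demonstrate with a synthetic diurnal cycle. The temperature curve below is a purely illustrative shape, not station data:

```python
import math

# Toy illustration: the traditional mid-range "daily mean",
# (Tmax + Tmin) / 2, differs from the true arithmetic mean of hourly
# readings whenever the diurnal cycle is asymmetric.
hours = range(24)
# Assumed asymmetric cycle: short warm afternoon peak, long cool night.
temps = [10.0 + 8.0 * math.sin(math.pi * h / 24.0) ** 4 for h in hours]

t_max, t_min = max(temps), min(temps)
mid_range = (t_max + t_min) / 2.0
true_mean = sum(temps) / len(temps)

# The mid-range overstates the true mean by a full degree here:
print(round(mid_range, 2), round(true_mean, 2))  # -> 14.0 13.0
```

A one-degree bias from the shape of the diurnal cycle alone, before any instrument error, illustrates why the choice of daily statistic matters for trend analysis.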
http://wattsupwiththat.com/2015/08/11/an-analysis-of-best-data-for-the-question-is-earth-warming-or-cooling/
Yes, it’s remarkable that the climate has warmed about one degree C. with two decimal place precision above some other poorly known value to become an existential crisis. But at least it has been acknowledged here that there is little that can realistically be done to mitigate it so we must adapt….
“the need to implement measures to adapt to the effects of future weather extremes would become even more urgent.”
“Adaptation” to the degrees we’re heading toward is not realistic, either. It’s going to have to be a combination of fending off the most catastrophic effects and trying to avoid ecological breakdown and adaptation.
Read James Hansen’s paper “Warming in the Pipeline”. He has been warning about this for years. You can expect 4.5 C of warming from a doubling of preindustrial CO2. That’s the end of life of Earth.
Hansen is not a reliable source. When he prepared his testimony for Congress in 1988, he hypothetically assumed two significant volcanic eruptions, which humans have no control over, that were included in his Scenarios B and C (moderate to Draconian CO2 reductions). However, he did NOT apply the cooling effects to Scenario A (Business as Usual), which makes the purposeful reductions appear far better than they actually would be.
There is a story, which I can’t vouch for, that Hansen’s testimony was scheduled for what was historically the hottest day of the year in DC. According to the story, Hansen entered the meeting chambers well before the time scheduled for his testimony and opened all the windows and turned off the air conditioning for the subjective impact the heat would have on the congressmen. That is not the behavior of an objective scientist.
https://wattsupwiththat.com/2018/06/30/analysis-of-james-hansens-1988-prediction-of-global-temperatures-for-the-last-30-years/
Earth has been considerably warmer than present many times in the past. Rather than “the end of life of Earth,” as you claim, the Paleocene-Eocene Thermal Maximum was a time of abundant life and rapid evolution of mammalian life in particular, such as our primate ancestors.
No; life has survived worse. Just end of human civilisation.
The Cradle of Civilization appears to have been in the Middle East, where temperatures have apparently always been high. It expanded into the Mediterranean, which is also known for high summer temperatures, certainly higher than in England and Scandinavia. This was long before the invention of air conditioning. It is easier to survive heat, especially with low humidity, with low technology than it is to survive the cold of the polar regions, which is why the Arctic has always had a low population density and humans never established a presence in Antarctica prior to technology. I think that you are being overly pessimistic.
“Hansen entered the meeting chambers well before the time scheduled for his testimony and opened all the windows and turned off the air conditioning for the subjective impact the heat would have on the congressmen. That is not the behavior of an objective scientist.”
When dealing with non-scientific, non-objective politicians, and thus scientifically ignorant lawmakers, a bit of showmanship is essential if you want to get their attention. It's called advertising a product, if it involves battered fried chicken laced with herbs and spices.
It sounds to me like you are rationalizing the philosophical position that the end justifies any means. That has been the cause of many religious wars. If a scientist prostitutes themself to the same level as a politician, does that still excuse their behavior if their beliefs are wrong? That gets to the core of the problem. How can anyone be certain that their belief system is correct?
The inadvertent effects of well-meaning cleaner air policies on global warming have been discussed all year and there have been numerous studies already, so it is surprising to see such a blandly-worded reference toward the end of this article… “Lower concentrations of anthropogenic aerosols in the atmosphere, especially due to stricter regulations on marine fuel, are likely a contributing factor. ”
From study entitled “Abrupt reduction in shipping emission as an inadvertent geoengineering termination shock produces substantial radiative warming” : “In 2020, fuel regulations abruptly reduced the emission of sulfur dioxide from international shipping by about 80% and created an inadvertent geoengineering termination shock with global impact.”
For another good article, google “Analysis: How low-sulphur shipping rules are affecting global warming”
Yeah, I caught that too. It's almost like we should go the other way on that one and mandate higher-sulfur fuel for ships. They spend very little time around people; even when loading and unloading, hardly anybody lives near shipping ports. The rest of the time they are crossing the open ocean. So it really seems like this could be a short-term way to mitigate a small amount of warming.
Plus it’s low hanging fruit. When the time comes to act all you have to do is mandate the refineries change their blend of bunker fuel.
I’m no scientist, but with hydrogen vehicles that clean the air as they drive and only emit H2O, maybe we should be accelerating the airline industry to hydrogen, like yesterday?
How do hydrogen vehicles clean the air when they are only removing oxygen? Water vapor has been estimated to be about 2.5X more effective than CO2 as a greenhouse gas. It is generally dismissed as unimportant (In my opinion, incorrectly.) because it has a short residency and is quickly replenished, maintaining an approximately uniform concentration. If CO2 is replaced by H2O, one can expect increased warming, along with undesirable side-effects such as mold, corrosion, and slick roadways. How is that better?
What ever happened to the ecologically friendly “white paint” that was created to reflect the sun’s rays?
Yeah but, but:
We are (gradually?) paving the land surface of this planet with solar panels. Have you noticed what color they are? That would be black, with a real impact on the albedo of the land surface of the earth.
Have you considered where the electrical energy produced by solar panels goes? 100% of it ultimately goes to heat. 100%.
And how about windmills? 100% of the energy taken out of the moving air ultimately goes to heat. That includes the electrical energy produced by the generator in the windmill, plus the friction losses occurring in the mechanical mechanisms of the windmill, plus the friction of the airflow passing over the blades. 100%.
And how about nuclear reactors (which we are now considering building many more of)? Both the inefficient and the efficient output ultimately go 100% to heat. 100%.
So why shouldn’t the earth get warmer than we forecasted?
Bad forecasts.
What is your point? All of the sources of heat you so diligently added up are going to be at least 3, probably 4, places to the right of the decimal point when calculating the Earth's heat budget (0.000x%). That means they can be safely ignored as a non-factor.
The problem is the planet's inability to radiate heat into space because of the gigatons of garbage we have pumped into the atmosphere.
We are trying to calculate the speed of a locomotive and you’re here telling us to factor in the bugs hitting the windshield!
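The "decimal places" claim can be checked with rough public ballpark figures. The surface area, energy-use, and forcing values below are assumptions for illustration, not numbers from the article:

```python
# Rough scale comparison (assumed ballpark values): direct anthropogenic
# waste heat vs. greenhouse radiative forcing, both per m^2 of Earth.
EARTH_SURFACE_M2 = 5.1e14      # total surface area of Earth, m^2
PRIMARY_ENERGY_TW = 19.0       # approx. global primary energy use, TW (assumed)
GHG_FORCING_W_M2 = 2.7         # approx. total anthropogenic forcing (assumed)

# Essentially all primary energy use ends up as heat, spread globally:
waste_heat_w_m2 = PRIMARY_ENERGY_TW * 1e12 / EARTH_SURFACE_M2
ratio = GHG_FORCING_W_M2 / waste_heat_w_m2

print(round(waste_heat_w_m2, 3))  # -> 0.037 W/m^2
print(round(ratio))               # -> 72
```

So under these assumptions, waste heat is roughly 1-2% of the greenhouse forcing: not literally zero, but small enough to support the "bugs on the windshield" analogy.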
“The problem is the planets inability to radiate heat into space because of the gigatons of garbage we have pumped into the atmosphere.”
An important correction to your statement: The problem is the planet's reduced ability to radiate heat created by the absorption of solar radiation that isn't reflected back to space immediately by clouds and the surface. Thus, the surface ‘albedo’ is important, and changes in land use are considered important enough that it is routinely measured.