Bjørn H. Samset
Among the factors that affect the climate, few are as diverse and challenging to understand as aerosols. Minute particles suspended in the atmosphere, aerosols are emitted through a wide range of natural and industrial processes, and are transported around the globe by winds and weather. Once airborne, they affect the climate both directly, through scattering and absorption of solar radiation, and indirectly, through their impact on cloud properties. When all of these effects are combined, anthropogenic changes to aerosol concentrations are estimated to have had a climate impact over the industrial era that is second only to that of CO2. Their atmospheric lifetime of only a few days, however, makes their climate effects substantially different from those of well-mixed greenhouse gases.
Major aerosol types include sea salt, dust, sulfate compounds, and black carbon—or soot—from incomplete combustion. Of these, most scatter incoming sunlight back to space, and thus mainly cool the climate. Black carbon, however, absorbs sunlight, and therefore acts as a heating agent much like a greenhouse gas. Furthermore, aerosols can act as cloud condensation nuclei, causing clouds to become whiter—and thus more reflective—further cooling the surface. Black carbon is again a special case, acting to change the stability of the atmosphere through local heating of the upper air, and also changing the albedo of the surface when it is deposited on snow and ice, for example.
The wide range of climate interactions that aerosols have, and the fact that their distribution depends on the weather at the time and location of emission, lead to large uncertainties in the scientific assessment of their impact. This in turn leads to uncertainties in our present understanding of the climate sensitivity, because while aerosols have predominantly acted to oppose 20th-century global warming by greenhouse gases, the magnitude of aerosol effects on climate is highly uncertain.
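The link between uncertain aerosol forcing and uncertain climate sensitivity can be made concrete with a back-of-envelope calculation. The numbers below are rounded illustrative values assumed for this sketch, not assessed estimates:

```python
# Illustrative round numbers (assumed for this sketch, not assessed values).
# Observed warming and greenhouse-gas forcing are comparatively well known,
# but aerosol forcing is not; the inferred sensitivity lambda = dT / F_net
# therefore spans a wide range.
dT = 1.0              # K, observed industrial-era warming
f_ghg = 2.5           # W m^-2, greenhouse-gas forcing
f_2xco2 = 3.7         # W m^-2, canonical forcing from doubled CO2

for f_aer in (-0.5, -1.0, -1.5):   # W m^-2, uncertain aerosol forcing
    lam = dT / (f_ghg + f_aer)     # K per W m^-2
    print(f"F_aer = {f_aer}: lambda = {lam:.2f} K/(W m^-2), "
          f"implied 2xCO2 warming = {lam * f_2xco2:.1f} K")
```

Under these assumptions, a factor-of-three spread in aerosol forcing translates into a factor-of-two spread in the sensitivity inferred from the observed warming.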
Finally, aerosols are important for large-scale climate events such as major volcanoes, or the threat of nuclear winter. The relative ease with which they can be produced and distributed has led to suggestions for using targeted aerosol emissions to counteract global warming—so-called climate engineering.
Stefano Tibaldi and Franco Molteni
The atmospheric circulation in the mid-latitudes of both hemispheres is usually dominated by westerly winds and by planetary-scale and shorter-scale synoptic waves, moving mostly from west to east. A remarkable and frequent exception to this “usual” behavior is atmospheric blocking. Blocking occurs when the usual zonal flow is hindered by the establishment of a large-amplitude, quasi-stationary, high-pressure meridional circulation structure which “blocks” the flow of the westerlies and the progression of the atmospheric waves and disturbances embedded in them. Such blocking structures can have lifetimes varying from a few days to several weeks in the most extreme cases. Their presence can strongly affect the weather of large portions of the mid-latitudes, leading to the establishment of anomalous meteorological conditions. These can take the form of strong precipitation episodes or persistent anticyclonic regimes, leading in turn to floods, extreme cold spells, heat waves, or short-lived droughts. Even air quality can be strongly influenced by the establishment of atmospheric blocking, with episodes of high concentrations of low-level ozone in summer and of particulate matter and other air pollutants in winter, particularly in highly populated urban areas.
Atmospheric blocking has the tendency to occur more often in winter and in certain longitudinal quadrants, notably the Euro-Atlantic and the Pacific sectors of the Northern Hemisphere. In the Southern Hemisphere, blocking episodes are generally less frequent, and the longitudinal localization is less pronounced than in the Northern Hemisphere.
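The reversal of the usual poleward decrease of geopotential height that characterizes blocking can be detected objectively. Below is a minimal sketch of a one-dimensional index in the spirit of the approach introduced by Tibaldi and Molteni, assuming daily 500 hPa geopotential height on a regular latitude grid; the reference latitudes and the -10 m/degree threshold are illustrative choices:

```python
import numpy as np

def blocking_index(z500, lats, lon_idx, phi0=60.0, phi_n=80.0, phi_s=40.0):
    """Flag blocked flow at one longitude from 500 hPa geopotential height.

    z500: 2-D array (lat, lon) of geopotential height in metres.
    lats: 1-D array of grid latitudes (degrees north).
    A longitude is 'blocked' when the meridional height gradient reverses
    south of phi0 (GHGS > 0) while a strong westerly gradient remains to
    the north (GHGN < -10 m per degree of latitude).
    """
    def z_at(phi):
        i = int(np.argmin(np.abs(lats - phi)))  # nearest grid latitude
        return z500[i, lon_idx]

    ghgs = (z_at(phi0) - z_at(phi_s)) / (phi0 - phi_s)
    ghgn = (z_at(phi_n) - z_at(phi0)) / (phi_n - phi0)
    return bool(ghgs > 0.0 and ghgn < -10.0)
```

In practice such an index is evaluated at every longitude and day, and persistence criteria (several consecutive blocked days) are added before an episode is counted.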
Blocking has aroused the interest of atmospheric scientists since the middle of the last century, with the pioneering observational works of Berggren, Bolin, Rossby, and Rex, and has become the subject of innumerable observational and theoretical studies. The purpose of such studies was originally to find a commonly accepted structural and phenomenological definition of atmospheric blocking. The investigations went on to study blocking climatology in terms of the geographical distribution of its frequency of occurrence and the associated seasonal and inter-annual variability. Well into the second half of the 20th century, a large number of theoretical dynamic works on blocking formation and maintenance started appearing in the literature. Such theoretical studies explored a wide range of possible dynamic mechanisms, including large-amplitude planetary-scale wave dynamics (with Rossby wave breaking), multiple equilibria circulation regimes, large-scale forcing of anticyclones by synoptic-scale eddies, finite-amplitude non-linear instability theory, and the influence of sea surface temperature anomalies, to name but a few. However, to date no unique theoretical model of atmospheric blocking has been formulated that can account for all of its observational characteristics.
When numerical global short- and medium-range weather predictions started being produced operationally, and with the establishment, in the late 1970s and early 1980s, of the European Centre for Medium-Range Weather Forecasts, it quickly became important to assess the capability of numerical models to predict blocking with the correct space-time characteristics (e.g., location, time of onset, life span, and decay). Early studies showed that models had difficulty representing blocking correctly, a deficiency connected with their large systematic (mean) errors.
Despite enormous improvements in the ability of numerical models to represent atmospheric dynamics, blocking remains a challenge for global weather prediction and climate simulation models. Such modeling deficiencies have negative consequences not only for our ability to represent the observed climate but also for the possibility of producing high-quality seasonal-to-decadal predictions. For such predictions, representing the correct space-time statistics of blocking occurrence is, especially for certain geographical areas, extremely important.
This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of Climate Science. Please check back later for the full article.
In the mid-1950s, the geophysicist Norman Phillips computed a general circulation model on John von Neumann’s IAS computer at the Institute for Advanced Study in Princeton. His two-level quasi-geostrophic model predicted the main global circulation patterns for one hemisphere and the poleward transport of energy. Phillips’ computations are considered to be the very first climate simulation and the crucial experiment demonstrating that numerical flow computations can represent large-scale dynamic patterns of the atmosphere. As Phillips pointed out in his conclusion, it is high-speed computing that will overcome the main obstacle of meteorology, namely the difficulty of solving the nonlinear hydrodynamic equations. Thus, computations will advance the physical understanding of the atmosphere.
Simulation as the numerical approach to scientific problems requires not only high-speed computing, but also a view of meteorology as dynamical meteorology, which was developed in the late 19th and early 20th centuries. Originating as a way to address the problem of weather forecasting, the dynamical approach turned meteorology into the physics of the atmosphere. This view reduced the experience of climates to average weather, defined by the World Meteorological Organization as the mean and variability of relevant variables such as temperature, precipitation, or wind over a period of at least 30 years. Today, simulated climate has become a prominent topic in public discourse due to the environmental and societal problem of anthropogenic climate change. However, understanding climate and simulation requires understanding the three major transformations of meteorology: from weather to climate, from synopsis to numerics, and from measurements to projections.
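The WMO definition of climate as average weather translates directly into a simple computation. A minimal sketch using synthetic annual-mean temperatures (the station values are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic illustration (not real data): 30 years of annual-mean
# temperatures (deg C) for a hypothetical station. 'Climate' in the WMO
# sense is the mean and variability over a period of at least 30 years.
annual_t = 14.0 + rng.normal(0.0, 0.4, 30)

climate_mean = float(np.mean(annual_t))            # the climate "normal"
climate_var = float(np.std(annual_t, ddof=1))      # year-to-year variability
print(f"30-year mean: {climate_mean:.2f} C, variability: {climate_var:.2f} C")
```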
Climate and the carbon cycle are tightly coupled on many time scales, from the interannual to the multimillennial. Observations consistently show a positive feedback between climate and the carbon cycle: elevated atmospheric CO2 leads to warming, and warming is expected to lead to further release of carbon to the atmosphere, enhancing the atmospheric CO2 increase. Earth system models represent these climate–carbon cycle feedbacks, and they consistently simulate a positive feedback over the 21st century; that is, climate change will lead to loss of carbon from the land and ocean reservoirs. These processes partially offset the increases in land and ocean carbon sinks caused by rising atmospheric CO2. As a result, more of the emitted anthropogenic CO2 will remain in the atmosphere. There is, however, a large uncertainty in the magnitude of this feedback. Recent studies now help to reduce this uncertainty. On short, interannual time scales, El Niño years show a larger-than-average atmospheric CO2 growth rate, with tropical land ecosystems being the main drivers. These climate–carbon cycle anomalies can be used as an emerging constraint on the tropical land carbon response to future climate change. On a longer, centennial time scale, the variability of atmospheric CO2 found in records of the last millennium can be used to constrain the overall global carbon cycle response to climate. These independent methods confirm that the climate–carbon cycle feedback is positive, but probably more consistent with the lower end of the range simulated by comprehensive models, excluding very large climate–carbon cycle feedbacks.
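The emerging-constraint idea rests on a simple regression: interannual anomalies in the atmospheric CO2 growth rate are regressed on tropical temperature anomalies, and the resulting sensitivity can then be compared across models and observations. A sketch with synthetic data (the 5 PgC/yr-per-K sensitivity and the noise level are assumed purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic illustration (not real data): tropical temperature anomalies (K)
# and CO2 growth-rate anomalies (PgC/yr), with an assumed underlying
# sensitivity of ~5 PgC/yr per K, as seen qualitatively in El Nino years.
t_anom = rng.normal(0.0, 0.3, 40)
co2_growth_anom = 5.0 * t_anom + rng.normal(0.0, 0.5, 40)

# Linear fit: slope = interannual climate-carbon cycle sensitivity
gamma_ia, intercept = np.polyfit(t_anom, co2_growth_anom, 1)
print(f"interannual sensitivity: {gamma_ia:.1f} PgC/yr per K")
```

In the real emerging-constraint approach, the same regression is computed for each Earth system model and for observations; models whose interannual sensitivity disagrees with the observed one are weighted down when projecting the long-term feedback.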
Kerry H. Cook
Accurate projections of climate change under increasing atmospheric greenhouse gas levels are needed to evaluate the environmental cost of anthropogenic emissions and to guide mitigation efforts. These projections are nowhere more important than in Africa, with its high dependence on rain-fed agriculture and, in many regions, limited resources for adaptation. Climate models provide our best method for climate prediction, but there are uncertainties in their projections, especially on regional space scales. In Africa, limitations of observational networks add to this uncertainty, since a crucial step in improving model projections is comparison with observations. Exceeding the uncertainties associated with climate model simulation are those due to projections of future emissions of CO2 and other greenhouse gases. Humanity’s choices in emissions pathways will have profound effects on climate, especially after mid-century.
The African Sahel is a transition zone characterized by strong meridional precipitation and temperature gradients. Over West Africa, the Sahel marks the northernmost extent of the West African monsoon system. The region’s climate is known to be sensitive to sea surface temperatures, both regional and global, as well as to land surface conditions. Increasing atmospheric greenhouse gases are already causing amplified warming over the Sahara Desert and, consequently, increased rainfall in parts of the Sahel. Climate model projections indicate that much of this increased rainfall will be delivered in the form of more intense storm systems.
The complicated and highly regional precipitation regimes of East Africa present a challenge for climate modeling. Within roughly 5° of latitude of the equator, rainfall is delivered in two seasons—the long rains in the spring, and the short rains in the fall. Regional climate model projections suggest that the long rains will weaken under greenhouse gas forcing, and the short rains season will extend farther into the winter months. Observations indicate that the long rains are already weakening.
Changes in seasonal rainfall over parts of subtropical southern Africa are observed, with repercussions and challenges for agriculture and water availability. Some elements of these observed changes are captured in model simulations of greenhouse gas-induced climate change, especially an early demise of the rainy season. The projected changes are quite regional, however, and more high-resolution study is needed. In addition, there has been very limited study of climate change in the Congo Basin and across northern Africa. Continued efforts to understand and predict climate using higher-resolution simulation must be sustained to better understand observed and projected changes in the physical processes that support African precipitation systems as well as the teleconnections that communicate remote forcings into the continent.
Storms are characterized by high wind speeds; often large precipitation amounts in the form of rain, freezing rain, or snow; and thunder and lightning (for thunderstorms). Many different types exist, ranging from tropical cyclones and large storms of the midlatitudes to small polar lows, Medicanes, thunderstorms, or tornadoes. They can lead to extreme weather events like storm surges, flooding, high snow quantities, or bush fires. Storms often pose a threat to human lives and property, agriculture, forestry, wildlife, ships, and offshore and onshore industries. Thus, it is vital to gain knowledge about changes in storm frequency and intensity. Future storm predictions are important, and they depend to a great extent on the evaluation of changes in wind statistics of the past.
To obtain reliable statistics, long and homogeneous time series covering at least several decades are needed. However, wind measurements are frequently influenced by changes in the synoptic station, its location or surroundings, instruments, and measurement practices. These factors degrade the homogeneity of wind records. Storm indexes derived from measurements of sea-level pressure are less prone to such changes, because pressure does not vary as strongly in space as wind speed does. Long-term historical pressure measurements exist that enable us to deduce changes in storminess over more than the last 140 years. But storm records are not just compiled from measurement data; they may also be inferred from climate model data.
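A common pressure-based storminess proxy is a high percentile of the absolute pressure tendency: deep, fast-moving storms produce large pressure changes at a station. A minimal sketch, assuming a station record of sea-level pressure at a fixed time step:

```python
import numpy as np

def storminess_percentiles(pressure_hpa, q=(95, 99)):
    """High percentiles of the absolute sea-level pressure tendency
    (hPa per time step), a homogeneity-friendly proxy for storm activity.

    pressure_hpa: 1-D pressure series sampled at a fixed interval.
    Returns a dict mapping each requested percentile to its value.
    """
    dp = np.abs(np.diff(np.asarray(pressure_hpa, dtype=float)))
    return {p: float(np.percentile(dp, p)) for p in q}
```

Computed year by year, such percentiles form a storminess time series that is far less sensitive to station relocations and instrument changes than raw wind-speed records.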
The first numerical weather forecasts were performed in the 1950s. These served as a basis for the development of atmospheric circulation models, which were the first generation of climate models, or general circulation models. Soon afterward, model data were analyzed for storm events and cyclone-tracking algorithms were programmed. Climate models have now reached high resolution and reliability and can be run not just for the past, but also for future emission scenarios, yielding estimates of possible future storm activity.
Forecasting severe convective weather remains one of the most challenging tasks facing operational meteorology today, especially in the mid-latitudes, where severe convective storms occur most frequently and with the greatest impact. The forecast difficulties reflect, in part, the many different atmospheric processes of which severe thunderstorms are a by-product. These processes occur over a wide range of spatial and temporal scales, some of which are poorly understood and/or are inadequately sampled by observational networks. Therefore, anticipating the development and evolution of severe thunderstorms will likely remain an integral part of national and local forecasting efforts well into the future.
Modern severe weather forecasting began in the 1940s, primarily employing the pattern recognition approach throughout the 1950s and 1960s. Substantial changes in forecast approaches did not come until much later, however, beginning in the 1980s. By the start of the new millennium, significant advances in the understanding of the physical mechanisms responsible for severe weather enabled forecasts of greater spatial and temporal detail. At the same time, technological advances made available model thermodynamic and wind profiles that supported probabilistic forecasts of severe weather threats.
This article provides an updated overview of operational severe local storm forecasting, with emphasis on present-day understanding of the mesoscale processes responsible for severe convective storms, and the application of recent technological developments that have revolutionized some aspects of severe weather forecasting. The presentation, nevertheless, notes that increased understanding and enhanced computer sophistication are not a substitute for careful diagnosis of the current meteorological environment and an ingredients-based approach to anticipating changes in that environment; these techniques remain foundational to successful forecasts of tornadoes, large hail, damaging wind, and flash flooding.
R. J. Trapp
Cumulus clouds are pervasive on earth, and play important roles in the transfer of energy through the atmosphere. Under certain conditions, shallow, nonprecipitating cumuli may grow vertically to occupy a significant depth of the troposphere, and subsequently may evolve into convective storms.
The qualifier “convective” implies that the storms have vertical accelerations that are driven primarily, though not exclusively, by buoyancy over a deep layer. Such buoyancy in the atmosphere arises from local density variations relative to some base-state density; the base state is typically idealized as a horizontal average over a large area, which is also considered the environment. Quantifications of atmospheric buoyancy are typically expressed in terms of temperature and humidity, and allow for an assessment of the likelihood that convective clouds will form or initiate. Convection initiation is intimately linked to the existence of a mechanism by which air is lifted vertically to realize this buoyancy and thus these accelerations. Weather fronts and orography are the canonical lifting mechanisms.
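Quantifying buoyancy in terms of temperature and humidity leads to standard diagnostics such as convective available potential energy (CAPE), the vertical integral of the buoyant acceleration where the parcel is warmer than its surroundings. A minimal sketch, assuming virtual-temperature profiles for the environment and a lifted parcel are already available (the parcel-ascent computation itself is omitted):

```python
import numpy as np

G = 9.81  # gravitational acceleration, m s^-2

def cape(z, tv_env, tv_parcel):
    """Trapezoidal estimate of CAPE (J/kg) from virtual temperature (K)
    profiles of the environment and a lifted parcel on heights z (m).
    Only levels where the parcel is positively buoyant contribute."""
    buoy = G * (tv_parcel - tv_env) / tv_env      # buoyant acceleration
    pos = np.maximum(buoy, 0.0)                   # keep positive area only
    # manual trapezoid rule over height
    return float(np.sum(0.5 * (pos[1:] + pos[:-1]) * np.diff(z)))
```

For scale, a parcel 1 K warmer than its environment through a 10 km deep layer yields CAPE of a few hundred J/kg; severe-storm environments often reach several thousand.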
As modulated by an ambient or environmental distribution of temperature, humidity, and wind, weather fronts also facilitate the transition of convective clouds into storms with locally heavy rain, lightning, and other possible hazards. For example, in an environment characterized by winds that are weak and change little with distance above the ground, the storms tend to be short lived and benign. The structure of the vertical drafts and other internal storm processes under weak wind shear—i.e., a small change in the horizontal wind over some vertical distance—are distinct from those that occur when the environmental wind shear is strong. In particular, strong wind shear in combination with large buoyancy favors the development of squall lines and supercells, both of which are highly coherent storm types. Besides having durations that may exceed a few hours, both of these storm types tend to be particularly hazardous: squall lines are most apt to generate swaths of damaging “straight-line” winds, while supercells spawn the most intense tornadoes and are responsible for the largest hail. Methods used to predict convective-storm hazards capitalize on this knowledge of storm formation and development.
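The environmental wind shear referred to above is often summarized as the bulk shear over a deep layer, e.g., the lowest 6 km. A minimal sketch, assuming profile winds on height levels (the ~20 m/s figure in the comment is an illustrative rule of thumb, not a definitive criterion):

```python
import numpy as np

def bulk_shear(u, v, z, depth=6000.0):
    """Magnitude (m/s) of the vector wind difference between the lowest
    level and `depth` metres, with winds u, v (m/s) on heights z (m)."""
    u_top = np.interp(depth, z, u)  # linear interpolation to `depth`
    v_top = np.interp(depth, z, v)
    return float(np.hypot(u_top - u[0], v_top - v[0]))

# Rule of thumb (illustrative only): deep-layer bulk shear above roughly
# 20 m/s, combined with large CAPE, favors organized squall lines and
# supercells rather than short-lived single cells.
```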
The Sahel of Africa has been identified as having the strongest land–atmosphere (L/A) interactions on Earth. Sahelian L/A interaction studies started in the late 1970s. However, due to controversies surrounding the early studies, in which only a single land parameter was considered in L/A interactions, the credibility of land-surface effects on the Sahel’s climate has long been challenged. Using general circulation models and regional climate models coupled with biogeophysical and dynamic vegetation models, as well as analyses of satellite-derived data, field measurements, and assimilation data, the effects of land-surface processes on West African monsoon variability (which dominates the Sahel climate system at mesoscale, intraseasonal, seasonal, interannual, and decadal scales) have been extensively investigated to realistically explore the Sahel L/A interaction: its effects and the mechanisms involved.
The Sahel suffered the longest and most severe drought on the planet in the 20th century. The devastating environmental and socioeconomic consequences of drought-induced famines in the Sahel have provided strong motivation for the scientific community and society to understand the causes of the drought and its impact. It has long been debated whether the drought was a natural process, mainly induced by sea-surface temperature variability, or was affected by anthropogenic activities. Diagnostic and modeling studies of sea-surface temperature have consistently demonstrated that it exerts great influence on the Sahel climate system, but sea-surface temperature is unable to explain the full scope of Sahel climate variability and the late-20th-century drought. The effects of land-surface processes, especially land-cover and land-use change, on the drought have also been extensively investigated. Results with more realistic land-surface models suggest that land processes are a first-order contributor to the Sahel climate and to its drought from the late 1960s through the 1980s, comparable to sea-surface temperature effects. The issues that caused controversies in the early studies have been properly addressed in studies with state-of-the-art models and available data.
The mechanisms through which land processes affect the atmosphere are also elucidated in a number of studies. Land-surface processes not only affect vertical transfer of radiative fluxes and heat fluxes but also affect horizontal advections through their effect on the atmospheric heating rate and moisture flux convergence/divergence as well as horizontal temperature gradients.
Florian Sévellec and Bablu Sinha
The Atlantic meridional overturning circulation (AMOC) is a large, basin-scale circulation located in the Atlantic Ocean that transports climatically important quantities of heat northward. It can be described schematically as a northward flow in the warm upper ocean and a southward return flow at depth in much colder water. The heat capacity of a layer of 2 m of seawater is equivalent to that of the entire atmosphere; therefore, ocean heat content dominates Earth’s energy storage. For this reason and because of the AMOC’s typically slow decadal variations, the AMOC regulates North Atlantic climate and contributes to the relatively mild climate of Europe. Hence, predicting AMOC variations is crucial for predicting climate variations in regions bordering the North Atlantic. Similar to weather predictions, climate predictions are based on numerical simulations of the climate system. However, providing accurate predictions on such long timescales is far from straightforward. Even in a perfect model approach, where biases between numerical models and reality are ignored, the chaotic nature of AMOC variability (i.e., high sensitivity to initial conditions) is a significant source of uncertainty, limiting its accurate prediction.
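The comparison between ocean and atmospheric heat capacities can be checked with rounded textbook values (the numbers below are approximate and used purely for illustration):

```python
# Heat capacity per unit area of the whole atmospheric column:
cp_air = 1004.0            # J kg^-1 K^-1, specific heat of air (const. p)
m_atm = 1.0e4              # kg m^-2, column mass (~ surface pressure / g)
c_atm = cp_air * m_atm     # ~1.0e7 J m^-2 K^-1

# Heat capacity per unit area of a 2 m layer of seawater:
rho_sw = 1025.0            # kg m^-3, seawater density
cp_sw = 3990.0             # J kg^-1 K^-1, seawater specific heat
c_sea = rho_sw * cp_sw * 2.0   # ~8.2e6 J m^-2 K^-1

print(c_sea / c_atm)  # a 2-3 m layer of seawater matches the atmosphere
```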
Predictability studies focus on factors determining our ability to predict the AMOC rather than actual predictions. To this end, processes affecting AMOC predictability can be separated into two categories: processes acting as a source of predictability (periodic harmonic oscillations, for instance) and processes acting as a source of uncertainty (small errors that grow and significantly modify the outcome of numerical simulations). To understand the former category, harmonic modes of variability or precursors of AMOC variations are identified. On the other hand, in a perfect model approach, the sources of uncertainty are characterized by the spread of numerical simulations differentiated by the application of small differences to their initial conditions. Two alternative and complementary frameworks have arisen to investigate this spread. The pragmatic framework corresponds to performing an ensemble of simulations, imposing a randomly chosen small error on the initial conditions of each individual simulation. This allows a probabilistic approach, statistically characterizing the importance of the initial conditions by evaluating the spread of the ensemble. The theoretical framework uses stability analysis to identify the small perturbations to the initial conditions that are most conducive to significant disruption of the AMOC.
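The pragmatic ensemble framework can be illustrated with a toy chaotic system. The sketch below uses the Lorenz-63 equations as a stand-in for a perfect model (the parameters, ensemble size, and perturbation amplitude are arbitrary illustrative choices): tiny initial-condition errors grow until the ensemble spread becomes large.

```python
import numpy as np

def lorenz_step(state, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system, a classic chaotic
    toy model standing in here for a 'perfect' climate model."""
    x, y, z = state
    return state + dt * np.array([s * (y - x), x * (r - z) - y, x * y - b * z])

def ensemble_spread(n=50, steps=1500, eps=1e-6, seed=0):
    """Standard deviation of x across an ensemble started from one state
    plus tiny random perturbations of amplitude eps; chaos amplifies the
    perturbations as the simulations are stepped forward."""
    rng = np.random.default_rng(seed)
    members = np.array([1.0, 1.0, 1.0]) + eps * rng.standard_normal((n, 3))
    for _ in range(steps):
        members = np.array([lorenz_step(m) for m in members])
    return float(np.std(members[:, 0]))
```

The spread starts at about the perturbation amplitude and grows by orders of magnitude over a few model time units, which is precisely why initial-condition uncertainty limits prediction even in a perfect-model setting.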
Beyond these difficulties in assessing predictability, decadal prediction systems have been developed and tested through a range of hindcasts. The inherent difficulties of operational forecasts range from developing efficient initialization methods to setting accurate radiative forcing and correcting for model drift and bias, with all these improvements estimated and validated through a range of specifically designed skill metrics.