2016 Global Temperature Update to Hansen’s 1981 Projection

It is always useful to check past predictions against eventual observations. Below, the NASA GISTEMP observed global temperature record (updated through 2016) is overlaid on various projections of CO2-induced warming from calculations published in 1981 (Hansen et al. 1981). 2015 and 2016 are literally off the chart. This does not imply a higher equilibrium climate sensitivity than that represented by the dashed line (5.6 °C), because these calculations did not include the effects of anthropogenic increases in non-CO2 greenhouse gases. There are a number of other important caveats to this juxtaposition, such as Hansen’s model not allowing for unforced/internal variability, as well as differences between the assumed and actual growth rate of atmospheric CO2. Nevertheless, it is an interesting comparison.

[Figure: 2016 update to the Hansen et al. (1981) projections]

2016 update of modeled vs. observed global temperature

NASA released its 2016 global mean surface temperature data today. With this data point included, observations are now above the average climate model value for this point in time (using 1986-2005 as the baseline):

[Figure: observed global temperature through 2016 compared with climate model projections]

This graphic uses the RCP 4.5 emissions scenario for the models, but the divergence between RCP 4.5 and steeper emissions scenarios does not become appreciable until the mid-21st century (see, e.g., Figure 1 here).
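
As a rough illustration of how such a comparison is put together, here is a minimal sketch of the re-baselining step, assuming hypothetical `obs` and `model_mean` arrays stand in for the GISTEMP and CMIP5 annual anomaly series:

```python
# Minimal sketch: compare observed annual global temperature to a CMIP5
# multi-model mean after re-baselining both to the 1986-2005 reference period.
# The arrays below are random placeholders; in practice they would be loaded
# from GISTEMP and from the CMIP5 archive (RCP 4.5 beyond 2005).
import numpy as np

years = np.arange(1880, 2017)                  # 1880-2016 inclusive
rng = np.random.default_rng(0)
obs = rng.normal(0, 0.1, years.size)           # placeholder for GISTEMP anomalies (K)
model_mean = rng.normal(0, 0.1, years.size)    # placeholder for CMIP5 multi-model mean (K)

# Re-baseline both series so their 1986-2005 means are zero.
base = (years >= 1986) & (years <= 2005)
obs_rb = obs - obs[base].mean()
model_rb = model_mean - model_mean[base].mean()

# Positive value: 2016 observations sit above the multi-model mean
# relative to the common baseline.
print(f"2016 obs minus model mean: {obs_rb[-1] - model_rb[-1]:+.2f} K")
```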

Why do climate models disagree on the size of global temperature variability?

We have published a new paper titled “Spread in the magnitude of climate model interdecadal global temperature variability traced to disagreements over high-latitude oceans”. Here is a brief summary:

Natural unforced variability in global mean surface air temperature (GMST) is of the same order of magnitude as current externally forced changes in GMST on decadal timescales. Thus, understanding the precise magnitude of unforced GMST variability is relevant both for the attribution of past climate changes to human causes and for the prediction of climate change on policy-relevant timescales.

Climate models could be useful for estimating the true magnitude of unforced GMST variability provided that they more-or-less converge on the same answer. Unfortunately, current models show substantial disagreement on the magnitude of natural GMST variability, highlighting a key uncertainty in contemporary climate science. This large model spread must be narrowed in the future if we are to have confidence that models can be trusted to give useful insights on natural variability.

Since it is known that unforced GMST variability is heavily influenced by tropical Pacific surface temperatures, it might be tempting to suppose that the large inter-model spread in the simulated magnitude of GMST variability is due to model disagreement in the amount of simulated tropical Pacific variability. Perhaps surprisingly, our study shows that this is not the case and that the spread in the magnitude of model-simulated GMST variability is linked much more strongly to model disagreements over high-latitude oceans. Our findings suggest that improving the simulation of air-sea interaction in these high-latitude ocean regions could narrow the range of simulated GMST variability, advance our fundamental understanding of natural variability, and appreciably improve our ability to forecast global warming on policy-relevant timescales.
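
As a rough illustration of how this spread can be quantified, here is a minimal sketch of one simple metric of interdecadal GMST variability – the standard deviation of decadal means in an unforced control run. Both the metric and the placeholder series are illustrative and are not the exact diagnostics used in the paper:

```python
# Minimal sketch: quantify interdecadal GMST variability in a control run as the
# standard deviation of non-overlapping decadal means.  Illustrative only.
import numpy as np

def interdecadal_std(gmst_annual, window=10):
    """Standard deviation of non-overlapping decadal means of an annual GMST series."""
    n = (gmst_annual.size // window) * window
    decadal_means = gmst_annual[:n].reshape(-1, window).mean(axis=1)
    return decadal_means.std(ddof=1)

# Hypothetical 200-year control-run GMST anomaly series for two models:
rng = np.random.default_rng(0)
model_a = rng.normal(0, 0.12, 200)    # placeholder annual anomalies (K)
model_b = rng.normal(0, 0.20, 200)    # placeholder with larger year-to-year variability

print(interdecadal_std(model_a), interdecadal_std(model_b))
```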

Video Summary of my PhD Dissertation

[Embedded video]

What do historical temperature records tell us about natural variability in global temperature?

I have published an article, written for a general audience, summarizing the results of our 2015 Scientific Reports study.

Cloud feedback necessary for a basin-scale AMO


We have recently published a study in Geophysical Research Letters titled “The necessity of cloud feedback for a basin-scale Atlantic Multidecadal Oscillation”.

The Atlantic Multidecadal Oscillation (AMO) – a basin-scale coherent oscillation of sea surface temperatures over the North Atlantic – is thought to be one of the climate system’s most important modes of natural variability, affecting everything from drought to hurricane activity to natural fluctuations in global temperature. Traditionally, the basin-scale AMO has been explained as a direct consequence of variability in the Atlantic Ocean’s meridional overturning circulation (AMOC). In contrast, our study identifies atmospheric processes, specifically cloud feedback, as a necessary component for the existence of a basin-scale AMO, thus amending the canonical view of the AMO as a signature directly and solely attributable to oceanic processes.
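
For context, a basin-scale AMO index is conventionally computed as an area-weighted North Atlantic mean sea surface temperature anomaly that is then detrended and smoothed. The sketch below illustrates that conventional recipe; the box limits, the smoothing window, and the `amo_index` helper are common illustrative choices and are not taken from our paper:

```python
# Minimal sketch of a conventional AMO index: the area-weighted North Atlantic
# mean SST anomaly (roughly 0-60N, 80W-0), linearly detrended and smoothed.
import numpy as np

def amo_index(sst, lats, lons, years, smooth=10):
    """sst: (time, lat, lon) SST anomalies with land as NaN; lats/lons in degrees
    (lons in -180..180); returns a linearly detrended, smoothed AMO index."""
    lat_mask = (lats >= 0) & (lats <= 60)
    lon_mask = (lons >= -80) & (lons <= 0)
    box = sst[:, lat_mask][:, :, lon_mask]                        # North Atlantic box
    w = np.cos(np.deg2rad(lats[lat_mask]))[:, None] * np.ones(lon_mask.sum())
    w = np.where(np.isnan(box[0]), 0.0, w)                        # zero weight on land
    natl = np.nansum(box * w, axis=(1, 2)) / w.sum()              # area-weighted mean
    natl = natl - np.polyval(np.polyfit(years, natl, 1), years)   # remove linear trend
    kernel = np.ones(smooth) / smooth
    return np.convolve(natl, kernel, mode="same")                 # ~decadal smoothing
```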

The stability of unforced global temperature – In plain English

We have published new research that shows in detail why the Earth’s temperature remains stable when it is not pushed by outside forcings. Below is a summary in plain English. For a more technical discussion see here.

  • The study is all about what climate does when it is not pushed by what we call external drivers
    • External drivers (or forcings) are things like changes in the amount of energy coming in from the sun or changes in the amount of greenhouse gases in the atmosphere.
  • You might expect (and many people simply assume) that the climate should be stable when it is not pushed by these external drivers
    • What our study did was investigate this assumption in a lot of detail, and it turns out it’s not quite so simple
  • Why is it not so simple? Many locations on Earth experience positive feedbacks between temperature and absorbed energy. For example, if there is some natural warming in a region with sea ice, some of that sea ice will melt; melting the sea ice causes more solar energy to be absorbed, which causes more warming and more melting. It turns out these types of positive feedbacks are working all over the surface of the planet.
  • So the question then becomes: If the Earth gets warmer naturally due to something like an El Niño event, what’s stopping it from just continuing to warm? Can it cool itself back down? If so, how?
  • The study looks at this in detail and finds that some very interesting things are going on that allow the Earth to cool itself down after one of these unforced natural warming events:
    • It turns out that the earth naturally transports energy away from locations where there are positive feedbacks to locations where there are negative feedbacks.
    • Also the atmosphere rearranges clouds and water vapor in a way that allow much more energy to escape than we would expect otherwise.
  • These things are scientifically interesting, but the bottom line for the general public is that the Earth is able to cool itself down after an unforced natural warming event like an El Niño. Thus, in order for the Earth to have sustained warming over multiple decades to a century, you need these external drivers (or forcings) like the increase in greenhouse gases. This undermines the popular skeptic idea that the climate just drifts randomly from warm to cold and back again over many decades to centuries in an unpredictable manner.

The stability of unforced global temperature – Technical Discussion


We have published new research with implications for why global mean surface air temperature (GMT) is stable in the absence of external radiative forcings.

One of the central differences between weather prediction and climate projection is that the former is considered to be an “initial value problem” and the latter is considered to be a “forced boundary condition problem” (1). This dichotomy implies that weather is subject to chaotic variability and thus is fundamentally unpredictable beyond several weeks but climate can be projected into the future with some confidence as long as changes in the boundary conditions of the system are known (2). For GMT, the fundamental boundary conditions are the system’s basic radiative properties, i.e., the incoming solar radiation, planetary albedo, and the atmospheric infrared transmissivity governed by greenhouse gas concentrations (3).

In reality, however, the forced boundary condition paradigm is complicated by nonlinear, two-way interactions within the climate system. In particular, planetary albedo and the greenhouse effect are themselves complex functions of GMT and its spatial distribution (4). Therefore, if GMT is to be projected to within some fairly narrow range for a given change in boundary conditions (i.e., an increase in the greenhouse effect), it must be the case that the climate system can damp any unforced (internally generated) GMT perturbations. This idea becomes clearer when GMT evolution is expressed as a perturbation away from an equilibrium value set by the boundary conditions (ΔT = GMT − GMTequilibrium). The evolution of ΔT can be expressed as the sum of forcings (F), feedbacks (λΔT), and heat fluxes between the upper ocean’s mixed layer and the ocean below the mixed layer (Q),

$$C\,\frac{d\Delta T}{dt} = F + \lambda\,\Delta T + Q \qquad [1]$$

In this formulation, F often represents external radiative forcings (e.g., changes in well-mixed greenhouse gases, aerosol loading, incoming solar radiation, etc.); however, here we are concerned with the stability of ΔT in the absence of external forcings, so F represents unforced energy imbalances at the top of the atmosphere (TOA) (5). C is the effective heat capacity of the land/atmosphere/ocean-mixed-layer system, λ is the feedback parameter (the reciprocal of the climate sensitivity parameter) and λΔT represents the radiative fast-feedbacks (positive downward) (6-11).

It is accepted that ΔT should be stable in the long run mostly because of the direct blackbody response of outgoing longwave radiation to ΔT change, which is often referred to as the Planck Response,

$$\lambda_{Planck} = -4\,\sigma\,T_e^{3} \qquad [2]$$

where Te is the effective radiating temperature of the Earth (≈255 K) and σ is the Stefan-Boltzmann constant (12). The negative sign indicates increased energy loss by the climate system with warming. λPlanck is typically incorporated into λ in [1] as the reference sensitivity, e.g., λ = (1 − fa)λPlanck, where fa denotes the feedback factor sum of the fast-feedbacks in the system (i.e., water vapor, lapse rate, surface albedo, and cloud feedbacks) (7). Net positive fast-feedbacks imply fa > 0. Therefore, larger positive fast feedbacks imply a less negative λ and a climate system that is less effective at damping ΔT anomalies. To make this idea more explicit, [1] can be discretized and rearranged to obtain,

$$\Delta T_{t+1} = \theta\,\Delta T_{t} + \varepsilon_{t} \qquad [3]$$

where,

$$\theta = 1 + \frac{\lambda\,\Delta t}{C} \qquad [4]$$

and

$$\varepsilon_{t} = \frac{\left(F + Q\right)\Delta t}{C} \qquad [5]$$

Now ΔT evolution is explicitly represented as a first-order autoregressive function. In this form, θ is the autoregressive parameter that can be thought of as a quantitative representation of the restoring force, or stability of ΔT. Thus it becomes evident that the relative magnitude of the fast-feedbacks in the system will play a central role in our ability to consider GMT as a forced boundary condition problem. In particular, when θ is positive, but << 1, the system experiences a strong restoring force and ΔT is heavily damped. However, when the feedbacks in the climate system nearly overwhelm the Planck Response (fa → 1, θ → 1), the restoring force for ΔT disappears. With no restoring force, ΔT would be free to evolve in a chaotic and unpredictable manner comparable to Brownian motion or a “random walk” (13). In this case, GMT could be considered to be “intransitive” (14) and it might be better categorized as an initial value problem than as a forced boundary condition problem.
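
To make the role of θ concrete, the sketch below iterates equations [3]-[5] with illustrative parameter values; the heat capacity, the noise amplitude, and the `simulate` helper are assumptions for illustration rather than values from the paper:

```python
# Minimal sketch of equations [3]-[5]: unforced GMT anomalies simulated as an
# AR(1) process.  With a strongly negative net feedback (theta well below 1)
# anomalies are damped; as the feedback factor fa approaches 1, theta approaches
# 1 and the series behaves more like a random walk.  Parameters are illustrative.
import numpy as np

sigma_sb = 5.67e-8                      # Stefan-Boltzmann constant (W m-2 K-4)
Te = 255.0                              # effective radiating temperature (K)
lambda_planck = -4 * sigma_sb * Te**3   # Planck response, ~ -3.8 W m-2 K-1
dt = 3.15e7                             # one year (s)
C = 2.8e8                               # assumed heat capacity, ~70 m ocean mixed layer (J m-2 K-1)

def simulate(fa, n_years=1000, flux_std=1.0, seed=0):
    """Integrate eq. [3] with theta from eq. [4] and noise from eq. [5].
    fa: assumed fast-feedback factor; flux_std: assumed std of (F + Q) in W m-2."""
    lam = (1 - fa) * lambda_planck                     # net feedback parameter
    theta = 1 + lam * dt / C                           # eq. [4]
    rng = np.random.default_rng(seed)
    eps = rng.normal(0, flux_std, n_years) * dt / C    # eq. [5]
    dT = np.zeros(n_years)
    for t in range(n_years - 1):
        dT[t + 1] = theta * dT[t] + eps[t]             # eq. [3]
    return theta, dT

for fa in (0.0, 0.6, 0.95):                            # no feedbacks, moderate, near-runaway
    theta, dT = simulate(fa)
    print(f"fa = {fa:.2f}  theta = {theta:.2f}  std(dT) = {dT.std():.2f} K")
```

With fa = 0, θ is well below 1 and the simulated anomalies are strongly damped; with fa = 0.95, θ is close to 1 and the anomalies wander much more freely.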

Consequently, most of modern climate science rests critically on the notion that the Planck Response overwhelms positive radiative fast-feedbacks in the climate system. In our paper, however, we document that at the local level, positive fast-feedbacks actually overwhelm the Planck Response over most of the surface of the Earth. The objective of the paper was to investigate how this finding can be reconciled with an apparently stable GMT at the global spatial scale.

We resolved this apparent paradox by showing that an anomalously warm Earth tends to restore equilibrium in complex and previously unappreciated ways. Our study shows in detail why global temperature should be stable in the absence of external forcings and therefore why global temperature does not evolve chaotically in the long run. Therefore this work explains why large, sustained, changes in global temperature require external radiative forcings like increases in greenhouse gas concentrations.

We focused our analysis on 27 Atmosphere-Ocean General Circulation Models (AOGCMs) from the Coupled Model Intercomparison Project – Phase 5 (CMIP5) (15). We utilized unforced preindustrial control runs which, by definition, include no external radiative forcings; thus all variability emerged spontaneously from the internal dynamics of the modeled climate system. We used the first 200 years of each AOGCM’s preindustrial control run and we linearly detrended all analyzed variables so that our analysis was not contaminated with possibly unphysical model drift that may have been a result of insufficient model spin-up. Because this detrending procedure forced the AOGCM runs to be stable over the 200-year period, we are implicitly studying the restoring force for ΔT relative to any 200-year trend present in the control runs. Consequently, we are limited to studying the physical explanation for the stability of ΔT at timescales shorter than 200 years.
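
As a simple illustration of this preprocessing step, the sketch below removes a linear fit from the first 200 years of a hypothetical annual-mean control-run series:

```python
# Minimal sketch of the detrending step described above: remove a linear fit
# from the first 200 years of an annual-mean control-run series so that slow
# model drift does not contaminate the variability analysis.
import numpy as np

def detrend_200yr(series):
    """Linearly detrend the first 200 values of an annual time series."""
    x = np.arange(200)
    y = np.asarray(series)[:200]
    trend = np.polyval(np.polyfit(x, y, 1), x)
    return y - trend

# Hypothetical control-run GMST with a small spurious drift:
rng = np.random.default_rng(1)
gmst = 287.0 + 0.001 * np.arange(250) + rng.normal(0, 0.1, 250)
anom = detrend_200yr(gmst)
print(anom.mean(), anom.std())
```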

For more information see the AGU poster or the paper:

Brown, P.T., W. Li, J.H. Jiang, H. Su (2016) Unforced surface air temperature variability and its contrasting relationship with the anomalous TOA energy flux at local and global spatial scales. Journal of Climate, doi:10.1175/JCLI-D-15-0384.1

References:

  1. Kirtman et al. (2013) Near-term Climate Change: Projections and Predictability. In: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA).
  2. Hawkins E & Sutton R (2009) The Potential to Narrow Uncertainty in Regional Climate Predictions. Bulletin of the American Meteorological Society 90(8):1095-1107.
  3. Sagan C & Mullen G (1972) Earth and Mars: Evolution of Atmospheres and Surface Temperatures. Science 177(4043):52-56.
  4. Armour KC, Bitz CM, & Roe GH (2012) Time-Varying Climate Sensitivity from Regional Feedbacks. Journal of Climate 26(13):4518-4534.
  5. Brown PT, Li W, Li L, & Ming Y (2014) Top-of-Atmosphere Radiative Contribution to Unforced Decadal Global Temperature Variability in Climate Models. Geophysical Research Letters:2014GL060625.
  6. Wigley TML & Schlesinger ME (1985) Analytical solution for the effect of increasing CO2 on global mean temperature. Nature 315(6021):649-652.
  7. Baker MB & Roe GH (2009) The Shape of Things to Come: Why Is Climate Change So Predictable? Journal of Climate 22(17):4574-4589.
  8. Geoffroy O, et al. (2012) Transient Climate Response in a Two-Layer Energy-Balance Model. Part I: Analytical Solution and Parameter Calibration Using CMIP5 AOGCM Experiments. Journal of Climate 26(6):1841-1857.
  9. Held IM, et al. (2010) Probing the Fast and Slow Components of Global Warming by Returning Abruptly to Preindustrial Forcing. Journal of Climate 23(9):2418-2427.
  10. Wigley TML & Raper SCB (1990) Natural variability of the climate system and detection of the greenhouse effect. Nature 344(6264):324-327.
  11. Dickinson RE (1981) Convergence Rate and Stability of Ocean-Atmosphere Coupling Schemes with a Zero-Dimensional Climate Model. Journal of the Atmospheric Sciences 38(10):2112-2120.
  12. Hansen J, Lacis A, Rind D, Russell G, Stone P, et al. (1984) Climate sensitivity: Analysis of feedback mechanisms. In Climate Processes and Climate Sensitivity, ed. JE Hansen & T Takahashi, Geophys. Monogr. Ser. (Washington, DC: Am. Geophys. Union), pp 130-163.
  13. Hasselmann K (1976) Stochastic climate models Part I. Theory. Tellus 28(6):473-485.
  14. Lorenz E (1968) Climatic Determinism. Meteor. Monographs, Amer. Meteor. Soc. 25:1-3.
  15. Taylor KE, Stouffer RJ, & Meehl GA (2011) An Overview of CMIP5 and the Experiment Design. Bulletin of the American Meteorological Society 93(4):485-498.

2015 Record Warmth: Update to Our Recent Analysis

This is an update to our 2015 Scientific Reports paper: Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise. The paper used a novel statistical estimate of unforced variability that was derived from reconstructed and instrumental surface temperature records. We used our statistical estimate of unforced variability to aid in our interpretation of recently observed temperature variability (more info here).

Our paper used global temperature data through 2013 since that was the most recent year available in the major global temperature datasets at the time the paper was submitted. Below I update Figures 2 and 3 from the paper, incorporating the back-to-back record-breaking warmth of 2014 and 2015.

Figure 2 updated to include 2014 and 2015.

Figure 3 updated to include 2014 and 2015.

The summary section of our paper stated:

We find that the interdecadal variability in the rate of global warming over the 20th century (i.e., acceleration from ~1910–1940, deceleration until ~1975, acceleration until ~2000) is within the 2.5–97.5% EUN, even if the forced signal is represented as a linear trend, indicating that this observed interdecadal variability in the rate of warming does not necessarily require interdecadal variability in the rate-of-increase of the forced signal.

This statement was about 20th century temperature, so the updates for 2014 and 2015 are somewhat irrelevant to it. Nevertheless, the updated Figure 2 (bottom left panel) indicates that recent warmth is just now starting to emerge from a linear-trend null hypothesis. This is not to say that a linear trend is the most likely representation of the forced component of variability – it just means that a linear-trend forced component could not quite be ruled out. This is now starting to change as observations move above the 97.5th percentile of the unforced range.
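
Schematically, this kind of envelope test looks like the sketch below, which adds many realizations of simple AR(1) noise (a stand-in for the paper’s empirically derived unforced-noise estimate) to a hypothetical linear forced signal and counts how often placeholder observations fall outside the 2.5-97.5% range:

```python
# Minimal sketch of an envelope test: compare observations with the 2.5th-97.5th
# percentile range of a forced signal plus unforced noise.  AR(1) noise and the
# linear forced signal here are illustrative stand-ins only.
import numpy as np

def noise_envelope(n_years, n_real=10000, theta=0.6, noise_std=0.1, seed=0):
    """Per-year 2.5th and 97.5th percentiles of AR(1) unforced-noise realizations."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(0, noise_std, (n_real, n_years))
    x = np.zeros((n_real, n_years))
    for t in range(n_years - 1):
        x[:, t + 1] = theta * x[:, t] + eps[:, t]
    return np.percentile(x, [2.5, 97.5], axis=0)

years = np.arange(1900, 2017)
forced = 0.008 * (years - years[0])                                   # hypothetical linear forced signal (K)
obs = forced + np.random.default_rng(1).normal(0, 0.12, years.size)   # placeholder observations
lo, hi = noise_envelope(years.size)
outside = (obs > forced + hi) | (obs < forced + lo)
print("years outside the 2.5-97.5% envelope:", int(outside.sum()))
```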

The summary section also stated:

We also find that recently observed GMT values, as well as trends, are near the lower bounds of the EUN for a forced signal corresponding to the RCP 8.5 emissions scenario but that observations are not inconsistent with a forced signal corresponding to the RCP 6.0 emissions scenario.

Note that we were not making a forecast about how likely the RCP 8.5 emissions scenario was. Instead, we were using the multi-model mean warming associated with the RCP 8.5 emissions scenario (out to 2050) as a representation of the quickest rate of forced warming that could conceivably be occurring over the recent past (see here and here for further clarification).

Figure 3 indicates that, with the updated data, no trend over the past 25 years falls outside of the 5-95% range for any of the scenarios. The trends over the most recent ~5 years are higher than average for all the scenarios but still well within the range of unforced variability. Over the past 10-20 years, observed trends have been on the lower end of the RCP 8.5 range but closer to the middle of the RCP 6.0 range. This indicates that over the past 10-20 years it may be more likely that we have been on an RCP 6.0-like warming trajectory than an RCP 8.5-like warming trajectory. This is similar to the conclusion of the original study.
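
The trend comparison behind Figure 3 can be sketched as follows; the `trailing_trends` helper and the placeholder observation series are illustrative only, and the resulting trends would then be compared against the model spread for each scenario:

```python
# Minimal sketch of the trend comparison: least-squares trends over trailing
# windows of 5-25 years ending in the most recent year.  Placeholder data only.
import numpy as np

def trailing_trends(anom, years, max_len=25):
    """Return {window length: trend in K/decade} for windows ending in the last year."""
    out = {}
    for L in range(5, max_len + 1):
        y, x = anom[-L:], years[-L:]
        slope = np.polyfit(x, y, 1)[0]
        out[L] = 10 * slope                 # convert K/yr to K/decade
    return out

years = np.arange(1950, 2016)
obs = 0.015 * (years - 1950) + np.random.default_rng(3).normal(0, 0.1, years.size)
print(trailing_trends(obs, years)[15])      # e.g. the 15-year trailing trend
```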

2015 Global Temperature vs. Models

2015 was the warmest year in the instrumental record (dating back to the mid/late 19th century) in all the major surface temperature datasets including NASA’s GISTEMP:

[Figure: NASA GISTEMP global temperature record through 2015]

However, 2015 still falls below the CMIP5 climate model mean value (left panel below). The difference between observations and the mean value from climate models is often used as an estimate of the ‘unforced’ or ‘internal’ variability in global temperature (right panel below). It is apparent from this estimate that there was an unforced cooling event from ~1998 to ~2013. Thus the 2015 record temperature does not ‘erase’ the hiatus – it is totally legitimate to study why observations diverged from the model mean over this time period.

[Figure: observed global temperature compared with the CMIP5 multi-model mean (left panel) and the difference between the two (right panel)]
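
As a schematic of this residual-based estimate, the sketch below subtracts a placeholder multi-model mean from placeholder observations and computes the trend of the residual over 1998-2013:

```python
# Minimal sketch: approximate the unforced component as observations minus the
# CMIP5 multi-model mean, then inspect the ~1998-2013 segment.  Placeholders only.
import numpy as np

years = np.arange(1880, 2016)
obs = np.random.default_rng(2).normal(0, 0.1, years.size)   # placeholder observed anomalies (K)
model_mean = np.zeros(years.size)                           # placeholder CMIP5 multi-model mean (K)

unforced_est = obs - model_mean                             # right-panel-style residual
seg = (years >= 1998) & (years <= 2013)
slope = np.polyfit(years[seg], unforced_est[seg], 1)[0]
print(f"1998-2013 residual trend: {10 * slope:+.3f} K per decade")   # negative => unforced cooling
```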

Because of the ongoing El Niño event, 2016 will likely be even warmer than 2015, and thus 2016 may be above the climate model mean value for the first time since 1998. It will be very interesting to see what happens in 2017 and 2018. When neutral or La Niña conditions return, will observations keep up with the steep rate of warming predicted by climate models?
