Cloud feedback necessary for a basin-scale AMO

We have recently published a study in Geophysical Research Letters titled “The necessity of cloud feedback for a basin-scale Atlantic Multidecadal Oscillation“.

The Atlantic Multidecadal Oscillation (AMO) – a basin-scale coherent oscillation of sea surface temperatures over the North Atlantic – is thought to be one of the climate system's most important modes of natural variability, affecting everything from drought to hurricane activity to natural fluctuations in global temperature. Traditionally, the basin-scale AMO has been explained as a direct consequence of variability in the Atlantic Ocean's meridional overturning circulation (AMOC). In contrast, our study identifies atmospheric processes, specifically cloud feedback, as a necessary component for the existence of a basin-scale AMO, thus amending the canonical view of the AMO as a signature directly and solely attributable to oceanic processes.


The stability of unforced global temperature – In plain English

We have newly published research that shows in detail why the earth's temperature remains stable when it is not pushed by outside forcings. Below is a summary in plain English. For a more technical discussion see here.

  • The study is all about what climate does when it is not pushed by what we call external drivers
    • External drivers (or forcings) are things like changes in the amount of energy coming in from the sun or changes in the amount of greenhouse gasses in the atmosphere.
  • You might expect (and many people simply assume) that the climate should be stable when it is not pushed by these external drivers
    • What our study did was investigate this assumption in a lot of detail, and it turns out it's not quite so simple
  • Why is it not so simple? Many locations on earth experience positive feedbacks between temperature and absorbed energy. For example, if you have some natural warming where there is sea ice, you will melt some of the sea ice, melting the sea ice will cause more solar energy to be absorbed which will cause more warming and more melting. It turns out these types of positive feedbacks are working all over the surface of the planet.
  • So the question then becomes: If the Earth gets warmer naturally due to something like an El Niño event, what's stopping it from just continuing to warm? Can it cool itself back down? If so, how?
  • The study looks at this in detail and finds that some very interesting things are going on that allow the Earth to cool itself down after one of these unforced natural warming events:
    • It turns out that the earth naturally transports energy away from locations where there are positive feedbacks to locations where there are negative feedbacks.
    • Also the atmosphere rearranges clouds and water vapor in a way that allow much more energy to escape than we would expect otherwise.
  • These things are scientifically interesting, but the bottom line that the general public should understand is that the earth is able to cool itself down after an unforced natural warming event like an El Niño, and thus, in order for the earth to have sustained warming over multiple decades to a century, you need these external drivers (or forcings) like the increase in greenhouse gasses. This undermines the popular skeptic idea that the climate just drifts randomly from warm to cold and back again over many decades to centuries in an unpredictable manner.

The stability of unforced global temperature – Technical Discussion


We have newly published research that has implications for why global mean surface air temperature (GMT) is stable in the absence of external radiative forcings.

One of the central differences between weather prediction and climate projection is that the former is considered to be an “initial value problem” and the latter is considered to be a “forced boundary condition problem” (1). This dichotomy implies that weather is subject to chaotic variability and thus is fundamentally unpredictable beyond several weeks but climate can be projected into the future with some confidence as long as changes in the boundary conditions of the system are known (2). For GMT, the fundamental boundary conditions are the system’s basic radiative properties, i.e., the incoming solar radiation, planetary albedo, and the atmospheric infrared transmissivity governed by greenhouse gas concentrations (3).

In reality, however, the forced boundary condition paradigm is complicated by nonlinear, two-way, interactions within the climate system. In particular, planetary albedo and the greenhouse effect are themselves complex functions of GMT and its spatial distribution (4). Therefore, if GMT is to be projected to within some fairly narrow range for a given change in boundary conditions (i.e., an increase in the greenhouse effect), it must be the case that the climate system can damp any unforced (internally generated) GMT perturbations. This idea becomes clearer when GMT evolution is expressed as a perturbation away from an equilibrium value set by the boundary conditions (ΔT=GMT-GMTequilibrium). ΔT change can be expressed as the sum of forcings (F), feedbacks (λΔT) and heat fluxes between the upper ocean’s mixed layer and the ocean below the mixed layer (Q),

C dΔT/dt = F + λΔT + Q,                                                                                            [1]

In this formulation, F often represents external radiative forcings (e.g., changes in well-mixed greenhouse gasses, aerosol loading, incoming solar radiation, etc.), however, here we are concerned with the stability of ΔT in the absence of external forcings so F represents unforced energy imbalances at the top of the atmosphere (TOA) (5). C is the effective heat capacity of the land/atmosphere/ocean-mixed-layer system, λ is the feedback parameter (the reciprocal of the climate sensitivity parameter) and λΔT represents the radiative fast-feedbacks (positive downward) (6-11).

It is accepted that ΔT should be stable in the long run mostly because of the direct blackbody response of outgoing longwave radiation to ΔT change, which is often referred to as the Planck Response,

λPlanck = −4σTe³,                                                                                [2]

where Te is the effective radiating temperature of the Earth (≈255K) and σ is the Stefan-Boltzmann constant (12). The negative sign indicates increased energy loss by the climate system with warming. λPlanck is typically incorporated into λ in [1] as the reference sensitivity, e.g., λ = (1−fa)λPlanck, where fa denotes the feedback factor sum of the fast-feedbacks in the system (i.e., water vapor, lapse rate, surface albedo, and cloud feedbacks) (7). Net positive fast-feedbacks imply fa > 0. Therefore, larger positive fast feedbacks imply a less negative λ and a climate system that is less effective at damping ΔT anomalies. To make this idea more explicit, [1] can be discretized and rearranged to obtain,

ΔT(t+1) = θ ΔT(t) + ε(t)                                                                                          [3]

where,

θ = 1 + λΔt/C                                                                                                                               [4]

and

ε(t) = (F(t) + Q(t))Δt/C,                                                                                               [5]

with Δt denoting the length of the discrete time step.

Now ΔT evolution is explicitly represented as a first-order autoregressive function. In this form, θ is the autoregressive parameter that can be thought of as a quantitative representation of the restoring force, or stability of ΔT. Thus it becomes evident that the relative magnitude of the fast-feedbacks in the system will play a central role in our ability to consider GMT as a forced boundary condition problem. In particular, when θ is positive, but << 1, the system experiences a strong restoring force and ΔT is heavily damped. However, when the feedbacks in the climate system nearly overwhelm the Planck Response (fa → 1, θ → 1), the restoring force for ΔT disappears. With no restoring force, ΔT would be free to evolve in a chaotic and unpredictable manner comparable to Brownian motion or a “random walk” (13). In this case, GMT could be considered to be “intransitive” (14) and it might be better categorized as an initial value problem than as a forced boundary condition problem.
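To make the role of θ concrete, here is a minimal sketch (not code from the paper) of the AR(1) behavior described above. The heat capacity, time step, noise amplitude, and feedback factors below are illustrative assumptions chosen only to show how the damping weakens as fa approaches 1:

```python
# Minimal sketch (not from the paper): the AR(1) restoring force on global
# temperature anomalies weakens as the fast-feedback sum f_a approaches 1.
# All parameter values are illustrative assumptions, not the study's.
import numpy as np

sigma_sb = 5.67e-8                       # Stefan-Boltzmann constant (W m^-2 K^-4)
T_e = 255.0                              # effective radiating temperature (K)
lambda_planck = -4 * sigma_sb * T_e**3   # Planck feedback, about -3.8 W m^-2 K^-1

C = 4.2e8                                # assumed heat capacity (J m^-2 K^-1), ~100 m of ocean
dt = 3.15e7                              # assumed time step: one year, in seconds

for f_a in (0.0, 0.7, 1.0):
    lam = (1 - f_a) * lambda_planck      # net feedback parameter, lambda = (1 - fa) * lambda_Planck
    theta = 1 + lam * dt / C             # autoregressive parameter, as in Eq. [4]
    rng = np.random.default_rng(0)
    dT = np.zeros(500)
    for t in range(1, 500):
        dT[t] = theta * dT[t - 1] + rng.normal(0.0, 0.1)   # Eq. [3] with random unforced noise
    print(f"f_a = {f_a:.1f}  theta = {theta:.3f}  std(dT) = {dT.std():.2f} K")
```

With fa well below 1, θ stays comfortably below 1 and the anomalies remain tightly damped; as fa approaches 1, θ approaches 1 and the simulated series behaves like the undamped random walk described above.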

Consequently, most of modern climate science rests critically on the notion that the Planck Response overwhelms positive radiative fast-feedbacks in the climate system. In our paper, however, we document that at the local level, positive fast-feedbacks actually overwhelm the Planck Response over most of the surface of the Earth. The objective of the paper was to investigate how this finding can be reconciled with an apparently stable GMT at the global spatial scale.

We resolved this apparent paradox by showing that an anomalously warm Earth tends to restore equilibrium in complex and previously unappreciated ways. Our study shows in detail why global temperature should be stable in the absence of external forcings and therefore why global temperature does not evolve chaotically in the long run. Therefore this work explains why large, sustained, changes in global temperature require external radiative forcings like increases in greenhouse gas concentrations.

We focused our analysis on 27 Atmosphere-Ocean General Circulation Models (AOGCMs) from the Coupled Model Intercomparison Project – Phase 5 (CMIP5) (15). We utilized unforced preindustrial control runs which, by definition, include no external radiative forcings, so all variability emerged spontaneously from the internal dynamics of the modeled climate system. We used the first 200 years of each AOGCM's preindustrial control run and we linearly detrended all analyzed variables so that our analysis was not contaminated with possibly unphysical model drift that may have been a result of insufficient model spin-up. Because this detrending procedure forced the AOGCM runs to be stable over the 200-year period, we are implicitly studying the restoring force for ΔT relative to any 200-year trend present in the control runs. Consequently, we are limited to studying the physical explanation for the stability of ΔT at timescales shorter than 200 years.
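As a concrete illustration of the preprocessing described above, linear detrending simply removes a least-squares line from each 200-year series. A minimal sketch with synthetic data (not the actual CMIP5 output) might look like this:

```python
# Sketch of linearly detrending a 200-year annual series to remove possible
# model drift (synthetic data for illustration; not the CMIP5 control runs).
import numpy as np

years = np.arange(200)
rng = np.random.default_rng(1)
gmt = 0.002 * years + rng.normal(0.0, 0.1, size=200)   # small drift + unforced noise

slope, intercept = np.polyfit(years, gmt, deg=1)       # fit the 200-year linear trend
gmt_detrended = gmt - (slope * years + intercept)      # remove it before analysis

print(f"removed trend: {slope * 100:.3f} K per century")
```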

For more information see the AGU poster or the paper:

Brown, P.T., W. Li, J.H. Jiang, H. Su (2016) Unforced surface air temperature variability and its contrasting relationship with the anomalous TOA energy flux at local and global spatial scales. Journal of Climate, doi:10.1175/JCLI-D-15-0384.1

References:

  1. Kirtman et al. (2013) Near-term Climate Change: Projections and Predictability. In: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA).
  2. Hawkins E & Sutton R (2009) The Potential to Narrow Uncertainty in Regional Climate Predictions. Bulletin of the American Meteorological Society 90(8):1095-1107.
  3. Sagan C & Mullen G (1972) Earth and Mars: Evolution of Atmospheres and Surface Temperatures. Science 177(4043):52-56.
  4. Armour KC, Bitz CM, & Roe GH (2012) Time-Varying Climate Sensitivity from Regional Feedbacks. Journal of Climate 26(13):4518-4534.
  5. Brown PT, Li W, Li L, & Ming Y (2014) Top-of-Atmosphere Radiative Contribution to Unforced Decadal Global Temperature Variability in Climate Models. Geophysical Research Letters:2014GL060625.
  6. Wigley TML & Schlesinger ME (1985) Analytical solution for the effect of increasing CO2 on global mean temperature. Nature 315(6021):649-652.
  7. Baker MB & Roe GH (2009) The Shape of Things to Come: Why Is Climate Change So Predictable? Journal of Climate 22(17):4574-4589.
  8. Geoffroy O, et al. (2012) Transient Climate Response in a Two-Layer Energy-Balance Model. Part I: Analytical Solution and Parameter Calibration Using CMIP5 AOGCM Experiments. Journal of Climate 26(6):1841-1857.
  9. Held IM, et al. (2010) Probing the Fast and Slow Components of Global Warming by Returning Abruptly to Preindustrial Forcing. Journal of Climate 23(9):2418-2427.
  10. Wigley TML & Raper SCB (1990) Natural variability of the climate system and detection of the greenhouse effect. Nature 344(6264):324-327.
  11. Dickinson RE (1981) Convergence Rate and Stability of Ocean-Atmosphere Coupling Schemes with a Zero-Dimensional Climate Model. Journal of the Atmospheric Sciences 38(10):2112-2120.
  12. Hansen J, Lacis A, Rind D, Russell G, Stone P, et al. (1984) Climate sensitivity: Analysis of feedback mechanisms. In Climate Processes and Climate Sensitivity, eds Hansen JE & Takahashi T, Geophys. Monogr. Ser. (Am. Geophys. Union, Washington, DC), pp 130-163.
  13. Hasselmann K (1976) Stochastic climate models Part I. Theory. Tellus 28(6):473-485.
  14. Lorenz E (1968) Climatic Determinism. Meteor. Monographs, Amer. Meteor. Soc. 25:1-3.
  15. Taylor KE, Stouffer RJ, & Meehl GA (2011) An Overview of CMIP5 and the Experiment Design. Bulletin of the American Meteorological Society 93(4):485-498.

2015 Record Warmth: Update to Our Recent Analysis

This is an update to our 2015 Scientific Reports paper: Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise. The paper used a novel statistical estimate of unforced variability that was derived from reconstructed and instrumental surface temperature records. We used our statistical estimate of unforced variability to aid in our interpretation of recently observed temperature variability (more info here).

Our paper used global temperature data through 2013 since that was the most recent year in the major global temperature datasets at the time that the paper was submitted. Below I update Figures 2 and 3 from the paper, incorporating the back-to-back record breaking warmth of 2014 and 2015.

Figure 2 updated to include 2014 and 2015.


Figure 3 updated to include 2014 and 2015.

The summary section of our paper stated:

We find that the interdecadal variability in the rate of global warming over the 20th century (i.e., acceleration from ~1910–1940, deceleration until ~1975, acceleration until ~2000) is within the 2.5–97.5% EUN, even if the forced signal is represented as a linear trend, indicating that this observed interdecadal variability in the rate of warming does not necessarily require interdecadal variability in the rate-of-increase of the forced signal.

This statement was about 20th century temperature and thus updates for 2014 and 2015 are somewhat irrelevant. Nevertheless, the updated Figure 2 (bottom left panel) indicates that recent warmth is just now starting to emerge from a linear-trend null hypothesis. This is not to say that a linear trend is the most likely representation of the forced component of variability – it just means that the linear trend forced component can’t quite be ruled out. This is now starting to change as observations move above the 97.5th percentile of the unforced range.

The summary section also stated:

We also find that recently observed GMT values, as well as trends, are near the lower bounds of the EUN for a forced signal corresponding to the RCP 8.5 emissions scenario but that observations are not inconsistent with a forced signal corresponding to the RCP 6.0 emissions scenario.

Note that we were not making a forecast about how likely the RCP 8.5 emissions scenario was. Instead, we were using the multi-model mean warming associated with the RCP 8.5 emissions scenario (out to 2050) as a representation of the quickest rate of forced warming that could conceivably be occurring over the recent past (see here and here for further clarification).

The updated Figure 3 indicates that no trend over the past 25 years falls outside of the 5-95% range for any of the scenarios. The trends over the most recent ~5 years are higher than average for all the scenarios but still well within the range of unforced variability. Over the past 10-20 years, observed trends have been on the lower end of the RCP 8.5 range but closer to the middle of the RCP 6.0 range. This indicates that over the past 10-20 years it may be more likely that we have been on an RCP6.0-like warming trajectory than an RCP8.5-like warming trajectory. This is similar to the conclusion of the original study.


2015 Global Temperature vs. Models

2015 was the warmest year in the instrumental record (dating back to the mid/late 19th century) in all the major surface temperature datasets including NASA’s GISTEMP:

[Figure: NASA GISTEMP annual global mean temperature record.]

However, 2015 still falls below the CMIP5 climate model mean value (left panel below). The difference between observations and the mean value from climate models is often used as an estimate of the ‘unforced’ or ‘internal’ variability in global temperature (right panel below). It is apparent from this estimate that there was an unforced cooling event from ~1998 to ~2013. Thus the 2015 record temperature does not ‘erase’ the hiatus – it is totally legitimate to study why observations diverged from the model mean over this time period.

[Figure: Observed global temperature compared to the CMIP5 multi-model mean (left) and their difference, an estimate of unforced variability (right).]

Because of the ongoing El Niño event, 2016 will likely be even warmer than 2015 and thus 2016 may be above the climate model mean value for the first time since 1998. It will be very interesting to see what happens in 2017 and 2018. When neutral or La Niña conditions return, will observations keep up with the steep rate of warming predicted by climate models?


Heat waves: How much can be blamed on global warming depends on how you ask the question.

It is well established that human-caused increases in greenhouse gasses are working to increase the average surface temperature of the planet on long timescales1. This fact, however, means very little in terms of the consequences that climate change might have on human society. People are affected far more by local weather extremes than by any change in global average temperature. Therefore, the connection between extreme weather events (like floods, droughts, hurricanes, tornadoes, heat waves, etc.) and global warming has been of great interest to both scientists and the general public.

Any effect that global warming might have on extreme weather, however, is often difficult to ascertain. This is because extreme weather events tend to be influenced by a myriad of factors in addition to the average surface temperature. Hurricanes, for example, should tend to increase in strength as seas become warmer2 but we also expect that changes in wind shear3 (the change in wind direction with height) should cause a reduction in hurricane frequency4.

There are similar countering factors that must be weighed when assessing global warming’s impact on floods, droughts, and tornadoes. One type of extreme weather event, however, can be connected to global warming in a relatively straightforward manner: heat waves. Increasing greenhouse gasses have a direct effect on the probability distribution of surface temperatures at any given location. This means that when a heat wave occurs, it is safe to assume that global warming did have some impact on the event. How much of an impact, however, depends largely on how you frame the question.

Let's say that you live in a location that happens to experience a particular month in which temperatures are far above average. Let's further imagine that three scientists assess the contribution from global warming and their findings are reported in three news stories that use the following headlines:

Headline A: Scientist finds that global warming increased the odds of the recent heat wave by only 0.25%. 

Headline B: Scientist finds that recent heat wave was due 71% to natural variability and due 29% to global warming.  

Headline C: Scientist finds that global warming has made heat waves like the recent one occur 23 times more often than they would have otherwise.

These three headlines seem to be incompatible and one might think that the three scientists fundamentally disagree on global warming's role in the heat wave. After all, Headline A makes it sound like global warming played a minuscule role, Headline B makes it sound like global warming played a minor but appreciable role, and Headline C makes it sound like global warming played an enormous role.

Perhaps surprisingly, these headlines are not mutually exclusive and they could all be technically correct in describing a particular heat wave. This article explores how these different sounding conclusions can be drawn from looking at the same data and asking slightly different questions.

The actual numbers for the headlines above correspond to a real event: the monthly average temperature of March 2012 in Durham, North Carolina5. I selected Durham for this example simply because it is where I live, and March 2012 was selected because it was the warmest month (relative to the average temperature for each month of the year) that Durham has experienced over the past several decades. Now let's look at the specifics of how each headline was calculated.

Headline B: Calculating global warming’s contribution to the magnitude of the heat wave.

I will begin by explaining Headline B since it is probably the most straightforward calculation of the three. The left panel of the figure below shows the monthly “temperature anomaly” for Durham from 1900 to 20136. The temperature anomaly is the difference between the observed temperature for each month and the long-term average for that month of the year. So a temperature anomaly of +3°C would mean that month was 3°C above average. I use temperature anomalies because heat waves are defined as periods of time when temperatures are unusually warm relative to the average for that location and time of year.

The red line in the left panel below is an estimate of long-term global warming in Durham7 which is calculated from physics-based numerical climate models8. The red line incorporates natural influences like changes in solar output and volcanic activity but virtually all of the long-term warming is attributable to human-caused increases in greenhouse gasses. When I use the term global warming in this article I am specifically referring to the long-term upward trajectory of the “baseline climate” illustrated by the red line in the left panel.

So what would the temperature in Durham have looked like if there had been no global warming? We can calculate this by subtracting the estimate of global warming (red line) from each month's temperature anomaly (black line). The result is shown in the right panel below. Notice how the right panel's "baseline climate" is flat, indicating that there was no underlying climate change in this hypothetical scenario and all temperature variability came from natural fluctuations9. We can see that March 2012 would still have been a hot month even without global warming but that it would not have been as hot.

[Figure: Durham monthly temperature anomalies, 1900–2013, with the estimated global warming signal (left) and with that signal removed (right).]

In fact, we can now see how headline B was calculated. If the total anomaly with global warming in March 2012 was +6°C and the contribution from natural variability was +4.25°C, then global warming contributed +1.75°C of the +6°C anomaly. To put it another way, the global warming contribution to the magnitude of the heat wave was 29% (1.75°C/6°C = 0.29) while the natural variability contribution to the magnitude of the heat wave was 71% (4.25°C/6°C = 0.71). It is interesting to notice that if March 2012 had been even hotter, then the contribution from global warming would actually have been less. Why? Because the contribution from global warming would have been the same (the red line would not change) so it would have been necessary for natural variability to have contributed even more to the magnitude of a hotter anomaly. For example, if March 2012 had been 8°C above average, then global warming would still have contributed 1.75°C which means global warming would only have contributed 1.75°C/8°C = 0.22 or 22% of the magnitude.
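The arithmetic behind Headline B is simple enough to condense into a few lines (using the illustrative anomaly values quoted above):

```python
# Headline B arithmetic using the anomaly values quoted above (illustrative).
total_anomaly = 6.0      # observed March 2012 anomaly (deg C)
gw_contribution = 1.75   # estimated global warming contribution (deg C)

natural = total_anomaly - gw_contribution                                   # +4.25 deg C
print(f"global warming share:      {gw_contribution / total_anomaly:.0%}")  # ~29%
print(f"natural variability share: {natural / total_anomaly:.0%}")          # ~71%

# If the same month had been even hotter, global warming's share would shrink:
print(f"share of a +8 deg C anomaly: {gw_contribution / 8.0:.0%}")          # ~22%
```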

Headline B quantifies how much global warming contributed to the magnitude of the heat wave (how hot the heat wave was), but let's now turn our attention to how much global warming contributed to the likelihood that the heat wave would have occurred in the first place.

Headlines A and C: Calculating global warming's influence on the change in the likelihood of the heat wave.

The conclusions of Headlines A and C sound the most different but arriving at these numbers actually requires very similar calculations. To make these types of calculations it is often assumed that, in the absence of global warming, temperature anomalies follow some kind of a probability distribution. Because it is the most familiar, I will use the example of the normal distribution (a.k.a. Gaussian or bell-curve distribution) below10.

[Figure: Example normal (bell-curve) probability distribution of monthly temperature anomalies.]

The next step is to notice how global warming has shifted the probability distribution over time11 (top panel below). This shows us how the +1.75°C change in the baseline temperature due to global warming has affected the probability of observing different temperature anomalies. Actually, we can now see how Headline A was calculated. Without global warming, an anomaly of +6°C or warmer was very unlikely – its chance of occurring in any given month was about 0.0117%. Even if we consider that global warming shifted the mean of the distribution by +1.75°C, an anomaly of +6°C or greater was still very unlikely – its chance of occurring in any given month was about 0.26%. So global warming increased the chance of the March 2012 Durham heat wave by 0.26% – 0.0117% = ~0.25%.

That doesn't sound like a big change; however, this small shift in absolute probability translates into a big change in the expected frequency (how often such a heat wave should occur on average). The usual way to think about the expected frequency is to use the Return Time12, which is the average time that you would have to wait in order to observe an extreme at or above a certain level. The middle panel below shows the Return Times for Durham temperature anomalies both with and without global warming.

A probability of 0.0117% for a +6°C anomaly indicates that without global warming this would have been a once-in-8,547-months event (because 1/0.000117 ≈ 8,547). However, a probability of 0.26% for a +6°C anomaly indicates that with global warming this should be a once-in-379-months event (because 1/0.0026 ≈ 379). Now we can see where Headline C came from: global warming made the expected frequency 23 times larger (because 8,547/379 ≈ 23), so we expect to see a heat wave of this magnitude (or warmer) 23 times more often because of global warming.

[Figure: Shift of the temperature anomaly distribution due to global warming (top panel), Return Times with and without global warming (middle panel), and the change in likelihood as a function of anomaly magnitude (bottom panel).]
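For readers who want to reproduce the flavor of the Headline A and C calculations, here is a minimal sketch assuming normally distributed anomalies. The standard deviation is an illustrative assumption rather than the value fitted to the Durham record, so the numbers it prints will differ somewhat from those quoted above:

```python
# Sketch of the Headline A and C calculations assuming a normal distribution.
# The standard deviation is an assumed, illustrative value; the post's exact
# numbers come from a distribution fit to the Durham temperature record.
from scipy.stats import norm

sigma = 1.6       # assumed standard deviation of monthly anomalies (deg C)
shift = 1.75      # shift of the distribution mean due to global warming (deg C)
threshold = 6.0   # the March 2012 anomaly (deg C)

p_without = norm.sf(threshold, loc=0.0, scale=sigma)   # P(anomaly >= +6 C) without warming
p_with = norm.sf(threshold, loc=shift, scale=sigma)    # P(anomaly >= +6 C) with warming

print(f"Headline A: absolute change in probability = {p_with - p_without:.4%}")
print(f"Return time without warming: {1 / p_without:,.0f} months")
print(f"Return time with warming:    {1 / p_with:,.0f} months")
print(f"Headline C: such heat waves become {p_with / p_without:.0f} times more frequent")
```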

In fact, from the bottom panel above we can see that the more extreme the heat wave, the more global warming will have increased its likelihood. This may seem counterintuitive because we have already seen that the greater the temperature anomaly, the less global warming contributed to its magnitude. This seemingly paradoxical result is illustrated in the Figure below. Essentially, it takes a large contribution from natural variability to get a very hot heat wave. However, the hotter the heat wave, the more global warming will have increased its likelihood.

[Figure: Illustration of why global warming contributes a smaller fraction of the magnitude of more extreme heat waves, yet increases their likelihood by a larger factor.]

So which of the three headlines is correct?

All the headlines are technically justifiable; they are simply answering different questions. Headline A answers the question: "How much did global warming change the absolute probability of a +6°C (or warmer) heat wave?" Headline B answers the question: "What proportion of the +6°C anomaly itself is due to global warming?" And Headline C answers the question: "How much did global warming change the expected frequency of a +6°C (or warmer) heat wave?"

In my judgment, only Headline A is fundamentally misleading. Since extremes have small probabilities by definition, a large relative change in the probability of an extreme will seem small when it is expressed in terms of the absolute change in probability. Headline B and Headline C, on the other hand, quantify different pieces of information that can both be valuable when thinking about global warming’s role in a heat wave.

Footnotes 

  1. The most comprehensive scientific evaluation of this statement is presented in the IPCC’s 2013, Working Group I, Chapter 10.
  2. Emanuel, K. 2005. Increasing destructiveness of tropical cyclones over the past 30 years, Nature, 436, 686-688.
  3. Vecchi, G. A., B. J. Soden. 2007. Increased tropical Atlantic wind shear in model projections of global warming, Geophys. Res. Lett., 34, L08702, doi:10.1029/2006GL028905.
  4. Knutson, T. R., J. R. Sirutis, S. T. Garner, G. A. Vecchi, I. M. Held. 2008. Simulated reduction in Atlantic hurricane frequency under twenty-first-century warming conditions, Nature Geoscience, 1 359-364 doi:10.1038/ngeo202.
  5. Data from the Berkeley Earth Surface Temperature Dataset
  6. The temperature data used here are in degrees Celsius (°C). °C are 1.8 times larger than °F so a temperature anomaly of 6°C would be 1.8×6 = 10.8°F.
  7. The global warming signal is more technically referred to as the “externally forced component of temperature change”. This is the portion of temperature change that is imposed on the ocean-atmosphere-land system from the outside and it includes contributions from anthropogenic increases in greenhouse gasses, aerosols, and land-use change as well as changes in solar radiation and volcanic aerosols.
  8. Climate model output is the multi-model mean for Durham, NC from 27 models that participated in the CMIP5 Historical Experiment
  9. The technical terms for this type of variability are “unforced” or “internal” variability. This is the type of variability that spontaneously emerges from complex interactions between ocean, atmosphere and land surface and requires no explicit external cause.
  10. There is precedent for thinking of surface temperature anomalies as being normally distributed (e.g., Hansen et al., 2012). However, it should be noted that the specific quantitative results, though not the qualitative point, of this article are sensitive to the type of distribution assumed. In particular, a more thorough analysis would pay close attention to the kurtosis of the distribution (i.e., the ‘fatness’ of the distribution’s tails) and would perhaps model it with a Generalized Pareto Distribution, as is done in Otto et al., 2012, for example. Also, instead of fitting a predefined probability distribution to the data, many stochastic simulations of temperature anomalies from a noise time series model or a physics-based climate model could be used to assess the likelihood of an extreme event (Otto et al., 2012).
  • Hansen, J., M. Sato., R. Ruedy, 2012, Perception of climate change, PNAS, vol. 109 no. 37 doi: 10.1073/pnas.1205276109.
  • Otto, F. E. L., Massey, G. J. vanOldenborgh, R. G. Jones, and M. R. Allen, 2012, Reconciling two approaches to attribution of the 2010 Russian heat wave, Geophys. Res. Lett., 39, L04702, doi:10.1029/2011GL050422.
  11. For simplicity I assume that the variance of the distribution does not change over time and that global warming has only shifted the mean of the distribution.
  12. Return Times were calculated as the inverse of the Survival Function for each of the distributions.

AGU Poster: Unforced Surface Air Temperature Anomalies and their Opposite Relationship with the TOA Energy Imbalance at Local and Global Scales



Response to Robert Tracinski’s article: “What It Would Take to Prove Global Warming”

Libertarian writer Robert Tracinski recently wrote an article called “What It Would Take to Prove Global Warming” in which he challenged mainstream climate science on a number of issues. I was asked by a few people to give my thoughts on the article, so I have written an informal response to several of the article’s claims below.

Picking up where the article gets substantive…

When I refer to “global warming,” and when Bailey and Adler refer to it, that term is a stand-in, not just for the trivial claim that average global temperatures are rising, but for “catastrophic anthropogenic global warming”: i.e., global temperatures are rising, it’s our fault, and we’re all gonna die.

Response: Tracinski starts off by creating a straw man argument that is easy for him to defeat. Serious scientists/policy experts do not tout the claim that “we’re all gonna die”. The important question is not whether or not “we’re all going to die”; the important question is whether or not it would be a net benefit to society and the environment if we regulate/reduce greenhouse gas emissions.

I’ve gone on record a long time ago sketching out what stages would be required to demonstrate that humans are causing rising global temperatures, never mind the much more dubious proposition that warmer weather is going to be a catastrophe. Let me elaborate on it here.

There are three main requirements.

1) A clear understanding of the temperature record.

The warmists don’t just have to show that temperatures are getting warmer, because variation is normal. That’s what makes “climate change” such an appallingly stupid euphemism. The climate is always changing. The environmentalists are the real climate-change “deniers” because they basically want global temperatures to maintain absolute stasis relative to 1970—not coincidentally the point at which environmentalists first began paying any attention to the issue.

Response: It may be generally true that “variation is normal” but the rate of warming that we have observed over the past century has been demonstrated to be outside the range of natural variability. Most of the natural climate changes that Earth has experienced in the past have occurred at rates much slower than the climate change we are currently experiencing. For example, it took 10,000 years for the earth to warm 9 degrees Fahrenheit when we came out of the last ice age. If humans decide to burn all remaining fossil fuels, we are looking at a similar magnitude of warming over 200-300 years instead of 10,000. It is the rate of climate change, not necessarily the magnitude, that has people most concerned.

The bottom line is that climate does change with or without human actions but science has demonstrated that humans are the dominant cause of warming over the past century and this warming is occurring at a rate that is much faster than previous natural climate changes.

So to demonstrate human-caused global warming, we would have to have a long-term temperature record that allows us to isolate what the normal baseline is, so we know what natural variation looks like and we can identify any un-natural, man-made effect. A big part of the problem is that we only have accurate global thermometer measurements going back 135 years—a blink of an eye on the time-scales that are relevant to determining natural variation of temperature. Within that, we only have a few decades of warming that could conceivably be blamed on human emissions of carbon dioxide: a minor run up in temperatures from the 1970s to the late 1990s.

Response: Coincidentally, I just published a study that looks at natural variability in temperature over the past 1000 years and compares that to the magnitude of temperature change that we have experienced recently. In order to get estimates of variability of the past 1000 years you need temperature proxies – things in the environment that co-vary with temperature like ice layer thickness, tree ring width, ocean sediment chemistry, etc. There is definitely a great deal of uncertainty associated with estimating temperature from 1000 years ago, but all the available evidence suggests that the warming we have experienced over the past century is larger than what could be expected from natural variation.

Since then, warming has leveled off (despite strenuous attempts to pretend otherwise). I think it’s impossible to claim, on that basis, that we even know what natural temperature variation is, much less to demonstrate that we’ve deviated from it.

Response: 2014 was the warmest year in the instrumental record (see graph below). The rate of global warming varies from decade to decade because of a lot of factors but clearly the long-term trend is up.

[Figure: NASA GISTEMP global annual mean temperature record.]

Various environmentalist attempts to create a “hockey stick” that makes current temperatures look abnormal have been embarrassing failures, involving problems like an improper mixing of recent thermometer measurements with less accurate “proxy” measurements that estimate temperatures farther into the past. And they prove my point about warmists being believers in climate stasis. The hockey stick graphs all assume that global temperature have been basically flat for 2,000 or 10,000 years, so that minor recent warming looks like a radical departure. Who’s really denying climate change?

Response: Many groups of professional scientists (who may or may not be “environmentalists”) have used different pieces of evidence to estimate how temperatures have changed over the past millennium (see graph below). It is absolutely false to say that these temperature estimates “assume that global temperatures have been basically flat”. These studies do not assume anything about how temperatures varied in the past. Any ‘flatness’ that these graphs show is a result, not an assumption, of the studies. Also, all of these studies show that the warming of the 20th century is abnormal compared to natural variability.

[Figure: Reconstructions of temperature over the past millennium from the IPCC.]

And if you look at temperatures on the really big scale, we’re all just playing for time until the next ice age comes.

Response: Human-caused increases in greenhouse gasses already guarantee that we will not be headed into an ice age any time in the next few millennia.

Assuming we can eventually compile a temperature record that is long enough and reliable enough to distinguish the effect of human activity from natural variation, we would also have to understand how human beings are causing this effect. Which leads us to the second big requirement.

Response: We already have temperature records long enough and reliable enough to distinguish the effect of human activity. Tracinski just doesn’t like the implications of these records so he assumes they must be wrong.

2) A full understanding of the underlying physical mechanisms.

We have to know what physical mechanisms determine global temperatures and how they interact. The glibbest thing said by environmentalists—and proof that the person who says it has no understanding of science—is that human-caused global warming is “basic physics” because we know carbon dioxide is a greenhouse gas. Carbon dioxide is a very weak greenhouse gas and there is no theory that claims it can cause runaway warming all on its own.

Response: No scientist that I know of claims that CO2 causes ‘runaway’ warming. However, basic physics does tell us that carbon dioxide is a greenhouse gas and that, all else being equal, increasing CO2 will increase temperature. CO2 may be a relatively weak greenhouse gas compared to some others, but humans are putting so much of it into the atmosphere (about 40 billion metric tons per year) that it still has a large effect.

The warmists’ theory requires feedback mechanisms that amplify the effect of carbon dioxide. Without that, there is no human-caused global warming. But those feedback mechanisms are dubious, unproven assumptions.

Response: There is still human-caused global warming with or without “feedback mechanisms”. However, it is true that feedbacks amplify the impact of increasing CO2. For example, if you increase CO2, you warm the planet. This causes ice to melt, which causes the planet to reflect less solar energy back to space. This results in more solar energy being absorbed, which causes further warming. Feedback mechanisms have been studied extensively and we know a lot about them. They certainly are not “dubious, unproven assumptions”.

Basic questions about the “sensitivity” of the climate to carbon dioxide have never been answered. Even Bailey admits this.

In recent years, there has [been] a lot of back and forth between researchers trying to refine their estimates of climate sensitivity. At the low end, some researchers think that temperatures would increase a comparatively trivial 1.5 degrees Celsius; on the high end, some worry it could go as high as high 6 degrees Celsius…. In a 2014 article in Geophysical Research Letters, a group of researchers calculated that it would take another 20 years of temperature observations for us to be confident that climate sensitivity is on the low end and more than 50 years of data to confirm the high end of the projections.

Response: It is true that there is a wide range of possibilities for how much warming we expect for a given amount of CO2 increase.

Well, fine then. Is it okay if we wait? (No, it isn’t, and I’ll get to the implications of that in a few moments.)

And this leaves out the possibility that the climate’s sensitivity to carbon dioxide is even lower, that other mechanisms such as cloud-formation might serve to dampen temperature increases.

Response: This possibility is “left out” because there is no evidence that it is a real possibility worth considering.

Recently, I was amused at news that new science is debunking the “low sodium” diet fad of the past few decades. It turns out that “the low levels of salt recommended by the government might actually be dangerous” (which is not so amusing). This seems like a timely warning. Like the human body, the global climate is a hugely complicated system with a lot of factors that interact. We’re not even close to understanding it all, and having the government jump in and pick sides risks cementing a premature “consensus.”

Response: I agree that the climate is extremely complex and there is always the potential for surprises. But there is a difference between not knowing everything and knowing nothing. The vast majority of scientists who study this issue would say that we know enough about climate to be confident that humans are currently causing warming that is above and beyond natural climate variability.

The immense, untamed complexity of the climate is reflected in the poor performance of computerized climate models, which leads us to our last major hurdle in proving the theory of global warming.

3) The ability to make forecasting models with a track record of accurate predictions over the very long term.

We don’t know whether current warming departs from natural variation, nor have scientists proven the underlying mechanisms by which humans could cause such an increase.

Response: Just to reiterate the points above, we do know that the warming over the past century departs from what would have happened without human greenhouse gas inputs and scientists absolutely understand the underlying physical mechanisms.

But even if we did know these things, we would have to be able to forecast with reasonable accuracy how big the effect is going to be. A very small warming may not even be noticeable or may have mostly salutary effects, such as a slightly longer growing season, whereas the impact of a much larger warming is likely to cause greater disruption.

I should also point out that the “catastrophic” part of “catastrophic anthropogenic global warming” is a much larger question that is even harder to forecast. For example, global warming was supposed to lead to more hurricanes, which is why movie posters for Al Gore’s An Inconvenient Truth featured a hurricane emerging from an industrial smokestack. Then hurricane activity in the Atlantic promptly receded to historical lows.

Response: I essentially agree with this point. The science on the connection between human-caused climate change and any change in hurricanes is fiercely debated in the scientific community and I believe it was scientifically irresponsible for An Inconvenient Truth to use the hurricane in its logo.

It’s pretty clear that scientists aren’t any good yet at making global climate forecasts. Current temperatures are at or below the low range of all of the climate models. Nobody predicted the recent 17-year-long temperature plateau. And while they can come up with ad hoc explanations after the fact for why the data don’t match their models, the whole point of a forecast is to be able to get the right answer before the data comes in.

Response: On the decade-to-decade timescale there are a lot of factors other than CO2 that affect the global temperature. That means that CO2 can go up for a couple of decades while global temperatures remain flat. It also means that CO2 could remain flat for a couple of decades while global temperatures warm. However, on longer timescales (e.g., 100 years) CO2 has a much more dominant impact on global temperature. It is this long timescale that is most important for climate policy.

Given the abysmal record of climate forecasting, we should tell the warmists to go back and make a new set of predictions, then come back to us in 20 or 30 years and tell us how these predictions panned out. Then we’ll talk.

Response: I assume that Tracinski is against climate change mitigation policy because he fears that regulating greenhouse gas emissions will negatively impact the economy. If that is the case, then it would be fair to point out that economic models have a poor track record of correctly forecasting the future. Therefore, I think Tracinski has created an unfair double standard. He demands that climate models have the ability to perfectly forecast the future while simultaneously giving a pass to economic models.

The bottom line is that the future is hard to predict. The precise amount of warming that we will get for a given change in CO2 is hard to predict, and the economic impacts of climate change mitigation policies are hard to predict. Any serious appraisal of a CO2 mitigation policy would take into account the uncertainty of both and would not unfairly pretend that one side of the balance has no uncertainty.

Ah, but we’re not going to be allowed to wait. And that’s one of the things that is deeply unscientific about the global warming hysteria. The climate is a subject which, by its nature, requires detailed study of events that take many decades to unfold. It is a field in which the only way to gain knowledge is through extreme patience: gather painstaking, accurate data over a period of centuries, chug away at making predictions, figure out 20 years later that they failed, try to discover why they failed, then start over with a new set of predictions and wait another 20 years. It’s the kind of field where a conscientious professional plugs away so maybe in some future century those who follow after him will finally be able to figure it all out.

Response: The science of climatology has already been going through this process for a long time. The greenhouse effect was discovered in the 1820s, the first scientific paper written about CO2’s impact on global temperature came out in 1896, and the first major assessment report on human-caused climate change came out in 1979.

Yet this is the field that has suddenly been imbued with the Fierce Urgency of Now. We have to know now what the climate will do over the next 100 years, we have to decide now, we have to act now. So every rule of good science gets trampled down in the stampede. Which also explains the partisan gap on this issue, because we all know which side of the political debate stands to benefit from the stampede. And it’s not the right.

Response: The reason people feel that this issue is so urgent is that in order to stabilize global temperature (stabilize, not bring temperatures back down to where they were) we would need to reduce CO2 emissions by ~80% from their current level. However, CO2 emissions are still growing. All reasonable policy/economic outlooks say that it would take decades to stabilize CO2 emissions and many more decades to bring CO2 emissions down to a level where global temperatures would stop rising. Because of these huge time lags, avoiding large global warming in the coming centuries requires that we begin to reduce greenhouse gas emissions now.


Climate Model Primer

1) What is a climate model?

In order to correctly interpret climate model output it is important to first understand what a computer climate model is. Climate models are software that simulate the whole earth system by using our knowledge of physics at smaller spatial scales. Climate models break the entire earth into three-dimensional cubes (or grid boxes) and thousands of physical equations are used to simulate the state of each grid box as a function of time (Figure 1).

Because climate models are based on physics (as opposed to the statistics of past events), they can be used to make predictions that have no analogs in the historical record. For example, you could ask the question: “What would India’s climate be like if Africa didn’t exist?” Obviously you can’t use historical statistics to answer such a question but a climate model can make a reasonable prediction based on physics. Similarly, when a climate model projects the climate of 2100 under enhanced greenhouse gas concentrations, it is basing this projection on physics and not any historical analog.


Figure 1. Illustration of the three-dimensional grid of a climate model (from Ruddiman, 2000)

2) Future Climate Change Projections Based on Greenhouse Gas Scenarios

 Climate models are the primary tools used to project changes in the earth system under enhanced greenhouse gas concentrations. Climate models take information about greenhouse gas emissions or concentrations as an input and then simulate the earth system based on these numbers.

3) How greenhouse gas scenarios are input into the models

For CO2, a climate model that features a fully coupled “carbon cycle model” will only require anthropogenic emissions of CO2 as an input and it will predict the atmospheric concentrations of CO2. If a climate model does not have an embedded carbon cycle model, it will require atmospheric concentrations of CO2 as an input variable. The emissions and concentrations that are used as inputs to the model can be whatever the user desires.

4) Representation of greenhouse gasses in climate models

The most consequential effect of CO2 in a climate model is that it interacts differently with incoming shortwave radiation than it does with outgoing longwave radiation (this is essentially the definition of a greenhouse gas). The climate model keeps track of radiation in every atmospheric grid box (Figure 1) and at every time step. This is the primary way in which a change in greenhouse gasses will affect the climate of the model. As an example, if you double CO2 concentrations in a climate model, that model might predict that average rainfall intensity will increase over some arbitrary region. However, the only direct effect of changing CO2 concentration was that it changed outgoing longwave radiation. Any other change that the model simulates, e.g., the change in rainfall intensity, was secondary and came about due to a series of physical links. These links are not necessarily obvious and many scientists spend their careers trying to figure them out.
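As a toy illustration of this point (a zero-dimensional sketch, nothing like the three-dimensional radiation code of an actual climate model), treating a stronger greenhouse effect as a slightly lower effective longwave emissivity is enough to shift the equilibrium temperature, even though nothing on the shortwave side was touched:

```python
# Toy zero-dimensional energy balance (an illustrative sketch, not how a real
# climate model represents CO2): a stronger greenhouse effect is treated here
# as a lower effective longwave emissivity, and only the longwave side changes.
sigma = 5.67e-8   # Stefan-Boltzmann constant (W m^-2 K^-4)
S = 1361.0        # incoming solar irradiance (W m^-2)
albedo = 0.3      # planetary albedo (shortwave side, left untouched)

def equilibrium_temperature(emissivity):
    """Temperature at which absorbed shortwave equals emitted longwave."""
    absorbed = (1 - albedo) * S / 4
    return (absorbed / (emissivity * sigma)) ** 0.25

print(f"weaker greenhouse effect:   {equilibrium_temperature(0.62):.1f} K")
print(f"stronger greenhouse effect: {equilibrium_temperature(0.60):.1f} K")
```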

5) Weather vs. Climate.

Climate is often colloquially defined as “average weather”. Therefore, in order to know the climate at a given location, you generally need weather records over a long period of time (usually 30 years or more). Weather is random and difficult to predict (it is nearly impossible to know if it will rain in New York on July 17th 2015). Climate, on the other hand, is relatively predictable (New York should expect about 120 mm of rain per July on average).

It is important to understand that climate models do not actually simulate climate. Instead, climate models simulate weather. For example, if you run a climate model from the year 2000 to the year 2100 (incorporating an estimated increase in greenhouse gas concentrations) the climate model will simulate the weather over the entire surface of the earth and output its calculations every three hours or so for 100 years. Therefore, the climate model will simulate the weather in New York on July 17th 2088. However, because weather is so unpredictable, nobody should take this “weather forecast” seriously. What may be taken seriously, however, are the long-term trends in July precipitation in New York over the entire course of the simulation.

6) How to interpret climate model output

Imagine that we have climate model output for some climate variable (at a given location) over the next 100 years under a scenario of increasing greenhouse gas concentrations (Figure 2A). We notice that the variable increases and peaks during “Time period A” and then decreases until “Time period B”. We want to know if this pattern is part of the climate change signal (i.e., did the increase in greenhouse gasses cause this pattern?) or if it is just the result of random weather noise. The only way to find out is to run the climate model again (allowing it to simulate different random weather) and see if the same pattern shows up again.

Figure 2B shows output from a 2nd climate model run which incorporated the same increase in greenhouse gas concentrations as the 1st. We see that, in this case, the variable is higher over “Time period B” than it was over “Time period A”. This indicates that the pattern we saw in the first run may have just been due to random weather noise and not a part of the climate change signal that we are most interested in. To be sure, we can look at a large number of different climate model runs that incorporate the same increase in greenhouse gas concentrations (Figure 2C). It becomes apparent that the common attribute between the runs is a long-term upward trend. If we have enough runs we can average them together to get an estimate of what the true climate change signal is (Figure 2D). Climate is the average weather so averaging over all of the models (taking the “multi-model-mean”) should leave only the portion of the change that is due to the increased greenhouse gas concentrations.


Figure 2. Illustration of the difference between the random weather produced by a single climate model run and the climate change signal that can only be seen when multiple climate model runs are averaged together.
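The averaging idea illustrated in Figure 2 can also be sketched with synthetic data (illustrative only; not actual climate model output): many "runs" share one forced trend but carry independent weather noise, and the multi-run mean recovers the trend far better than any single run:

```python
# Sketch of the ensemble-averaging idea in Figure 2 using synthetic "runs"
# (a common forced trend plus independent weather noise); illustrative only.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(100)
forced_signal = 0.02 * years                 # the common climate change signal (K)

n_runs = 30
weather_noise = rng.normal(0.0, 0.3, size=(n_runs, years.size))
runs = forced_signal + weather_noise         # each row is one synthetic model run

single_run_error = np.abs(runs[0] - forced_signal).mean()
ensemble_error = np.abs(runs.mean(axis=0) - forced_signal).mean()
print(f"mean error of a single run:       {single_run_error:.3f} K")
print(f"mean error of the 30-run average: {ensemble_error:.3f} K")
```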

7) Comparing the output of two different scenarios with different greenhouse increases

Figure 3A illustrates a hypothetical comparison of climate model output under two different greenhouse gas emissions scenarios. It would be difficult to interpret the results if we only had a single climate model run from each scenario because we might happen to see a progression that is not representative of the underlying climate change signal. For example, it would be possible to see a green run that showed a larger increase in the variable than a blue run even though their respective climate change signals indicate the reverse. In order to compare the climate change signals between the two scenarios we need to average over a number of runs (Figure 3B). It is then possible to use the average and the spread about the average to test whether or not the difference in the emissions scenarios had a meaningful effect on the output variable at some time in the future.


Figure 3. Illustration of climate model output for two different emissions scenarios (blue and green) 

8) Issues to keep in mind when interpreting climate model output

  1. Signal to noise ratio may be very small

For many variables, the underlying climate change signal (thick lines in Figures 2 and 3B) may change only a tiny amount relative to the size of the random weather noise. This means that when we compare the multi-model-mean from one greenhouse gas emissions scenario to another, the difference we see may be due almost entirely to random weather noise instead of being due to some climate change signal.

  2. Climate change in the real world will not follow the smooth signal

There is random weather in the real climate system, just like there is in the individual climate model runs. So the true evolution of a variable in the future will not look like the smooth signal. Instead it will look like one of the individual erratic runs. This means that there is a very wide range of values that could be plausible for any given time in the future.
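The same kind of synthetic ensemble (arbitrary units) shows how wide that plausible range can be: the smooth signal gives a single number for, say, 2050, while the individual runs span a broad interval around it.

```python
# Toy example: the spread of individual runs around the smooth signal in one future year.
import numpy as np

rng = np.random.default_rng(4)
years = np.arange(2000, 2100)
runs = 0.02 * (years - 2000) + rng.normal(0.0, 0.5, size=(100, years.size))

values_2050 = runs[:, years == 2050].ravel()
print("Smooth signal in 2050:          ", 0.02 * 50)
print("5th-95th percentile across runs:", np.percentile(values_2050, [5, 95]))
```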

  3. It is difficult to validate climate models

Weather is unpredictable, and thus we do not expect climate models to reproduce the random weather that has occurred historically. Instead, we expect climate models to reproduce the underlying climate change signal that has occurred historically. Unfortunately, our observations over the past 50 to 100 years contain both random weather and a climate change signal, and we do not know precisely how much of the observed change belongs to each.

For example, let's say that we observe an increase in drought at some location over the past 20 years. Then we run a set of climate models with historical increases in greenhouse gas concentrations and we see that the models do not suggest that there should have been an increase in drought in this location. This could mean one of two things: 1) the models are deficient and need to be improved before they can simulate this increase in drought, or 2) the models are correct and the observed increase in drought was just due to random weather. It is very difficult to know which is the case, and that makes it very difficult to assess how well the models are doing.

Since it is so hard to validate the climate models, it is difficult to know how confident we should be in their projections. Nevertheless their output is taken seriously because it probably represents our best quantitative estimate of the future based on our current knowledge of the physical workings of the climate system.

  4. Climate models are better at simulating larger spatial/temporal scales than smaller spatial/temporal scales

One problem with climate models is that their spatial and temporal resolution is too coarse to simulate many aspects of the climate that are relevant to society. For example, tornadoes are much smaller than the spatial grid (Figure 1) and far more short-lived than the time step (~3 hours) of a climate model, so they cannot be simulated explicitly. Even floods tend to be caused by thunderstorms that are smaller than the typical model's resolution, and thus it is difficult for models to correctly simulate the statistics of flooding events.
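As a back-of-the-envelope comparison (the 100 km grid spacing below is an assumed "typical" value for a global model, not a number taken from any particular model), the mismatch in scale is easy to quantify:

```python
# Rough scale comparison between model resolution and small-scale weather phenomena.
grid_spacing_km = 100.0        # assumed typical global-model grid spacing
time_step_hours = 3.0          # time step mentioned above
tornado_width_km = 0.2         # rough width of a tornado
thunderstorm_width_km = 10.0   # rough width of a single thunderstorm
tornado_lifetime_hours = 0.2   # rough tornado lifetime (~10 minutes)

print("Tornado width / grid cell width:     ", tornado_width_km / grid_spacing_km)
print("Thunderstorm width / grid cell width:", thunderstorm_width_km / grid_spacing_km)
print("Tornado lifetime / model time step:  ", tornado_lifetime_hours / time_step_hours)
```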

Posted in Uncategorized | Leave a comment

Global Warming and unforced variability: Clarifications on our recent study


This post appeared at Real Climate

We recently published a study in Scientific Reports titled “Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise”. Our study seems to have generated a lot of interest, and we have received many inquiries regarding its findings. We were pleased with some of the coverage of our study (e.g., here), but we were disappointed that some outlets published particularly misleading articles (e.g., here, here, and here). Because there appears to be some confusion regarding our study’s findings, we would like to use this forum to clarify some points.

Our study is mainly about natural unforced (i.e., internally generated) variability in global mean surface temperature. This is the type of variability that comes from natural interactions between the ocean and the atmosphere (i.e., that due to phenomena like the El-Nino/Southern Oscillation or perhaps the Atlantic Multidecadal Oscillation). This is in contrast to externally forced variability in global mean surface temperature, which arises from changes in atmospheric greenhouse gasses, aerosols, solar irradiance, etc. Most previous estimates of the magnitude of unforced variability have come from physical climate models. In our study we created an alternative statistical estimate of unforced variability that was derived from reconstructed and instrumental surface temperature records. We then used this new estimate of unforced variability to aid in our interpretation of observed global mean temperature variability since 1900.
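For readers who want a feel for the general idea, here is a highly simplified sketch (this is NOT the procedure used in the paper, which works with reconstructed and instrumental records and a much more careful statistical treatment): if you have an independent estimate of the forced signal, the residual left after subtracting it from a temperature series is a crude stand-in for unforced variability.

```python
# Crude illustration only: residual = observed series minus an estimate of the forced signal.
import numpy as np

def unforced_residual(temps, forced_estimate):
    """Subtract an independently estimated forced signal from a temperature series."""
    return np.asarray(temps) - np.asarray(forced_estimate)

# Hypothetical annual global-mean temperature anomalies and a hypothetical forced signal.
temps = np.array([0.00, 0.05, -0.02, 0.10, 0.18, 0.12, 0.25])
forced = np.linspace(0.0, 0.24, temps.size)

residual = unforced_residual(temps, forced)
print("Rough magnitude of unforced variability:", residual.std(ddof=1))
```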

We found that unforced variability is large enough so that it could have accounted for multidecadal changes in the rate-of-increase of global average surface temperature over the 20th century. However, our estimate of unforced variability was NOT large enough to account for the total warming observed over the 20th century. Therefore, our results confirm that positive radiative forcings (e.g., from human-caused increases in greenhouse gas concentrations) are necessary in order for the Earth to have warmed as much as it did over the 20th century.

We also found that over the most recent decade-or-so, it is unlikely that the underlying global warming signal (the externally forced component of temperature change) has been increasing at a rate characteristic of the worst-case IPCC emissions scenario (more on this below).

This last finding is what generated most of the attention for our article, but it appears that the finding has been largely misinterpreted. Part of the confusion stems from the Duke University press release, which used the headline “Global Warming More Moderate Than Worst-Case Models”. The news department created this headline as a replacement for our suggested headline of “Global Warming Progressing at Moderate Rate, Empirical Data Suggest”. The news department wanted a shorter headline that was easier for the public to understand. Unfortunately, the simplification led many to believe that our study made a forecast for moderate global warming in the future, when in fact our conclusion only applied to the recent past.

Below are some clarifications on specific questions that we have received:

Question: What does your study conclude about Climate Sensitivity (i.e., how much warming we expect for a given change in greenhouse gasses)?

Answer: Nothing. Our study was not concerned with assessing Climate Sensitivity and we have no particular reason to doubt the assessments of Climate Sensitivity from the IPCC.

Question: Does your study show that the climate models used by the IPCC are useless?

Answer: No. Results from our previous study indicated that climate models may underestimate the magnitude of unforced variability on decadal and longer timescales, and our new estimate of unforced variability largely supports this conclusion. However, our new statistical estimate of unforced variability is not radically different from that simulated by climate models, and for the most part we find that climate models seem to get the ‘big picture’ correct.

Question: Does your study indicate that the warming from the 1970s to the present may be natural rather than human caused?

Answer: Our study is not explicitly an attribution study and we do not attempt to quantify the anthropogenic contribution to warming over the past ~40 years. However, we were interested in how much influence unforced variability might have had on changes in the rate of warming over the instrumental period. Specifically, we wanted to know if the real climate system is more like panel-a or panel-b below.

Figure: schematic comparing panel-a (small unforced variability) and panel-b (large unforced variability).

In panel-a, the magnitude of unforced variability is small (represented by the narrow range between the blue lines), so changes in the multidecadal rate of warming would necessarily be due to corresponding changes in the externally forced component of warming. In panel-b, the magnitude of unforced variability is large (the wide range between the blue lines), so changes in the multidecadal rate of warming could come about due to unforced variability.

The results of our study indicate that multidecadal changes in the rate of global warming can indeed come about due to unforced variability, and thus the climate system may be more like panel-b than panel-a. This means that the accelerated warming over the last quarter of the 20th century does not, ipso facto, require an acceleration in the forced component of warming. Instead, this accelerated warming could have come about due to a combination of anthropogenic forcing and unforced variability. This interpretation of the temperature record is consistent with the results of several recent studies.
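A toy example of the panel-b situation (synthetic series, arbitrary units, with an assumed AR(1) model for the unforced noise): even when the forced warming rate is perfectly steady, persistent unforced variability can make 25-year warming rates speed up and slow down.

```python
# Toy example: constant forced rate + persistent (AR(1)) unforced noise
# still produces a wide range of 25-year warming rates.
import numpy as np

rng = np.random.default_rng(5)
n_years = 120
forced = 0.01 * np.arange(n_years)            # steady forced warming rate
noise = np.zeros(n_years)
for t in range(1, n_years):                   # simple AR(1) unforced variability
    noise[t] = 0.7 * noise[t - 1] + rng.normal(0.0, 0.1)
temps = forced + noise

window = 25
trends = [np.polyfit(np.arange(window), temps[i:i + window], 1)[0]
          for i in range(n_years - window)]
print("Forced rate:", 0.01)
print("Range of 25-year trends:", min(trends), max(trends))
```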

Question: Does your study rule out the rate of warming associated with the IPCC’s RCP 8.5 emissions scenario?

Answer: No. We used the multi-model mean warming associated with the RCP 8.5 emissions scenario (out to 2050) as a representation of the quickest rate of forced warming that could conceivably be occurring currently. We then asked the question “how likely is it that the forced component of global warming has been this steep, given recent temperature trends?” We found that an 11-year warming hiatus (2002-2013 in GISTEMP) would have been very unlikely if the underlying forced warming signal were progressing at a rate characteristic of the RCP 8.5 scenario. Since the mean radiative forcing progression in RCP 8.5 is likely steeper than the radiative forcing progression of the recent past, this finding cannot be used to suggest that models are overestimating the response to forcings, and it cannot be used to infer anything about future rates of warming.
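For intuition only, here is a highly simplified Monte Carlo sketch of that type of question (this is not the statistical procedure used in the paper; the forced rate and the noise parameters are assumptions chosen purely for illustration): generate many synthetic 11-year periods in which the forced signal rises at an RCP 8.5-like rate on top of persistent unforced noise, and count how often the overall trend is flat or negative.

```python
# Simplified Monte Carlo illustration; not the paper's actual method.
import numpy as np

rng = np.random.default_rng(6)
forced_rate = 0.025          # assumed RCP 8.5-like forced warming rate (deg C / yr)
n_years, n_trials = 11, 20_000
hiatus_count = 0

for _ in range(n_trials):
    noise = np.zeros(n_years)
    for t in range(1, n_years):                        # AR(1) unforced variability
        noise[t] = 0.6 * noise[t - 1] + rng.normal(0.0, 0.1)
    series = forced_rate * np.arange(n_years) + noise
    trend = np.polyfit(np.arange(n_years), series, 1)[0]
    if trend <= 0:                                     # flat or cooling 11-year trend
        hiatus_count += 1

print("Fraction of trials with an 11-year hiatus:", hiatus_count / n_trials)
```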

We would invite all interested people to read the full paper (it is Open Access) for a more complete explanation and discussion of this complicated subject.

Posted in Uncategorized | Leave a comment