Why do climate models disagree on the size of global temperature variability?

We have published a new paper titled “Spread in the magnitude of climate model interdecadal global temperature variability traced to disagreements over high-latitude oceans”. Here is a brief summary:

Natural unforced variability in global mean surface air temperature (GMST) is of the same order of magnitude as current externally forced changes in GMST on decadal timescales. Thus, understanding the precise magnitude of unforced GMST variability is relevant both for the attribution of past climate changes to human causes and for the prediction of climate change on policy-relevant timescales.

Climate models could be useful for estimating the true magnitude of unforced GMST variability provided that they more-or-less converge on the same answer. Unfortunately, current models show substantial disagreement on the magnitude of natural GMST variability, highlighting a key uncertainty in contemporary climate science. This large model spread must be narrowed in the future if we are to have confidence that models can be trusted to give useful insights on natural variability.

Since it is known that unforced GMST variability is heavily influenced by tropical Pacific surface temperatures, it might be tempting to suppose that the large inter-model spread in the simulated magnitude of GMST variability is due to model disagreement in the amount of simulated tropical Pacific variability. Perhaps surprisingly, our study shows that this is not the case and that the spread in the magnitude of model-simulated GMST variability is linked much more strongly to model disagreements over high-latitude oceans. Our findings suggest that improving the simulation of air-sea interaction in these high-latitude ocean regions could narrow the range of simulated GMST variability, advance our fundamental understanding of natural variability, and appreciably improve our ability to forecast global warming on policy-relevant timescales.


Video Summary of my PhD Dissertation

 


What do historical temperature records tell us about natural variability in global temperature?

I have published an article, written for a general audience, summarizing the results of our 2015 Scientific Reports study.


Cloud feedback necessary for a basin-scale AMO


We have recently published a study in Geophysical Research Letters titled “The necessity of cloud feedback for a basin-scale Atlantic Multidecadal Oscillation”.

The Atlantic Multidecadal Oscillation (AMO) – a basin-scale coherent oscillation of sea surface temperatures over the North Atlantic – is thought to be one of the climate system’s most important modes of natural variability, affecting everything from drought to hurricane activity to natural fluctuations in global temperature. Traditionally, the basin-scale AMO has been explained as a direct consequence of variability in the Atlantic Ocean’s meridional overturning circulation (AMOC). In contrast, our study identifies atmospheric processes, specifically cloud feedback, as a necessary component for the existence of a basin-scale AMO, thus amending the canonical view of the AMO as a signature directly and solely attributable to oceanic processes.


The stability of unforced global temperature – In plain English

We have newly published research that shows in detail why the earth’s temperature remains stable when it is not pushed by outside forcings. Below is a summary in plain English. For a more technical discussion see here.

  • The study is all about what climate does when it is not pushed by what we call external drivers
    • External drivers (or forcings) are things like changes in the amount of energy coming in from the sun or changes in the amount of greenhouse gasses in the atmosphere.
  • You might expect (and many people simply assume) that the climate should be stable when it is not pushed by these external drivers
    • What our study did was investigate this assumption in a lot of detail, and it turns out it’s not quite so simple
  • Why is it not so simple? Many locations on earth experience positive feedbacks between temperature and absorbed energy. For example, if some natural warming occurs where there is sea ice, some of that sea ice will melt; melting the sea ice causes more solar energy to be absorbed, which causes more warming and more melting. It turns out these types of positive feedbacks are at work all over the surface of the planet.
  • So the question then becomes: If the Earth gets warmer naturally due to something like an El Niño event, what’s stopping it from just continuing to warm? Can it cool itself back down? If so, how?
  • The study looks at this in detail and finds that some very interesting things are going on that allow the Earth to cool itself down after one of these unforced natural warming events:
    • It turns out that the earth naturally transports energy away from locations where there are positive feedbacks to locations where there are negative feedbacks.
    • Also, the atmosphere rearranges clouds and water vapor in a way that allows much more energy to escape than we would expect otherwise.
  • These things are scientifically interesting, but the bottom line that the general public should understand is this: the earth is able to cool itself down after an unforced natural warming event like an El Niño, so sustained warming over multiple decades to a century requires external drivers (or forcings) like the increase in greenhouse gasses. This undermines the popular skeptic idea that the climate just drifts randomly from warm to cold and back again over many decades to centuries in an unpredictable manner.

The stability of unforced global temperature – Technical Discussion


We have newly published research that has implications for why global mean surface air temperature (GMT) is stable in the absence of external radiative forcings.

One of the central differences between weather prediction and climate projection is that the former is considered to be an “initial value problem” and the latter is considered to be a “forced boundary condition problem” (1). This dichotomy implies that weather is subject to chaotic variability and thus is fundamentally unpredictable beyond several weeks, but climate can be projected into the future with some confidence as long as changes in the boundary conditions of the system are known (2). For GMT, the fundamental boundary conditions are the system’s basic radiative properties, i.e., the incoming solar radiation, planetary albedo, and the atmospheric infrared transmissivity governed by greenhouse gas concentrations (3).

In reality, however, the forced boundary condition paradigm is complicated by nonlinear, two-way, interactions within the climate system. In particular, planetary albedo and the greenhouse effect are themselves complex functions of GMT and its spatial distribution (4). Therefore, if GMT is to be projected to within some fairly narrow range for a given change in boundary conditions (i.e., an increase in the greenhouse effect), it must be the case that the climate system can damp any unforced (internally generated) GMT perturbations. This idea becomes clearer when GMT evolution is expressed as a perturbation away from an equilibrium value set by the boundary conditions (ΔT=GMT-GMTequilibrium). ΔT change can be expressed as the sum of forcings (F), feedbacks (λΔT) and heat fluxes between the upper ocean’s mixed layer and the ocean below the mixed layer (Q),

$$C\,\frac{d\Delta T}{dt} = F + \lambda \Delta T + Q \qquad [1]$$

In this formulation, F often represents external radiative forcings (e.g., changes in well-mixed greenhouse gasses, aerosol loading, incoming solar radiation, etc.), however, here we are concerned with the stability of ΔT in the absence of external forcings so F represents unforced energy imbalances at the top of the atmosphere (TOA) (5). C is the effective heat capacity of the land/atmosphere/ocean-mixed-layer system, λ is the feedback parameter (the reciprocal of the climate sensitivity parameter) and λΔT represents the radiative fast-feedbacks (positive downward) (6-11).

It is accepted that ΔT should be stable in the long run mostly because of the direct blackbody response of outgoing longwave radiation to ΔT change, which is often referred to as the Planck Response,

$$\lambda_{Planck} = -4\sigma T_{e}^{3} \qquad [2]$$

where Te is the effective radiating temperature of the Earth (≈255K) and σ is the Stefan-Boltzmann constant (12). The negative sign indicates increased energy loss by the climate system with warming. λPlanck is typically incorporated into λ in [1] as the reference sensitivity, e.g., λ = (1 − fa)λPlanck, where fa denotes the feedback factor sum of the fast-feedbacks in the system (i.e., water vapor, lapse rate, surface albedo, and cloud feedbacks) (7). Net positive fast-feedbacks imply fa > 0. Therefore, larger positive fast-feedbacks imply a less negative λ and a climate system that is less effective at damping ΔT anomalies. To make this idea more explicit, [1] can be discretized and rearranged to obtain,

$$\Delta T_{t+1} = \theta\,\Delta T_{t} + \varepsilon_{t} \qquad [3]$$

where

$$\theta = 1 + \frac{\lambda\,\Delta t}{C} \qquad [4]$$

and

$$\varepsilon_{t} = \frac{\left(F_{t} + Q_{t}\right)\Delta t}{C} \qquad [5]$$

Now ΔT evolution is explicitly represented as a first-order autoregressive function. In this form, θ is the autoregressive parameter that can be thought of as a quantitative representation of the restoring force, or stability, of ΔT. Thus it becomes evident that the relative magnitude of the fast-feedbacks in the system will play a central role in our ability to consider GMT as a forced boundary condition problem. In particular, when θ is positive but << 1, the system experiences a strong restoring force and ΔT is heavily damped. However, when the feedbacks in the climate system nearly overwhelm the Planck Response (fa → 1, θ → 1), the restoring force for ΔT disappears. With no restoring force, ΔT would be free to evolve in a chaotic and unpredictable manner comparable to Brownian motion or a “random walk” (13). In this case, GMT could be considered to be “intransitive” (14) and it might be better categorized as an initial value problem than as a forced boundary condition problem.
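
To make these relationships concrete, here is a minimal numerical sketch of equations [2]-[4] and of how the behavior of ΔT changes as the fast-feedback factor approaches 1. The feedback factors, heat capacity, time step, and noise amplitude are illustrative assumptions for this sketch only; they are not values from the paper or fits to model output.

```python
import numpy as np

# Sketch of equations [2]-[4]: the Planck Response, the net feedback parameter
# lambda = (1 - f_a) * lambda_Planck, and Delta-T evolving as the AR(1) process in [3].
# All parameter values below are illustrative assumptions, not from the paper.
SIGMA = 5.670e-8        # Stefan-Boltzmann constant (W m^-2 K^-4)
T_E = 255.0             # effective radiating temperature of the Earth (K)
C = 3.2e8               # assumed effective heat capacity (~75 m ocean mixed layer, J m^-2 K^-1)
DT_SECONDS = 3.156e7    # time step of one year (s)

lambda_planck = -4.0 * SIGMA * T_E**3          # equation [2]: ~ -3.8 W m^-2 K^-1

rng = np.random.default_rng(0)
eps = rng.normal(0.0, 0.1, 200)                # epsilon_t: unforced noise term (K)

for f_a in (0.3, 0.97):                        # weak vs. near-canceling fast-feedbacks
    lam = (1.0 - f_a) * lambda_planck          # net feedback parameter
    theta = 1.0 + lam * DT_SECONDS / C         # equation [4]
    dT = np.zeros(200)
    for t in range(1, 200):
        dT[t] = theta * dT[t - 1] + eps[t]     # equation [3]
    print(f"f_a = {f_a:.2f}, theta = {theta:.2f}, std(Delta-T) = {dT.std():.2f} K")
```

With a small feedback factor, θ is well below 1 and ΔT anomalies are heavily damped; as fa approaches 1, θ approaches 1 and ΔT wanders in large, slow excursions reminiscent of a random walk.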

Consequently, most of modern climate science rests critically on the notion that the Planck Response overwhelms positive radiative fast-feedbacks in the climate system. In our paper, however, we document that at the local level, positive fast-feedbacks actually overwhelm the Planck Response over most of the surface of the Earth. The objective of the paper was to investigate how this finding can be reconciled with an apparently stable GMT at the global spatial scale.

We resolved this apparent paradox by showing that an anomalously warm Earth tends to restore equilibrium in complex and previously unappreciated ways. Our study shows in detail why global temperature should be stable in the absence of external forcings and therefore why global temperature does not evolve chaotically in the long run. This work thus explains why large, sustained changes in global temperature require external radiative forcings like increases in greenhouse gas concentrations.

We focused our analysis on 27 Atmosphere-Ocean General Circulation Models (AOGCMs) from the Coupled Model Intercomparison Project – Phase 5 (CMIP5) (15). We utilized unforced preindustrial control runs which, by definition, include no external radiative forcings, so all variability emerges spontaneously from the internal dynamics of the modeled climate system. We used the first 200 years of each AOGCM’s preindustrial control run, and we linearly detrended all analyzed variables so that our analysis was not contaminated by possibly unphysical model drift resulting from insufficient model spin-up. Because this detrending procedure forced the AOGCM runs to be stable over the 200-year period, we are implicitly studying the restoring force for ΔT relative to any 200-year trend present in the control runs. Consequently, we are limited to studying the physical explanation for the stability of ΔT at timescales shorter than 200 years.
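
For reference, the detrending step amounts to removing a least-squares linear fit from each 200-year series, as in the minimal sketch below (the synthetic series is only a stand-in for actual CMIP5 control-run output):

```python
import numpy as np

# Sketch of the preprocessing described above: linearly detrend the first 200 years
# of a control-run GMT series so residual model drift does not contaminate the
# variability analysis. The synthetic series here stands in for CMIP5 output.
years = np.arange(200)
gmt = 0.001 * years + np.random.default_rng(1).normal(0.0, 0.1, 200)   # drift + noise (K)

slope, intercept = np.polyfit(years, gmt, 1)       # least-squares linear fit
gmt_detrended = gmt - (slope * years + intercept)  # anomalies relative to the 200-year trend
```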

For more information see the AGU poster or the paper:

Brown, P.T., W. Li, J.H. Jiang, H. Su (2016) Unforced surface air temperature variability and its contrasting relationship with the anomalous TOA energy flux at local and global spatial scales. Journal of Climate, doi:10.1175/JCLI-D-15-0384.1

References:

  1. Kirtman et al. (2013) Near-term Climate Change: Projections and Predictability. In: Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA).
  2. Hawkins E & Sutton R (2009) The Potential to Narrow Uncertainty in Regional Climate Predictions. Bulletin of the American Meteorological Society 90(8):1095-1107.
  3. Sagan C & Mullen G (1972) Earth and Mars: Evolution of Atmospheres and Surface Temperatures. Science 177(4043):52-56.
  4. Armour KC, Bitz CM, & Roe GH (2012) Time-Varying Climate Sensitivity from Regional Feedbacks. Journal of Climate 26(13):4518-4534.
  5. Brown PT, Li W, Li L, & Ming Y (2014) Top-of-Atmosphere Radiative Contribution to Unforced Decadal Global Temperature Variability in Climate Models. Geophysical Research Letters:2014GL060625.
  6. Wigley TML & Schlesinger ME (1985) Analytical solution for the effect of increasing CO2 on global mean temperature. Nature 315(6021):649-652.
  7. Baker MB & Roe GH (2009) The Shape of Things to Come: Why Is Climate Change So Predictable? Journal of Climate 22(17):4574-4589.
  8. Geoffroy O, et al. (2012) Transient Climate Response in a Two-Layer Energy-Balance Model. Part I: Analytical Solution and Parameter Calibration Using CMIP5 AOGCM Experiments. Journal of Climate 26(6):1841-1857.
  9. Held IM, et al. (2010) Probing the Fast and Slow Components of Global Warming by Returning Abruptly to Preindustrial Forcing. Journal of Climate 23(9):2418-2427.
  10. Wigley TML & Raper SCB (1990) Natural variability of the climate system and detection of the greenhouse effect. Nature 344(6264):324-327.
  11. Dickinson RE (1981) Convergence Rate and Stability of Ocean-Atmosphere Coupling Schemes with a Zero-Dimensional Climate Model. Journal of the Atmospheric Sciences 38(10):2112-2120.
  12. Hansen J, Lacis A, Rind D, Russell G, Stone P, et al. (1984) Climate sensitivity: Analysis of feedback mechanisms. In Climate Processes and Climate Sensitivity, eds. Hansen JE & Takahashi T, Geophys. Monogr. Ser. (Am. Geophys. Union, Washington, DC), pp 130-163.
  13. Hasselmann K (1976) Stochastic climate models Part I. Theory. Tellus 28(6):473-485.
  14. Lorenz E (1968) Climatic Determinism. Meteor. Monographs, Amer. Meteor. Soc. 25:1-3.
  15. Taylor KE, Stouffer RJ, & Meehl GA (2011) An Overview of CMIP5 and the Experiment Design. Bulletin of the American Meteorological Society 93(4):485-498.

2015 Record Warmth: Update to Our Recent Analysis

This is an update to our 2015 Scientific Reports paper: Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise. The paper used a novel statistical estimate of unforced variability that was derived from reconstructed and instrumental surface temperature records. We used our statistical estimate of unforced variability to aid in our interpretation of recently observed temperature variability (more info here).

Our paper used global temperature data through 2013 since that was the most recent year in the major global temperature datasets at the time that the paper was submitted. Below I update Figures 2 and 3 from the paper, incorporating the back-to-back record breaking warmth of 2014 and 2015.

Figure 2 updated to include 2014 and 2015.


Figure 3 updated to include 2014 and 2015.

The summary section of our paper stated:

We find that the interdecadal variability in the rate of global warming over the 20th century (i.e., acceleration from ~1910–1940, deceleration until ~1975, acceleration until ~2000) is within the 2.5–97.5% EUN, even if the forced signal is represented as a linear trend, indicating that this observed interdecadal variability in the rate of warming does not necessarily require interdecadal variability in the rate-of-increase of the forced signal.

This statement was about 20th century temperature and thus updates for 2014 and 2015 are somewhat irrelevant. Nevertheless, the updated Figure 2 (bottom left panel) indicates that recent warmth is just now starting to emerge from a linear-trend null hypothesis. This is not to say that a linear trend is the most likely representation of the forced component of variability – it just means that the linear trend forced component can’t quite be ruled out. This is now starting to change as observations move above the 97.5th percentile of the unforced range.

The summary section also stated:

We also find that recently observed GMT values, as well as trends, are near the lower bounds of the EUN for a forced signal corresponding to the RCP 8.5 emissions scenario but that observations are not inconsistent with a forced signal corresponding to the RCP 6.0 emissions scenario.

Note that we were not making a forecast about how likely the RCP 8.5 emissions scenario was. Instead, we were using the multi-model mean warming associated with the RCP 8.5 emissions scenario (out to 2050) as a representation of the quickest rate of forced warming that could conceivably be occurring over the recent past (see here and here for further clarification).

Figure 3 indicates that with the updated data, no trend over the past 25 years falls outside of the 5-95% range for any of the scenarios. The trends over the most recent ~5 years are higher than average for all the scenarios but still well within the range of unforced variability. Over the past 10-20 years, observed trends have been on the lower end of the RCP 8.5 range but closer to the middle of the RCP 6.0 range. This indicates that over the past 10-20 years it may be more likely that we have been on an RCP 6.0-like warming trajectory than an RCP 8.5-like warming trajectory. This is similar to the conclusion of the original study.
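
For readers who want to repeat the trend comparison with their own data, here is a minimal sketch of the trailing-trend calculation (the function name and inputs are placeholders; the RCP ranges themselves come from the model output summarized in Figure 3 and are not reproduced here):

```python
import numpy as np

def trailing_trends(years, gmst, lengths=(5, 10, 15, 20, 25)):
    """Least-squares GMST trends (K per decade) over the most recent N years.

    `years` and `gmst` are placeholders for an observed annual series (e.g., GISTEMP);
    judging the trends against the RCP ranges requires the model spread shown in
    Figure 3, which is not reproduced here.
    """
    years = np.asarray(years, dtype=float)
    gmst = np.asarray(gmst, dtype=float)
    trends = {}
    for n in lengths:
        slope = np.polyfit(years[-n:], gmst[-n:], 1)[0]   # K per year
        trends[n] = 10.0 * slope                          # convert to K per decade
    return trends
```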


2015 Global Temperature vs. Models

2015 was the warmest year in the instrumental record (dating back to the mid/late 19th century) in all the major surface temperature datasets, including NASA’s GISTEMP.

However, 2015 still falls below the CMIP5 climate model mean value (left panel below). The difference between observations and the mean value from climate models is often used as an estimate of the ‘unforced’ or ‘internal’ variability in global temperature (right panel below). It is apparent from this estimate that there was an unforced cooling event from ~1998 to ~2013. Thus the 2015 record temperature does not ‘erase’ the hiatus – it is totally legitimate to study why observations diverged from the model mean over this time period.

[Figure: Observed global temperature compared with the CMIP5 multi-model mean (left) and their difference, an estimate of unforced variability (right).]

Because of the ongoing El Niño event, 2016 will likely be even warmer than 2015, and thus 2016 may be above the climate model mean value for the first time since 1998. It will be very interesting to see what happens in 2017 and 2018. When neutral or La Niña conditions return, will observations keep up with the steep rate of warming predicted by climate models?


Heat waves: How much can be blamed on global warming depends on how you ask the question.

It is well established that human-caused increases in greenhouse gasses are working to increase the average surface temperature of the planet on long timescales (1). This fact, however, means very little in terms of the consequences that climate change might have on human society. People are affected far more by local weather extremes than by any change in global average temperature. Therefore, the connection between extreme weather events (like floods, droughts, hurricanes, tornadoes, heat waves, etc.) and global warming has been of great interest to both scientists and the general public.

Any effect that global warming might have on extreme weather, however, is often difficult to ascertain. This is because extreme weather events tend to be influenced by a myriad of factors in addition to the average surface temperature. Hurricanes, for example, should tend to increase in strength as seas become warmer (2), but we also expect that changes in wind shear (the change in wind direction with height) (3) should cause a reduction in hurricane frequency (4).

There are similar countering factors that must be weighed when assessing global warming’s impact on floods, droughts, and tornadoes. One type of extreme weather event, however, can be connected to global warming in a relatively straightforward manner: heat waves. Increasing greenhouse gasses have a direct effect on the probability distribution of surface temperatures at any given location. This means that when a heat wave occurs, it is safe to assume that global warming did have some impact on the event. How much of an impact, however, depends largely on how you frame the question.

Let’s say that you live in a location that happens to experience a particular month when temperatures were far above average. Let’s further imagine that three scientists assess the contribution from global warming and their findings are reported in three news stories that use the following headlines:

Headline A: Scientist finds that global warming increased the odds of the recent heat wave by only 0.25%. 

Headline B: Scientist finds that recent heat wave was due 71% to natural variability and due 29% to global warming.  

Headline C: Scientist finds that global warming has made heat waves like the recent one occur 23 times more often than they would have otherwise.

These three headlines seem to be incompatible, and one might think that the three scientists fundamentally disagree on global warming’s role in the heat wave. After all, Headline A makes it sound like global warming played a minuscule role, Headline B makes it sound like global warming played a minor but appreciable role, and Headline C makes it sound like global warming played an enormous role.

Perhaps surprisingly, these headlines are not mutually exclusive and they could all be technically correct in describing a particular heat wave. This article explores how these different sounding conclusions can be drawn from looking at the same data and asking slightly different questions.

The actual numbers for the headlines above correspond to a real event: the monthly average temperature of March 2012 in Durham, North Carolina (5). I selected Durham for this example simply because it is where I live, and March 2012 was selected because it was the warmest month (relative to the average temperature for each month of the year) that Durham has experienced over the past several decades. Now let’s look at the specifics of how each headline was calculated.

Headline B: Calculating global warming’s contribution to the magnitude of the heat wave.

I will begin by explaining Headline B since it is probably the most straightforward calculation of the three. The left panel of the figure below shows the monthly “temperature anomaly” for Durham from 1900 to 2013 (6). The temperature anomaly is the difference between the observed temperature for each month and the long-term average for that month of the year. So a temperature anomaly of +3°C would mean that month was 3°C above average. I use temperature anomalies because heat waves are defined as periods of time when temperatures are unusually warm relative to the average for that location and time of year.

The red line in the left panel below is an estimate of long-term global warming in Durham (7), which is calculated from physics-based numerical climate models (8). The red line incorporates natural influences like changes in solar output and volcanic activity, but virtually all of the long-term warming is attributable to human-caused increases in greenhouse gasses. When I use the term global warming in this article I am specifically referring to the long-term upward trajectory of the “baseline climate” illustrated by the red line in the left panel.

So what would the temperature in Durham have looked like if there had been no global warming? We can calculate this by subtracting the estimate of global warming (red line) from each month’s temperature anomaly (black line). The result is shown in the right panel below. Notice how the right panel’s “baseline climate” is flat, indicating that there was no underlying climate change in this hypothetical scenario and all temperature variability came from natural fluctuations (9). We can see that March 2012 would still have been a hot month even without global warming but that it would not have been as hot.

[Figure 1: Durham monthly temperature anomalies, 1900-2013, with the estimated global warming signal (left) and with that signal removed (right).]
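
The two steps described above amount to simple array operations. Here is a minimal sketch in Python, with placeholder arrays standing in for the Berkeley Earth series for Durham and for the model-estimated forced signal (the helper names are hypothetical, not code from the analysis):

```python
import numpy as np

def monthly_anomalies(temps):
    """Observed monthly temperatures (deg C), length a multiple of 12, minus each
    calendar month's long-term mean (the black line in the left panel)."""
    t = np.asarray(temps, dtype=float).reshape(-1, 12)
    return (t - t.mean(axis=0)).ravel()

def remove_forced_signal(anomalies, forced_signal):
    """Subtract the model-estimated global warming signal (the red line) to get the
    hypothetical 'no global warming' anomalies (the right panel)."""
    return np.asarray(anomalies) - np.asarray(forced_signal)
```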

In fact, we can now see how Headline B was calculated. If the total anomaly with global warming in March 2012 was +6°C and the contribution from natural variability was +4.25°C, then global warming contributed +1.75°C of the +6°C anomaly. To put it another way, the global warming contribution to the magnitude of the heat wave was 29% (1.75°C/6°C = 0.29) while the natural variability contribution to the magnitude of the heat wave was 71% (4.25°C/6°C = 0.71). It is interesting to notice that if March 2012 had been even hotter, then the contribution from global warming would actually have been less. Why? Because the contribution from global warming would have been the same (the red line would not change), so natural variability would have had to contribute even more to the magnitude of a hotter anomaly. For example, if March 2012 had been 8°C above average, then global warming would still have contributed 1.75°C, which means global warming would only have contributed 1.75°C/8°C = 0.22, or 22%, of the magnitude.
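
For completeness, the same arithmetic in a few lines of Python, which makes the pattern explicit (the +1.75°C forced contribution is the value estimated above):

```python
# The Headline B arithmetic from above: the global warming share of a heat wave's
# magnitude shrinks as the total anomaly grows, because the forced contribution
# (+1.75 deg C here) stays fixed.
forced = 1.75                        # estimated global warming contribution (deg C)
for total in (6.0, 8.0):             # total observed anomaly (deg C)
    natural = total - forced
    print(f"+{total:.0f} C anomaly: {forced / total:.0%} global warming, "
          f"{natural / total:.0%} natural variability")
```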

Headline B quantifies how much global warming contributed to the magnitude of the heat wave (how hot the heat wave was), but let’s now turn our attention to how much global warming contributed to the likelihood that the heat wave would have occurred in the first place.

Headline A and C: Calculating global warming’s influence on the change in the likelihood of the heat wave.

The conclusions of Headlines A and C sound the most different, but arriving at these numbers actually requires very similar calculations. To make these types of calculations it is often assumed that, in the absence of global warming, temperature anomalies follow some kind of probability distribution. Because it is the most familiar, I will use the example of the normal distribution (a.k.a. Gaussian or bell-curve distribution) below (10).

[Figure 2: Normal distribution of Durham’s monthly temperature anomalies.]

The next step is to notice how global warming has shifted the probability distribution over time (11) (top panel below). This shows us how the +1.75°C change in the baseline temperature due to global warming has affected the probability of observing different temperature anomalies. Actually, we can now see how Headline A was calculated. Without global warming, an anomaly of +6°C or warmer was very unlikely – its chance of occurring in any given month was about 0.0117%. Even if we consider that global warming shifted the mean of the distribution by +1.75°C, an anomaly of +6°C or greater was still very unlikely – its chance of occurring in any given month was about 0.26%. So global warming increased the chance of the March 2012 Durham heat wave by 0.26% – 0.0117% = ~0.25%.

That doesn’t sound like a big change; however, this small shift in absolute probability translates into a big change in the expected frequency (how often such a heat wave should occur on average). The usual way to think about the expected frequency is to use the Return Time (12), which is the average time that you would have to wait in order to observe an extreme at or above a certain level. The middle panel below shows the Return Times for Durham temperature anomalies both with and without global warming.

A probability of 0.0117% for a +6°C anomaly indicates that without global warming such an event would have been expected about once every 8,547 months (because 1/0.000117 = 8,547). However, a probability of 0.26% for a +6°C anomaly indicates that with global warming such an event should be expected about once every 379 months (because 1/0.0026 ≈ 379). Now we can see where Headline C came from: global warming made the expected frequency 23 times larger (because 8,547/379 ≈ 23), so we expect to see a heat wave of this magnitude (or warmer) 23 times more often because of global warming.

[Figure 3: Probability distributions (top), Return Times (middle), and change in expected frequency (bottom) for Durham temperature anomalies with and without global warming.]
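
Under the normal-distribution assumption, both the Headline A and Headline C calculations reduce to evaluating the distribution’s survival function before and after the +1.75°C shift. The sketch below shows the mechanics; the standard deviation is an assumed illustrative value rather than the one fit to the Durham data, so the numbers it prints will not exactly match those quoted above:

```python
from scipy.stats import norm

# Headline A and C mechanics under a normal-distribution assumption (footnotes 10-12).
# The standard deviation is an assumed illustrative value, not the fit to the Durham
# data used in the article, so the printed numbers differ from those in the text.
sigma = 1.6           # assumed standard deviation of monthly anomalies (deg C)
shift = 1.75          # mean shift attributable to global warming (deg C)
threshold = 6.0       # heat-wave threshold (deg C)

p_without = norm.sf(threshold, loc=0.0, scale=sigma)   # P(anomaly >= +6 C) without warming
p_with = norm.sf(threshold, loc=shift, scale=sigma)    # P(anomaly >= +6 C) with warming

print(f"Headline A (absolute change in probability): {100 * (p_with - p_without):.3f}%")
print(f"Return time without warming: {1 / p_without:,.0f} months")
print(f"Return time with warming:    {1 / p_with:,.0f} months")
print(f"Headline C (frequency ratio): {p_with / p_without:.0f}x")
```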

In fact, from the bottom panel above we can see that the more extreme the heat wave, the more global warming will have increased its likelihood. This may seem counterintuitive because we have already seen that the greater the temperature anomaly, the less global warming contributed to its magnitude. This seemingly paradoxical result is illustrated in the figure below. Essentially, it takes a large contribution from natural variability to get a very hot heat wave. However, the hotter the heat wave, the more global warming will have increased its likelihood.

[Figure 4: The hotter the heat wave, the smaller global warming’s contribution to its magnitude but the larger its effect on the heat wave’s likelihood.]

So which of the three headlines is correct?

All the headlines are technically justifiable; they are simply answering different questions. Headline A answers the question: “How much did global warming change the absolute probability of a +6°C (or warmer) heat wave?” Headline B answers the question: “What proportion of the +6°C anomaly itself is due to global warming?” And Headline C answers the question: “How much did global warming change the expected frequency of a +6°C (or warmer) heat wave?”

In my judgment, only Headline A is fundamentally misleading. Since extremes have small probabilities by definition, a large relative change in the probability of an extreme will seem small when it is expressed in terms of the absolute change in probability. Headline B and Headline C, on the other hand, quantify different pieces of information that can both be valuable when thinking about global warming’s role in a heat wave.

Footnotes 

  1. The most comprehensive scientific evaluation of this statement is presented in the IPCC’s 2013, Working Group I, Chapter 10.
  2. Emanuel, K. 2005. Increasing destructiveness of tropical cyclones over the past 30 years, Nature, 436, 686-688.
  3. Vecchi, G. A., B. J. Soden. 2007. Increased tropical Atlantic wind shear in model projections of global warming, Geophys. Res. Lett., 34, L08702, doi:10.1029/2006GL028905.
  4. Knutson, T. R., J. R. Sirutis, S. T. Garner, G. A. Vecchi, I. M. Held. 2008. Simulated reduction in Atlantic hurricane frequency under twenty-first-century warming conditions, Nature Geoscience, 1 359-364 doi:10.1038/ngeo202.
  5. Data from the Berkeley Earth Surface Temperature Dataset
  6. The temperature data used here are in degrees Celsius (°C). °C are 1.8 times larger than °F so a temperature anomaly of 6°C would be 1.8×6 = 10.8°F.
  7. The global warming signal is more technically referred to as the “externally forced component of temperature change”. This is the portion of temperature change that is imposed on the ocean-atmosphere-land system from the outside and it includes contributions from anthropogenic increases in greenhouse gasses, aerosols, and land-use change as well as changes in solar radiation and volcanic aerosols.
  8. Climate model output is the multi-model mean for Durham, NC from 27 models that participated in the CMIP5 Historical Experiment
  9. The technical terms for this type of variability are “unforced” or “internal” variability. This is the type of variability that spontaneously emerges from complex interactions between ocean, atmosphere and land surface and requires no explicit external cause.
  10. There is precedent for thinking of surface temperature anomalies as being normally distributed (e.g., Hansen et al., 2012). However, it should be noted that the specific quantitative results, though not the qualitative point, of this article are sensitive to the type of distribution assumed. In particular, a more thorough analysis would pay close attention to the kurtosis of the distribution (i.e., the ‘fatness’ of the distribution’s tails) and would perhaps model it with a Generalized Pareto Distribution, as is done in Otto et al., 2012 for example. Also, instead of fitting a predefined probability distribution to the data, many stochastic simulations of temperature anomalies from a noise time series model or a physics-based climate model could be used to assess the likelihood of an extreme event (Otto et al., 2012).
  • Hansen, J., M. Sato., R. Ruedy, 2012, Perception of climate change, PNAS, vol. 109 no. 37 doi: 10.1073/pnas.1205276109.
  • Otto, F. E. L., Massey, G. J. vanOldenborgh, R. G. Jones, and M. R. Allen, 2012, Reconciling two approaches to attribution of the 2010 Russian heat wave, Geophys. Res. Lett., 39, L04702, doi:10.1029/2011GL050422.
  11. For simplicity I assume that the variance of the distribution does not change over time and that global warming has only shifted the mean of the distribution.
  12. Return Times were calculated as the inverse of the Survival Function for each of the distributions.

AGU Poster: Unforced Surface Air Temperature Anomalies and their Opposite Relationship with the TOA Energy Imbalance at Local and Global Scales

