2015 Record Warmth: Update to Our Recent Analysis

This is an update to our 2015 Scientific Reports paper: Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise. The paper used a novel statistical estimate of unforced variability that was derived from reconstructed and instrumental surface temperature records. We used our statistical estimate of unforced variability to aid in our interpretation of recently observed temperature variability (more info here).

Our paper used global temperature data through 2013 since that was the most recent year in the major global temperature datasets at the time that the paper was submitted. Below I update Figures 2 and 3 from the paper, incorporating the back-to-back record breaking warmth of 2014 and 2015.

Figure 2 updated to include 2014 and 2015.


Figure 3 updated to include 2014 and 2015.

The summary section of our paper stated:

We find that the interdecadal variability in the rate of global warming over the 20th century (i.e., acceleration from ~1910–1940, deceleration until ~1975, acceleration until ~2000) is within the 2.5–97.5% EUN, even if the forced signal is represented as a linear trend, indicating that this observed interdecadal variability in the rate of warming does not necessarily require interdecadal variability in the rate-of-increase of the forced signal.

This statement was about 20th century temperature and thus updates for 2014 and 2015 are somewhat irrelevant. Nevertheless, the updated Figure 2 (bottom left panel) indicates that recent warmth is just now starting to emerge from a linear-trend null hypothesis. This is not to say that a linear trend is the most likely representation of the forced component of variability – it just means that the linear trend forced component can’t quite be ruled out. This is now starting to change as observations move above the 97.5th percentile of the unforced range.

The summary section also stated:

We also find that recently observed GMT values, as well as trends, are near the lower bounds of the EUN for a forced signal corresponding to the RCP 8.5 emissions scenario but that observations are not inconsistent with a forced signal corresponding to the RCP 6.0 emissions scenario.

Note that we were not making a forecast about how likely the RCP 8.5 emissions scenario was. Instead, we were using the multi-model mean warming associated with the RCP 8.5 emissions scenario (out to 2050) as a representation of the quickest rate of forced warming that could conceivably be occurring over the recent past (see here and here for further clarification).

Figure 3 indicates that, with the updated data, no trend over the past 25 years falls outside of the 5-95% range for any of the scenarios. The trends over the most recent ~5 years are higher than average for all the scenarios but still well within the range of unforced variability. Over the past 10-20 years, observed trends have been on the lower end of the RCP 8.5 range but closer to the middle of the RCP 6.0 range. This indicates that over the past 10-20 years it may be more likely that we have been on an RCP 6.0-like warming trajectory than an RCP 8.5-like warming trajectory. This is similar to the conclusion of the original study.


2015 Global Temperature vs. Models

2015 was the warmest year in the instrumental record (dating back to the mid/late 19th century) in all the major surface temperature datasets including NASA’s GISTEMP:

[Figure: NASA GISTEMP annual global mean surface temperature record]

However, 2015 still falls below the CMIP5 climate model mean value (left panel below). The difference between observations and the mean value from climate models is often used as an estimate of the ‘unforced’ or ‘internal’ variability in global temperature (right panel below). It is apparent from this estimate that there was an unforced cooling event from ~1998 to ~2013. Thus the 2015 record temperature does not ‘erase’ the hiatus – it is totally legitimate to study why observations diverged from the model mean over this time period.

[Figure: observed global temperature compared to the CMIP5 multi-model mean (left) and their difference (right)]

Because of the ongoing El Nino event, 2016 will likely be even warmer than 2015, and thus 2016 may be above the climate model mean value for the first time since 1998. It will be very interesting to see what happens in 2017 and 2018. When neutral or La Nina conditions return, will observations keep up with the steep rate of warming predicted by climate models?


Heat waves: How much can be blamed on global warming depends on how you ask the question.

It is well established that human-caused increases in greenhouse gasses are working to increase the average surface temperature of the planet on long timescales [1]. This fact, however, means very little in terms of the consequences that climate change might have on human society. People are affected far more by local weather extremes than by any change in global average temperature. Therefore, the connection between extreme weather events (like floods, droughts, hurricanes, tornadoes, heat waves, etc.) and global warming has been of great interest to both scientists and the general public.

Any effect that global warming might have on extreme weather, however, is often difficult to ascertain. This is because extreme weather events tend to be influenced by a myriad of factors in addition to the average surface temperature. Hurricanes, for example, should tend to increase in strength as seas become warmer [2], but we also expect that changes in wind shear [3] (the change in wind direction with height) should cause a reduction in hurricane frequency [4].

There are similar countering factors that must be weighed when assessing global warming’s impact on floods, droughts, and tornadoes. One type of extreme weather event, however, can be connected to global warming in a relatively straightforward manner: heat waves. Increasing greenhouse gasses have a direct effect on the probability distribution of surface temperatures at any given location. This means that when a heat wave occurs, it is safe to assume that global warming did have some impact on the event. How much of an impact, however, depends largely on how you frame the question.

Let's say that you live in a location that happens to experience a particular month when temperatures were far above average. Let's further imagine that three scientists assess the contribution from global warming and that their findings are reported in three news stories that use the following headlines:

Headline A: Scientist finds that global warming increased the odds of the recent heat wave by only 0.25%. 

Headline B: Scientist finds that recent heat wave was due 71% to natural variability and due 29% to global warming.  

Headline C: Scientist finds that global warming has made heat waves like the recent one occur 23 times more often than they would have otherwise.

These three headlines seem to be incompatible, and one might think that the three scientists fundamentally disagree on global warming's role in the heat wave. After all, Headline A makes it sound like global warming played a minuscule role, Headline B makes it sound like global warming played a minor but appreciable role, and Headline C makes it sound like global warming played an enormous role.

Perhaps surprisingly, these headlines are not mutually exclusive and they could all be technically correct in describing a particular heat wave. This article explores how these different sounding conclusions can be drawn from looking at the same data and asking slightly different questions.

The actual numbers for the headlines above correspond to a real event: the monthly average temperature of March 2012 in Durham, North Carolina [5]. I selected Durham for this example simply because it is where I live, and March 2012 was selected because it was the warmest month (relative to the average temperature for each month of the year) that Durham has experienced over the past several decades. Now let's look at the specifics of how each headline was calculated.

Headline B: Calculating global warming’s contribution to the magnitude of the heat wave.

I will begin by explaining Headline B since it is probably the most straightforward calculation of the three. The left panel of the figure below shows the monthly “temperature anomaly” for Durham from 1900 to 2013 [6]. The temperature anomaly is the difference between the observed temperature for each month and the long-term average for that month of the year. So a temperature anomaly of +3°C would mean that month was 3°C above average. I use temperature anomalies because heat waves are defined as periods of time when temperatures are unusually warm relative to the average for that location and time of year.

The red line in the left panel below is an estimate of long-term global warming in Durham [7], which is calculated from physics-based numerical climate models [8]. The red line incorporates natural influences like changes in solar output and volcanic activity, but virtually all of the long-term warming is attributable to human-caused increases in greenhouse gasses. When I use the term global warming in this article I am specifically referring to the long-term upward trajectory of the “baseline climate” illustrated by the red line in the left panel.

So what would the temperature in Durham have looked like if there had been no global warming? We can calculate this by subtracting the estimate of global warming (red line) from each month's temperature anomaly (black line). The result is shown in the right panel below. Notice how the right panel's “baseline climate” is flat, indicating that there was no underlying climate change in this hypothetical scenario and all temperature variability came from natural fluctuations [9]. We can see that March 2012 would still have been a hot month even without global warming, but that it would not have been as hot.

[Figure: Durham monthly temperature anomalies, 1900-2013, with the estimated global warming signal (left) and with that signal removed (right)]

In fact, we can now see how Headline B was calculated. If the total anomaly with global warming in March 2012 was +6°C and the contribution from natural variability was +4.25°C, then global warming contributed +1.75°C of the +6°C anomaly. To put it another way, the global warming contribution to the magnitude of the heat wave was 29% (1.75°C/6°C = 0.29) while the natural variability contribution to the magnitude of the heat wave was 71% (4.25°C/6°C = 0.71). It is interesting to notice that if March 2012 had been even hotter, then the contribution from global warming would actually have been less. Why? Because the contribution from global warming would have been the same (the red line would not change), so natural variability would have needed to contribute even more to the magnitude of a hotter anomaly. For example, if March 2012 had been 8°C above average, then global warming would still have contributed 1.75°C, which means global warming would only have contributed 1.75°C/8°C = 0.22, or 22%, of the magnitude.
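
To make the arithmetic concrete, here is a minimal sketch of the Headline B bookkeeping. The +6°C and +1.75°C values come from the Durham example above; everything else is just division.

```python
# Headline B: fraction of the heat wave's magnitude attributable to global warming.
# Values below come from the March 2012 Durham example in the text.

total_anomaly = 6.0          # observed temperature anomaly (deg C)
forced_contribution = 1.75   # estimated global-warming (forced) contribution (deg C)

natural_contribution = total_anomaly - forced_contribution   # +4.25 deg C

warming_fraction = forced_contribution / total_anomaly    # ~0.29 -> "29% global warming"
natural_fraction = natural_contribution / total_anomaly   # ~0.71 -> "71% natural variability"

print(f"Global warming: {warming_fraction:.0%}, natural variability: {natural_fraction:.0%}")

# The same forced contribution applied to a hypothetical, hotter +8 deg C anomaly
# yields a *smaller* fractional contribution (~22%), as noted in the text.
print(f"Hypothetical +8 C anomaly: {forced_contribution / 8.0:.0%} from global warming")
```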

Headline B quantifies how much global warming contributed to the magnitude of the heat wave (how hot the heat wave was), but let's now turn our attention to how much global warming contributed to the likelihood that the heat wave would have occurred in the first place.

Headlines A and C: Calculating global warming's influence on the change in the likelihood of the heat wave.

The conclusions of Headlines A and C sound the most different, but arriving at these numbers actually requires very similar calculations. To make these types of calculations it is often assumed that, in the absence of global warming, temperature anomalies follow some kind of a probability distribution. Because it is the most familiar, I will use the example of the normal distribution (a.k.a. Gaussian or bell-curve distribution) below [10].

[Figure: assumed normal (bell-curve) distribution of Durham monthly temperature anomalies in the absence of global warming]

The next step is to notice how global warming has shifted the probability distribution over time [11] (top panel below). This shows us how the +1.75°C change in the baseline temperature due to global warming has affected the probability of observing different temperature anomalies. With this shift in hand, we can see how Headline A was calculated. Without global warming, an anomaly of +6°C or warmer was very unlikely – its chance of occurring in any given month was about 0.0117%. Even if we consider that global warming shifted the mean of the distribution by +1.75°C, an anomaly of +6°C or greater was still very unlikely – its chance of occurring in any given month was about 0.26%. So global warming increased the chance of the March 2012 Durham heat wave by 0.26% – 0.0117% = ~0.25%.

That doesn't sound like a big change; however, this small shift in absolute probability translates into a big change in the expected frequency (how often such a heat wave should occur on average). The usual way to think about the expected frequency is to use the Return Time [12], which is the average time that you would have to wait in order to observe an extreme at or above a certain level. The middle panel below shows the Return Times for Durham temperature anomalies both with and without global warming.

A probability of 0.0117% for a +6°C anomaly indicates that without global warming this would have been a once-in-8,547-months event (because 1/0.000117 = 8,547). However, a probability of 0.26% for a +6°C anomaly indicates that with global warming this should be a once-in-379-months event (because 1/0.0026 ≈ 379). Now we can see where Headline C came from: global warming made the expected frequency 23 times larger (because 8,547/379 ≈ 23), so we expect to see a heat wave of this magnitude (or warmer) 23 times more often because of global warming.
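
The same calculations can be sketched in a few lines of code. Footnotes 11 and 12 state the assumptions (a normal distribution whose mean is shifted by global warming, with Return Times taken as the inverse of the survival function); the standard deviation below is an assumed, illustrative value, since the fitted value is not given in the text, so the output will be close to, but not exactly, the 0.0117% and 0.26% quoted above.

```python
# Sketch of the Headline A / Headline C calculations under the stated assumptions:
# normally distributed anomalies whose mean is shifted by global warming.
from scipy.stats import norm

sigma = 1.6        # assumed standard deviation of monthly anomalies (deg C), illustrative
shift = 1.75       # mean shift attributed to global warming (deg C, from the text)
threshold = 6.0    # heat-wave threshold (deg C anomaly)

p_no_warming = norm.sf(threshold, loc=0.0, scale=sigma)      # P(anomaly >= 6 C), no warming
p_with_warming = norm.sf(threshold, loc=shift, scale=sigma)  # P(anomaly >= 6 C), with warming

# Headline A: change in absolute probability
print(f"Absolute probability change: {100 * (p_with_warming - p_no_warming):.3f}%")

# Return times (footnote 12): inverse of the survival function
print(f"Return time without warming: {1 / p_no_warming:,.0f} months")
print(f"Return time with warming:    {1 / p_with_warming:,.0f} months")

# Headline C: how many times more frequent the event has become
print(f"Change in expected frequency: {p_with_warming / p_no_warming:.0f}x")
```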

[Figure: probability distributions of Durham temperature anomalies with and without global warming (top), the corresponding Return Times (middle), and the change in likelihood as a function of anomaly magnitude (bottom)]

In fact, from the bottom panel above we can see that the more extreme the heat wave, the more global warming will have increased its likelihood. This may seem counterintuitive because we have already seen that the greater the temperature anomaly, the less global warming contributed to its magnitude. This seemingly paradoxical result is illustrated in the figure below. Essentially, it takes a large contribution from natural variability to get a very hot heat wave. However, the hotter the heat wave, the more global warming will have increased its likelihood.

[Figure: illustration of why global warming's fractional contribution to a heat wave's magnitude shrinks, while its effect on the heat wave's likelihood grows, as the heat wave becomes more extreme]

So which of the three headlines is correct?

All the headlines are technically justifiable; they are simply answering different questions. Headline A answers the question: “How much did global warming change the absolute probability of a +6°C (or warmer) heat wave?” Headline B answers the question: “What proportion of the +6°C anomaly itself is due to global warming?” And Headline C answers the question: “How much did global warming change the expected frequency of a +6°C (or warmer) heat wave?”

In my judgment, only Headline A is fundamentally misleading. Since extremes have small probabilities by definition, a large relative change in the probability of an extreme will seem small when it is expressed in terms of the absolute change in probability. Headline B and Headline C, on the other hand, quantify different pieces of information that can both be valuable when thinking about global warming’s role in a heat wave.

Footnotes 

  1. The most comprehensive scientific evaluation of this statement is presented in the IPCC's 2013 Working Group I report, Chapter 10.
  2. Emanuel, K. 2005. Increasing destructiveness of tropical cyclones over the past 30 years, Nature, 436, 686-688.
  3. Vecchi, G. A., B. J. Soden. 2007. Increased tropical Atlantic wind shear in model projections of global warming, Geophys. Res. Lett., 34, L08702, doi:10.1029/2006GL028905.
  4. Knutson, T. R., J. R. Sirutis, S. T. Garner, G. A. Vecchi, I. M. Held. 2008. Simulated reduction in Atlantic hurricane frequency under twenty-first-century warming conditions, Nature Geoscience, 1, 359-364, doi:10.1038/ngeo202.
  5. Data from the Berkeley Earth Surface Temperature Dataset
  6. The temperature data used here are in degrees Celsius (°C). °C are 1.8 times larger than °F so a temperature anomaly of 6°C would be 1.8×6 = 10.8°F.
  7. The global warming signal is more technically referred to as the “externally forced component of temperature change”. This is the portion of temperature change that is imposed on the ocean-atmosphere-land system from the outside and it includes contributions from anthropogenic increases in greenhouse gasses, aerosols, and land-use change as well as changes in solar radiation and volcanic aerosols.
  8. Climate model output is the multi-model mean for Durham, NC from 27 models that participated in the CMIP5 Historical Experiment
  9. The technical terms for this type of variability are “unforced” or “internal” variability. This is the type of variability that spontaneously emerges from complex interactions between ocean, atmosphere and land surface and requires no explicit external cause.
  10. There is precedent for thinking of surface temperature anomalies as being normally distributed (e.g., Hansen et al., 2012). However, it should be noted that the specific quantitative results, though not the qualitative point, of this article are sensitive to the type of distribution assumed. In particular, a more thorough analysis would pay close attention to the kurtosis of the distribution (i.e., the ‘fatness’ of the distribution’s tails) and would perhaps model it through a Generalized Pareto Distribution, as is done in Otto et al., 2012, for example. Also, instead of fitting a predefined probability distribution to the data, many stochastic simulations of temperature anomalies from a noise time series model or a physics-based climate model could be used to assess the likelihood of an extreme event (Otto et al., 2012).
  • Hansen, J., M. Sato, and R. Ruedy, 2012, Perception of climate change, PNAS, 109 (37), doi:10.1073/pnas.1205276109.
  • Otto, F. E. L., N. Massey, G. J. van Oldenborgh, R. G. Jones, and M. R. Allen, 2012, Reconciling two approaches to attribution of the 2010 Russian heat wave, Geophys. Res. Lett., 39, L04702, doi:10.1029/2011GL050422.
  11. For simplicity I assume that the variance of the distribution does not change over time and that global warming has only shifted the mean of the distribution.
  12. Return Times were calculated as the inverse of the Survival Function for each of the distributions.

AGU Poster: Unforced Surface Air Temperature Anomalies and their Opposite Relationship with the TOA Energy Imbalance at Local and Global Scales

[AGU poster image]


Response to Robert Tracinski’s article: “What It Would Take to Prove Global Warming”

Libertarian writer Robert Tracinski recently wrote an article called “What It Would Take to Prove Global Warming” in which he challenged mainstream climate science on a number of issues. I was asked by a few people to give my thoughts on the article, so I have written an informal response to several of the article's claims below.

Picking up where the article gets substantive…

When I refer to “global warming,” and when Bailey and Adler refer to it, that term is a stand-in, not just for the trivial claim that average global temperatures are rising, but for “catastrophic anthropogenic global warming”: i.e., global temperatures are rising, it’s our fault, and we’re all gonna die.

Response: Tracinski starts off by creating a straw man argument that is easy for him to defeat. Serious scientists/policy experts do not tout the claim that “we're all gonna die”. The important question is not whether or not “we're all going to die”; the important question is whether or not it would be a net benefit to society and the environment if we regulate/reduce greenhouse gas emissions.

I’ve gone on record a long time ago sketching out what stages would be required to demonstrate that humans are causing rising global temperatures, never mind the much more dubious proposition that warmer weather is going to be a catastrophe. Let me elaborate on it here.

There are three main requirements.

1) A clear understanding of the temperature record.

The warmists don’t just have to show that temperatures are getting warmer, because variation is normal. That’s what makes “climate change” such an appallingly stupid euphemism. The climate is always changing. The environmentalists are the real climate-change “deniers” because they basically want global temperatures to maintain absolute stasis relative to 1970—not coincidentally the point at which environmentalists first began paying any attention to the issue.

Response: It may be generally true that “variation is normal” but the rate of warming that we have observed over the past century has been demonstrated to be outside the range of natural variability. Most of the natural climate changes that Earth has experienced in the past have occurred at rates much slower than the climate change we are currently experiencing. For example, it took 10,000 years for the earth to warm 9 degrees Fahrenheit when we came out of the last ice age. If humans decide to burn all remaining fossil fuels, we are looking at a similar magnitude of warming over 200-300 years instead of 10,000. It is the rate of climate change, not necessarily the magnitude, that has people most concerned.

The bottom line is that climate does change with or without human actions but science has demonstrated that humans are the dominant cause of warming over the past century and this warming is occurring at a rate that is much faster than previous natural climate changes.

So to demonstrate human-caused global warming, we would have to have a long-term temperature record that allows us to isolate what the normal baseline is, so we know what natural variation looks like and we can identify any un-natural, man-made effect. A big part of the problem is that we only have accurate global thermometer measurements going back 135 years—a blink of an eye on the time-scales that are relevant to determining natural variation of temperature. Within that, we only have a few decades of warming that could conceivably be blamed on human emissions of carbon dioxide: a minor run up in temperatures from the 1970s to the late 1990s.

Response: Coincidentally, I just published a study that looks at natural variability in temperature over the past 1000 years and compares that to the magnitude of temperature change that we have experienced recently. In order to get estimates of variability over the past 1000 years you need temperature proxies – things in the environment that co-vary with temperature like ice layer size, tree ring width, ocean sediment chemistry, etc. There is definitely a great deal of uncertainty associated with estimating temperature from 1000 years ago, but all the available evidence suggests that the warming we have experienced over the past century is larger than what could be expected from natural variation.

Since then, warming has leveled off (despite strenuous attempts to pretend otherwise). I think it’s impossible to claim, on that basis, that we even know what natural temperature variation is, much less to demonstrate that we’ve deviated from it.

Response: 2014 was the warmest year in the instrumental record (see graph below). The rate of global warming varies from decade to decade because of a lot of factors but clearly the long-term trend is up.

[Figure: NASA GISTEMP global temperature record showing 2014 as the warmest year to date]

Various environmentalist attempts to create a “hockey stick” that makes current temperatures look abnormal have been embarrassing failures, involving problems like an improper mixing of recent thermometer measurements with less accurate “proxy” measurements that estimate temperatures farther into the past. And they prove my point about warmists being believers in climate stasis. The hockey stick graphs all assume that global temperature have been basically flat for 2,000 or 10,000 years, so that minor recent warming looks like a radical departure. Who’s really denying climate change?

Response: Many groups of professional scientists (who may or may not be “environmentalists”) have used different pieces of evidence to estimate how temperatures have changed over the past millennium (see graph below). It is absolutely false to say that these temperature estimates “assume that global temperatures have been basically flat”. These studies do not assume anything about how temperatures vary in the past. Any ‘flatness’ that these graphs show is a result, not an assumption of the studies. Also, all of these studies show that the warming of the 20th century is abnormal compared to natural variability.

[Figure: published reconstructions of temperature over the past millennium, as compiled by the IPCC]

And if you look at temperatures on the really big scale, we’re all just playing for time until the next ice age comes.

Response: Human-caused increases in greenhouse gasses already guarantee that we will not be headed into an ice age any time in the next few millennia.

Assuming we can eventually compile a temperature record that is long enough and reliable enough to distinguish the effect of human activity from natural variation, we would also have to understand how human beings are causing this effect. Which leads us to the second big requirement.

Response: We already have temperature records long enough and reliable enough to distinguish the effect of human activity. Tracinski just doesn’t like the implications of these records so he assumes they must be wrong.

2) A full understanding of the underlying physical mechanisms.

We have to know what physical mechanisms determine global temperatures and how they interact. The glibbest thing said by environmentalists—and proof that the person who says it has no understanding of science—is that human-caused global warming is “basic physics” because we know carbon dioxide is a greenhouse gas. Carbon dioxide is a very weak greenhouse gas and there is no theory that claims it can cause runaway warming all on its own.

Response: No scientist that I know of claims that CO2 causes ‘runaway’ warming. However, basic physics does tell us that carbon dioxide is a greenhouse gas and that, all else being equal, increasing CO2 will increase temperature. CO2 may be a relatively weak greenhouse gas compared to some others, but humans are putting so much of it into the atmosphere (about 40 billion metric tons per year) that it still has a large effect.

The warmists’ theory requires feedback mechanisms that amplify the effect of carbon dioxide. Without that, there is no human-caused global warming. But those feedback mechanisms are dubious, unproven assumptions.

Response: There is still human-caused global warming with or without “feedback mechanisms”. However, it is true that feedbacks amplify the impact of increasing CO2. For example, if you increase CO2, you warm the planet. This causes ice to melt, which causes the planet to reflect less solar energy back to space. This results in more solar energy being absorbed, which causes further warming. Feedback mechanisms have been studied extensively and we know a lot about them. They certainly are not “dubious, unproven assumptions”.

Basic questions about the “sensitivity” of the climate to carbon dioxide have never been answered. Even Bailey admits this.

In recent years, there has [been] a lot of back and forth between researchers trying to refine their estimates of climate sensitivity. At the low end, some researchers think that temperatures would increase a comparatively trivial 1.5 degrees Celsius; on the high end, some worry it could go as high as high 6 degrees Celsius…. In a 2014 article in Geophysical Research Letters, a group of researchers calculated that it would take another 20 years of temperature observations for us to be confident that climate sensitivity is on the low end and more than 50 years of data to confirm the high end of the projections.

Response: It is true that there is a wide range of possibilities for how much warming we expect for a given amount of CO2 increase.

Well, fine then. Is it okay if we wait? (No, it isn’t, and I’ll get to the implications of that in a few moments.)

And this leaves out the possibility that the climate’s sensitivity to carbon dioxide is even lower, that other mechanisms such as cloud-formation might serve to dampen temperature increases.

Response: This possibility is “left out” because there is no evidence that it is a real possibility worth considering.

Recently, I was amused at news that new science is debunking the “low sodium” diet fad of the past few decades. It turns out that “the low levels of salt recommended by the government might actually be dangerous” (which is not so amusing). This seems like a timely warning. Like the human body, the global climate is a hugely complicated system with a lot of factors that interact. We’re not even close to understanding it all, and having the government jump in and pick sides risks cementing a premature “consensus.”

Response: I agree that the climate is extremely complex and there is always the potential for surprises. But there is a difference between not knowing everything and knowing nothing. The vast majority of scientists who study this issue would say that we know enough about climate to be confident that humans are currently causing warming that is above and beyond natural climate variability.

The immense, untamed complexity of the climate is reflected in the poor performance of computerized climate models, which leads us to our last major hurdle in proving the theory of global warming.

3) The ability to make forecasting models with a track record of accurate predictions over the very long term.

We don’t know whether current warming departs from natural variation, nor have scientists proven the underlying mechanisms by which humans could cause such an increase.

Response: Just to reiterate the points above, we do know that the warming over the past century departs from what would have happened without human greenhouse gas inputs and scientists absolutely understand the underlying physical mechanisms.

But even if we did know these things, we would have to be able to forecast with reasonable accuracy how big the effect is going to be. A very small warming may not even be noticeable or may have mostly salutary effects, such as a slightly longer growing season, whereas the impact of a much larger warming is likely to cause greater disruption.

I should also point out that the “catastrophic” part of “catastrophic anthropogenic global warming” is a much larger question that is even harder to forecast. For example, global warming was supposed to lead to more hurricanes, which is why movie posters for Al Gore’s An Inconvenient Truth featured a hurricane emerging from an industrial smokestack. Then hurricane activity in the Atlantic promptly receded to historical lows.

Response: I essentially agree with this point. The science on the connection between human-caused climate change and any change in hurricanes is fiercely debated in the scientific community and I believe it was scientifically irresponsible for An Inconvenient Truth to use the hurricane in its logo.

It’s pretty clear that scientists aren’t any good yet at making global climate forecasts. Current temperatures are at or below the low range of all of the climate models. Nobody predicted the recent 17-year-long temperature plateau. And while they can come up with ad hoc explanations after the fact for why the data don’t match their models, the whole point of a forecast is to be able to get the right answer before the data comes in.

Response: On the decade-to-decade timescale there are a lot of factors other than CO2 that affect the global temperature. That means that CO2 can go up for a couple of decades while global temperatures remain flat. It also means that CO2 could remain flat for a couple of decades while global temperatures warm. However, on longer timescales (e.g., 100 years) CO2 has a much more dominant impact on global temperature. It is this long timescale that is most important for climate policy.

Given the abysmal record of climate forecasting, we should tell the warmists to go back and make a new set of predictions, then come back to us in 20 or 30 years and tell us how these predictions panned out. Then we’ll talk.

Response: I assume that Tracinski is against climate change mitigation policy because he fears that regulating greenhouse gas emissions will negatively impact the economy. If that is the case, then it would be fair to point out that economic models have a poor track record of correctly forecasting the future. Therefore, I think Tracinski has created an unfair double standard. He demands that climate models have the ability to perfectly forecast the future while simultaneously giving a pass to economic models.

The bottom line is that the future is hard to predict. The precise amount of warming that we will get for a given change in CO2 is hard to predict, and the economic impact of climate change mitigation policies is also hard to predict. Any serious appraisal of a CO2 mitigation policy would take into account the uncertainty of both and would not unfairly pretend that one side of the balance has no uncertainty.

Ah, but we’re not going to be allowed to wait. And that’s one of the things that is deeply unscientific about the global warming hysteria. The climate is a subject which, by its nature, requires detailed study of events that take many decades to unfold. It is a field in which the only way to gain knowledge is through extreme patience: gather painstaking, accurate data over a period of centuries, chug away at making predictions, figure out 20 years later that they failed, try to discover why they failed, then start over with a new set of predictions and wait another 20 years. It’s the kind of field where a conscientious professional plugs away so maybe in some future century those who follow after him will finally be able to figure it all out.

Response: The science of climatology has already been going through this process for a long time. The greenhouse effect was discovered in the 1820s, the first scientific paper written about CO2’s impact on global temperature came out in 1896, and the first major assessment report on human-caused climate change came out in 1979.

Yet this is the field that has suddenly been imbued with the Fierce Urgency of Now. We have to know now what the climate will do over the next 100 years, we have to decide now, we have to act now. So every rule of good science gets trampled down in the stampede. Which also explains the partisan gap on this issue, because we all know which side of the political debate stands to benefit from the stampede. And it’s not the right.

Response: The reason people feel that this issue is so urgent is that, in order to stabilize global temperature (stabilize, not bring temperatures back down to where they were), we would need to reduce CO2 emissions by ~80% from their current level. However, global CO2 emissions continue to grow. All reasonable policy/economic outlooks say that it would take decades to stabilize CO2 emissions and many more decades to bring them down to a level where global temperatures would stop rising. Because of these huge time lags, avoiding large global warming in the coming centuries requires that we begin to reduce greenhouse gas emissions now.


Climate Model Primer

1) What is a climate model?

In order to correctly interpret climate model output it is important to first understand what a computer climate model is. Climate models are software that simulate the whole earth system by using our knowledge of physics at smaller spatial scales. Climate models break the entire earth into three-dimensional cubes (or grid boxes) and thousands of physical equations are used to simulate the state of each grid box as a function of time (Figure 1).

Because climate models are based on physics (as opposed to the statistics of past events), they can be used to make predictions that have no analogs in the historical record. For example, you could ask the question: “What would India’s climate be like if Africa didn’t exist?” Obviously you can’t use historical statistics to answer such a question but a climate model can make a reasonable prediction based on physics. Similarly, when a climate model projects the climate of 2100 under enhanced greenhouse gas concentrations, it is basing this projection on physics and not any historical analog.


Figure 1. Illustration of the three-dimensional grid of a climate model (from Ruddiman, 2000)
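
As a purely illustrative cartoon of the grid-box idea in Figure 1, the sketch below steps a one-dimensional ring of boxes forward in time with a crude energy budget and neighbor-to-neighbor heat exchange. Every number in it is a made-up placeholder; real climate models solve thousands of coupled equations per three-dimensional box.

```python
# Toy illustration of the grid-box idea: each box has a temperature that is stepped
# forward in time using a simple energy budget (absorbed sunlight in, longwave out)
# plus heat exchange with its neighbors.  All values are illustrative placeholders.
import numpy as np

n_boxes = 36                                 # boxes along a single ring (toy 1-D "grid")
heat_capacity = 1.0e8                        # J m-2 K-1, assumed effective heat capacity per box
solar_in = np.linspace(300, 150, n_boxes)    # W m-2, more sun at low "latitudes" (illustrative)
albedo = 0.3
sigma_sb = 5.67e-8                           # Stefan-Boltzmann constant
diffusion = 0.5                              # W m-2 K-1, crude neighbor heat-exchange coefficient
dt = 3 * 3600                                # a 3-hour time step, as mentioned in the text

T = np.full(n_boxes, 280.0)                  # initial temperature of every box (K)

for step in range(10000):                    # integrate forward in time
    absorbed = (1 - albedo) * solar_in
    emitted = sigma_sb * T**4                                        # outgoing longwave per box
    exchange = diffusion * (np.roll(T, 1) - 2 * T + np.roll(T, -1))  # neighbor mixing
    T += dt * (absorbed - emitted + exchange) / heat_capacity

print("Equilibrated toy temperatures (K):", np.round(T[::6], 1))
```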

2) Future Climate Change Projections Based on Greenhouse Gas Scenarios

 Climate models are the primary tools used to project changes in the earth system under enhanced greenhouse gas concentrations. Climate models take information about greenhouse gas emissions or concentrations as an input and then simulate the earth system based on these numbers.

3) How greenhouse gas scenarios are input into the models

For CO2, a climate model that features a fully coupled “carbon cycle model” will only require anthropogenic emissions of CO2 as an input and it will predict the atmospheric concentrations of CO2. If a climate model does not have an embedded carbon cycle model, it will require atmospheric concentrations of CO2 as an input variable. The emissions and concentrations that are used as inputs to the model can be whatever the user desires.
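
A toy sketch of the two input styles described above. The constants (roughly 2.13 GtC per ppm of CO2 and an airborne fraction of about 45%) are standard rough numbers, and the fixed airborne fraction stands in for a real carbon cycle model, so this is only a cartoon of an emission-driven versus concentration-driven setup.

```python
# Illustrative sketch of the two kinds of CO2 input.  A concentration-driven run is
# simply handed a CO2 time series; an emission-driven run must convert emissions to
# concentrations itself.  Real carbon-cycle models track ocean and land uptake
# explicitly; here that is crudely replaced by an assumed constant airborne fraction.

GTC_PER_PPM = 2.13          # ~2.13 GtC raises atmospheric CO2 by about 1 ppm
AIRBORNE_FRACTION = 0.45    # rough fraction of emitted CO2 that stays in the atmosphere

def concentrations_from_emissions(annual_emissions_gtc, c0_ppm=400.0):
    """Convert a list of annual CO2 emissions (GtC/yr) to concentrations (ppm)."""
    concentrations = []
    c = c0_ppm
    for e in annual_emissions_gtc:
        c += AIRBORNE_FRACTION * e / GTC_PER_PPM
        concentrations.append(round(c, 1))
    return concentrations

# Emission-driven input: the model sees only emissions (10 GtC/yr here, illustrative)
print(concentrations_from_emissions([10.0] * 5))

# Concentration-driven input: the user supplies the ppm values directly
prescribed_ppm = [400.0, 402.1, 404.2, 406.3, 408.4]
```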

4) Representation of greenhouse gasses in climate models

The most consequential effect of CO2 in a climate model is that it interacts differently with incoming shortwave radiation than it does with outgoing longwave radiation (this is essentially the definition of a greenhouse gas). The climate model keeps track of radiation in every atmospheric grid box (Figure 1) and at every time step. This is the primary way in which a change in greenhouse gasses will affect the climate of the model. As an example, if you double CO2 concentrations in a climate model, that model might predict that average rainfall intensity will increase over some arbitrary region. However, the only direct effect of changing the CO2 concentration was that it changed outgoing longwave radiation. Any other change that the model simulates, e.g., the change in rainfall intensity, was secondary and came about due to a series of physical links. These links are not necessarily obvious and many scientists spend their careers trying to figure them out.
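
Climate models compute radiative transfer explicitly in every grid box, but the size of CO2's direct longwave effect is often summarized with the simplified expression of Myhre et al. (1998), ΔF ≈ 5.35 ln(C/C0) W m⁻². The snippet below just evaluates that expression to show the magnitude involved; it is not how a climate model represents CO2 internally.

```python
# Approximate global-mean radiative forcing from a CO2 change, using the simplified
# expression dF ~ 5.35 * ln(C/C0) W m-2 (Myhre et al., 1998).  For illustration only.
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate forcing (W m-2) from changing CO2 from c0_ppm to c_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

print(f"Forcing for doubled CO2: {co2_forcing(560.0):.2f} W m-2")  # ~3.7 W m-2
```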

5) Weather vs. Climate.

Climate is often colloquially defined as “average weather”. Therefore, in order to know the climate at a given location, you generally need weather records over a long period of time (usually 30 years or more). Weather is random and difficult to predict (it is nearly impossible to know if it will rain in New York on July 17th 2015). Climate, on the other hand, is relatively predictable (New York should expect about 120 mm of rain per July on average).

It is important to understand that climate models do not actually simulate climate. Instead, climate models simulate weather. For example, if you run a climate model from the year 2000 to the year 2100 (incorporating an estimated increase in greenhouse gas concentrations) the climate model will simulate the weather over the entire surface of the earth and output its calculations every three hours or so for 100 years. Therefore, the climate model will simulate the weather in New York on July 17th 2088. However, because weather is so unpredictable, nobody should take this “weather forecast” seriously. What may be taken seriously, however, are the long-term trends in July precipitation in New York over the entire course of the simulation.
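
A toy illustration of the weather-versus-climate point: any single simulated July is essentially noise, but the long-term average converges on a stable climatological value. The numbers below are synthetic and illustrative.

```python
# One noisy "July rainfall total" per year is weather; their long-term mean is climate.
import numpy as np

rng = np.random.default_rng(0)
n_years = 100
july_rain_mm = 120 + rng.normal(0, 40, size=n_years)   # synthetic July totals (mm)

print("One simulated July (weather, not to be taken literally):",
      round(july_rain_mm[0], 1), "mm")
print("Long-term July average (climate):", round(july_rain_mm.mean(), 1), "mm")
```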

6) How to interpret climate model output

Imagine that we have climate model output for some climate variable (at a given location) over the next 100 years under a scenario of increasing greenhouse gas concentrations (Figure 2A). We notice that the variable increases and peaks during “Time period A” and then decreases until “Time period B”. We want to know if this pattern is part of the climate change signal (i.e., did the increase in greenhouse gasses cause this pattern?) or if it is just the result of random weather noise. The only way to find out is to run the climate model again (allowing it to simulate different random weather) and see if the same pattern shows up again.

Figure 2B shows output from a 2nd climate model run which incorporated the same increase in greenhouse gas concentrations as the 1st. We see that, in this case, the variable is higher over “Time period B” than it was over “Time period A”. This indicates that the pattern we saw in the first run may have just been due to random weather noise and not a part of the climate change signal that we are most interested in. To be sure, we can look at a large number of different climate model runs that incorporate the same increase in greenhouse gas concentrations (Figure 2C). It becomes apparent that the common attribute between the runs is a long-term upward trend. If we have enough runs we can average them together to get an estimate of what the true climate change signal is (Figure 2D). Climate is the average weather so averaging over all of the models (taking the “multi-model-mean”) should leave only the portion of the change that is due to the increased greenhouse gas concentrations.


Figure 2. Illustration of the difference between the random weather produced by a single climate model run and the climate change signal that can only be seen when multiple climate model runs are averaged together.
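
The idea behind Figure 2 can be mimicked with synthetic data: give every "run" the same underlying trend plus its own random noise, and the multi-run average recovers the trend. The trend and noise amplitude below are arbitrary illustrative choices, not values from any real model.

```python
# Averaging many synthetic "runs" (shared signal + independent noise) isolates the signal.
import numpy as np

rng = np.random.default_rng(1)
n_runs, n_years = 30, 100
years = np.arange(n_years)

signal = 0.02 * years                                  # assumed forced trend (deg C per year)
noise = rng.normal(0, 0.15, size=(n_runs, n_years))    # independent "weather" noise per run
runs = signal + noise                                  # each row behaves like one model run

multi_model_mean = runs.mean(axis=0)                   # the noise averages toward zero

print("Single-run change over the century:", round(runs[0, -1] - runs[0, 0], 2), "deg C")
print("Multi-run-mean change:             ",
      round(multi_model_mean[-1] - multi_model_mean[0], 2), "deg C")
```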

7) Comparing the output of two different scenarios with different greenhouse increases

Figure 3A illustrates a hypothetical comparison of climate model output under two different greenhouse gas emissions scenarios. It would be difficult to interpret the results if we only had a single climate model run from each scenario because we might happen to see a progression that is not representative of the underlying climate change signal. For example, it would be possible to see a green run that showed a larger increase in the variable than a blue run even though their respective climate change signals indicate the reverse. In order to compare the climate change signals between the two scenarios we need to average over a number of runs (Figure 3B). It is then possible to use the average and the spread about the average to test whether or not the difference in the emissions scenarios had a meaningful effect on the output variable at some time in the future.


Figure 3. Illustration of climate model output for two different emissions scenarios (blue and green) 
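
A similarly synthetic sketch of the Figure 3 comparison: generate a small ensemble for each "scenario" and compare the separation of the ensemble means with the run-to-run spread at the end of the simulation. Again, the trends and noise level are arbitrary illustrative values.

```python
# Compare two synthetic scenario ensembles: is the difference in their means large
# relative to the spread of the individual runs?
import numpy as np

rng = np.random.default_rng(2)
n_runs, n_years = 30, 100
years = np.arange(n_years)

def make_ensemble(trend_per_year):
    """One synthetic 'scenario': a shared trend plus independent weather noise per run."""
    noise = rng.normal(0, 0.15, size=(n_runs, n_years))
    return trend_per_year * years + noise

blue = make_ensemble(0.010)    # weaker-forcing scenario (illustrative)
green = make_ensemble(0.025)   # stronger-forcing scenario (illustrative)

separation = green[:, -1].mean() - blue[:, -1].mean()
spread = np.sqrt(green[:, -1].std() ** 2 + blue[:, -1].std() ** 2)

print(f"Difference in ensemble means at year 100: {separation:.2f}")
print(f"Combined run-to-run spread:               {spread:.2f}")
```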

8) Issues to keep in mind when interpreting climate model output

  1. Signal to noise ratio may be very small

For many variables, the underlying climate change signal (thick lines in Figures 2 and 3B) may change only a tiny amount relative to the size of the variations in the random weather noise. This means that when we compare the multi-model-mean from one greenhouse gas emissions scenario to another, the difference we see may be due almost entirely to random weather noise instead of being due to some climate change signal.

  2. Climate change in the real world will not follow the smooth signal

There is random weather in the real climate system, just like there is in the individual climate model runs. So the true evolution of a variable in the future will not look like the smooth signal. Instead it will look like one of the individual erratic runs. This means that there is a very wide range of values that could be plausible for any given time in the future.

  3. It is difficult to validate climate models

Weather is unpredictable and thus we do not expect climate models to be able to reproduce the random weather that has occurred historically. Instead we expect climate models to be able to reproduce the underlying climate change signal that has occurred historically. Unfortunately, our observations over the past 50 to 100 years contain both random weather and a climate change signal but we don’t necessarily know which one is which.

For example, let's say that we observe an increase in drought at some location over the past 20 years. Then we run a set of climate models with historical increases in greenhouse gas concentrations and we see that the models do not suggest that there should have been an increase in drought in this location. This could mean one of two things: 1) the models are deficient and need to be improved before they can simulate this increase in drought, or 2) the models are correct and the observed increase in drought was just due to random weather. It is very difficult to know which is the case, and that makes it very difficult to assess how well the models are doing.

Since it is so hard to validate the climate models, it is difficult to know how confident we should be in their projections. Nevertheless their output is taken seriously because it probably represents our best quantitative estimate of the future based on our current knowledge of the physical workings of the climate system.

  4. Climate models are better at simulating larger spatial/temporal scales than smaller spatial/temporal scales

One problem with climate models is that their spatial and temporal resolution is too coarse to simulate many aspects of the climate that are relevant to society. For example, tornadoes are much smaller than the spatial grid (Figure 1) and much more short-lived than the time step (~3 hours) of a climate model so they cannot possibly be simulated. Even events such as floods tend to be caused by thunderstorms that are smaller-scale than the typical model’s resolution and thus it is difficult for models to correctly simulate the statistics of flooding events.


Global Warming and unforced variability: Clarifications on our recent study


This post appeared at Real Climate

We recently published a study in Scientific Reports titled “Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise”. Our study seemed to generate a lot of interest and we have received many inquiries regarding its findings. We were pleased with some of the coverage of our study (e.g., here) but we were disappointed that some outlets published particularly misleading articles (e.g., here, here, and here). Because there appears to be some confusion regarding our study's findings, we would like to use this forum to clarify some points.

Our study is mainly about natural unforced (i.e., internally generated) variability in global mean surface temperature. This is the type of variability that comes from natural interactions between the ocean and the atmosphere (i.e., that due to phenomena like the El-Nino/Southern Oscillation or perhaps the Atlantic Multidecadal Oscillation). This is in contrast to externally forced variability in global mean surface temperature, which arises due to changes in atmospheric greenhouse gasses, aerosols, solar irradiance, etc. Most previous estimates of the magnitude of unforced variability have come from physical climate models. In our study we created an alternative statistical estimate of unforced variability that was derived from reconstructed and instrumental surface temperature records. We then used this new estimate of unforced variability to aid in our interpretation of observed global mean temperature variability since 1900.

We found that unforced variability is large enough so that it could have accounted for multidecadal changes in the rate-of-increase of global average surface temperature over the 20th century. However, our estimate of unforced variability was NOT large enough to account for the total warming observed over the 20th century. Therefore, our results confirm that positive radiative forcings (e.g., from human-caused increases in greenhouse gas concentrations) are necessary in order for the Earth to have warmed as much as it did over the 20th century.

We also found that over the most recent decade-or-so, it is unlikely that the underlying global warming signal (the externally forced component of temperature change) has been increasing at a rate characteristic of the worst-case IPCC emissions scenario (more on this below).

This last finding is what generated most of the attention for our article, but it would appear that the finding has been largely misinterpreted. Part of the confusion stems from the Duke University press release, which used the headline “Global Warming More Moderate Than Worst-Case Models”. The news department created this headline as a replacement for our suggested headline of “Global Warming Progressing at Moderate Rate, Empirical Data Suggest”. The news department wanted a shorter headline that was easier for the public to understand. Unfortunately, the simplification led many to believe that our study made a forecast for moderate global warming in the future, when in fact our conclusion only applied to the recent past.

Below are some clarifications on specific questions that we have received:

Question: What does your study conclude about Climate Sensitivity (e.g., how much warming we expect for a given change in greenhouse gasses)?

Answer: Nothing. Our study was not concerned with assessing Climate Sensitivity and we have no particular reason to doubt the assessments of Climate Sensitivity from the IPCC.

Question: Does your study show that the climate models used by the IPCC are useless?

Answer: No. Results from our previous study indicated that the magnitude of unforced variability simulated by climate models may be underestimated on decadal and longer timescales and our new estimate of unforced variability largely supports this conclusion. However, our new statistical estimate of unforced variability is not radically different from that simulated by climate models and for the most part we find that climate models seem to get the ‘big picture’ correct.

Question: Does your study indicate that the warming from the 1970s to the present may be natural rather than human caused?

Answer: Our study is not explicitly an attribution study and we do not attempt to quantify the anthropogenic contribution to warming over the past ~40 years. However, we were interested in how much influence unforced variability might have had on changes in the rate of warming over the instrumental period. Specifically, we wanted to know if the real climate system is more like panel-a or panel-b below.

[Figure: schematic comparing small unforced variability (panel a) with large unforced variability (panel b) about the forced warming signal]

In panel-a, the magnitude of unforced variability is small (represented by the narrow range between the blue lines), thus changes in the multidecadal rate of warming would necessarily be due to corresponding changes in the externally forced component of warming. In panel-b the magnitude of unforced variability is large (wide range between the blue lines) and thus changes in the multidecadal rate of warming could come about due to unforced variability.

The results of our study indicate that multidecadal changes in the rate of global warming can indeed come about due to unforced variability and thus the climate system may be more like panel-b than panel-a. This means that the accelerated warming over the last quarter of the 20th century does not ipso facto require an acceleration in the forced component of warming. Instead, this accelerated warming could have come about due to a combination of anthropogenic forcing and unforced variability. This interpretation of the temperature record is consistent with the results of several recent studies.

Question: Does your study rule-out the rate of warming associated with the IPCC’s RCP 8.5 emissions scenario?

Answer: No. We used the multi-model mean warming associated with the RCP 8.5 emissions scenario (out to 2050) as a representation of the quickest rate of forced warming that could conceivably be occurring currently. We then asked the question “how likely is it that the forced component of global warming has been this steep, given recent temperature trends?” We found that it was not very likely that one would observe an 11-year warming hiatus (2002-2013 in GISTEMP) if the underlying forced warming signal was progressing at a rate characteristic of the RCP 8.5 scenario. Since the mean radiative forcing progression in RCP 8.5 is likely steeper than the radiative forcing progression of the recent past, this finding cannot be used to suggest that models are overestimating the response to forcings, and it cannot be used to infer anything about future rates of warming.
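
For readers who want a feel for the logic of this kind of test, here is a toy version. It is emphatically not the paper's actual procedure (which uses an empirical estimate of unforced noise rather than the simple white noise assumed here); it just asks how often random variability would produce an 11-year period with no warming if the forced signal were rising steeply. All parameter values are illustrative.

```python
# Toy illustration: how often does noise alone produce an 11-year flat/negative trend
# on top of an assumed steep forced warming signal?  Not the paper's method.
import numpy as np

rng = np.random.default_rng(3)
forced_trend = 0.025        # assumed steep forced warming (deg C/yr), illustrative
noise_sd = 0.10             # assumed year-to-year unforced noise (deg C), illustrative
n_years = 11
n_sims = 100_000

years = np.arange(n_years)
sims = forced_trend * years + rng.normal(0, noise_sd, size=(n_sims, n_years))

# least-squares trend of each simulated 11-year segment
trends = np.polyfit(years, sims.T, 1)[0]

print(f"Fraction of simulated 11-year periods with zero or negative trend: "
      f"{(trends <= 0).mean():.4f}")
```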

We would invite all interested people to read the full paper (it is Open Access) for a more complete explanation and discussion of this complicated subject.


Research highlighted in the media


Our paper, “Regions of significant influence on unforced global mean surface air temperature variability in climate models”, has been highlighted by a number of media outlets:

sciencedaily

carbonbrief

scienceworldreport

duke

reportingclimatescience

phys

spacedaily

green-energy-news

rdmag

astrobiologymagazie

climatewire

Neue Zuercher Zeitung (Swiss newspaper)


Top-of-atmosphere contribution to unforced variability in global temperature

(The following was originally posted at Climate Lab Book)

As the attention received by the ‘global warming hiatus’ demonstrates, global mean surface temperature (T) variability on decadal timescales is of great interest to both the general public and to scientists. Here, I will discuss a recently published paper by my coauthors and me (Brown et al., 2014) that attempts to contribute to this scientific discussion by investigating the impact of unforced (internal) changes in the earth's top-of-atmosphere (TOA) energy budget on decadal T variability.

Figure 1 illustrates a very simple (but hopefully still useful) way to think about T change. In this model, T changes as a result of an energy imbalance (Qnet) on the system composed of the land, atmosphere and ocean’s mixed layer. Specifically, T change results from an energy imbalance at the TOA (QTOA, which is the difference between Reflected Shortwave Radiation (RSW) plus Outgoing Longwave Radiation (OLR) and Incoming Shortwave Radiation (ISW)) and/or at the bottom of the ocean’s mixed layer (QBML, positive up). Obviously there has been a great deal of research on T change associated with externally forced changes in QTOA (Category A in Figure 1; Myhre et al., 2013). Also, there has been quite a bit of research recently on decadal T change resulting from unforced variability in the exchange of heat between the ocean’s mixed layer and the ocean below the mixed layer (Category C in Figure 1; England et al., 2014; Balmaseda et al., 2013; Meehl et al., 2013; Trenberth and Fasullo, 2013). In our reading of the literature, however, less attention has been given to decadal T change associated with unforced changes in the TOA energy budget (Category B in Figure 1). This is the topic of Brown et al., 2014.

I should note here that Spencer and Braswell, 2008 stressed the importance of non-feedback TOA radiation variability on T change, but that this is a slightly different focus than our study because we were not concerned with distinguishing between feedback-related and non-feedback-related TOA imbalances. The study that is most closely related to our own is probably Palmer and McNeall, 2014, and many of our results are complementary to theirs.


Figure 1. Simple energy balance model of the primary causes of global mean surface temperature change. In this model, T change results from a net energy imbalance (Qnet) on the land/atmosphere/ocean-mixed-layer system (where Cm is the effective heat capacity of that system). Brown et al., 2014 is concerned with T variability due to Category B. Note that Category B variability may itself be a feedback (rather than a 1st cause) of T change. Variations of this class of energy balance model have been used in many climate change studies (Baker and Roe, 2009; Dickinson, 1981; Geoffroy et al., 2012; Held et al., 2010; Wigley and Raper, 1990; Wigley and Schlesinger, 1985).
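
Under the sign conventions stated above (QTOA and QBML both defined as positive upward), one way to write the balance sketched in Figure 1 is the following; the paper may use an equivalent but differently arranged form.

```latex
% A minimal form of the energy balance in Figure 1 under the stated sign conventions
% (Q_TOA and Q_BML both positive upward); an assumption-laden sketch, not a quote
% from the paper.
\[
  C_m \frac{dT}{dt} \;=\; Q_{net} \;=\; -\,Q_{TOA} \;+\; Q_{BML},
  \qquad Q_{TOA} \;=\; \mathrm{RSW} + \mathrm{OLR} - \mathrm{ISW}
\]
```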

We investigated unforced control runs from the CMIP5 archive and looked at what the TOA energy budget was doing during decades when the models spontaneously simulated large changes in T (since these were unforced runs, there was no Category A variability, by definition). We found that unforced, decadal changes in T tend to be enhanced by TOA energy imbalances. In other words, during large-magnitude warming decades, the net flux at the TOA tended to be into the climate system and during large-magnitude cooling decades the net flux at the TOA tended to be out of the climate system. How much did the TOA energy imbalances affect temperature? We used published effective heat capacities of many CMIP5 models (Cm in Figure 1) to estimate that the TOA flux during these decades was responsible for approximately half of the change in T on average (the other half would be due to QBML).
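
The flavor of that bookkeeping can be illustrated with rough numbers: a sustained TOA imbalance, divided by an effective heat capacity, gives the temperature change it can account for. Both values below are placeholders chosen for illustration, not numbers from the paper.

```python
# Rough sketch of the heat-capacity bookkeeping: given a decade-mean TOA imbalance and
# an effective heat capacity for the land/atmosphere/mixed-layer system, how much
# surface temperature change would that imbalance account for?  Illustrative values only.
SECONDS_PER_DECADE = 10 * 365.25 * 24 * 3600

q_toa_mean = -0.15   # decade-mean upward TOA flux (W m-2); negative means energy is gained
c_m = 2.1e8          # assumed effective heat capacity (J m-2 K-1), roughly a 75 m mixed layer

delta_T_from_toa = -q_toa_mean * SECONDS_PER_DECADE / c_m
print(f"T change attributable to the TOA imbalance: {delta_T_from_toa:.2f} K per decade")
```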

It may seem counterintuitive that the net flux at the TOA would enhance (rather than damp) T change, because the Stefan–Boltzmann law leads us to expect that as T decreases (increases), the outgoing longwave radiation emitted to space should decrease (increase) in proportion to the fourth power of temperature, pushing the TOA budget back toward balance.
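For scale, a simple linearization of the Stefan–Boltzmann relation (a textbook back-of-the-envelope estimate, not a calculation from the paper) gives the approximate size of this expected damping:

# Back-of-the-envelope Planck response: linearize OLR = sigma * T_e**4 around an
# assumed effective emission temperature of ~255 K (an illustrative textbook value).
sigma = 5.67e-8            # Stefan-Boltzmann constant, W m-2 K-4
T_e = 255.0                # assumed effective emission temperature, K

dOLR_dT = 4.0 * sigma * T_e**3
print(f"Expected OLR response: ~{dOLR_dT:.1f} W m-2 per K of cooling or warming")

At roughly 3 to 4 W m-2 per K, this response would quickly pull the TOA budget back toward balance if nothing else changed, which is why the persistence of the imbalance in the model composites (Figure 2) calls for an explanation.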

Figure 2 shows the temporal (Figure 2a) and spatial (Figure 2b) variability of temperature and energy flux variables over large-magnitude cooling decades. It can be seen that outgoing longwave radiation does in fact decrease during cooling decades but that the net TOA energy flux remains positive (out of the climate system) for most of the decade because reflected shortwave radiation tends to increase by roughly the same amount as outgoing longwave radiation decreases (Figure 2a). In other words, it appears that changes in the climate system’s albedo are able to temporarily counteract the changes in outgoing longwave radiation and thus sustain the net TOA imbalance for a longer period of time than might be expected otherwise. We find that these changes in albedo appear to be associated with changes in the state of the Interdecadal Pacific Oscillation (at least in the multi-model mean pattern). In particular, reflected shortwave radiation over the equatorial and eastern Pacific tends to increase as surface temperatures over that region decrease (Figure 2b).
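Stated in terms of the QTOA definition above, and assuming incoming shortwave changes little over these decades, the compensation seen in Figure 2a amounts to:

ΔQ_TOA ≈ ΔRSW + ΔOLR ≈ 0,   since   ΔRSW ≈ -ΔOLR

so the albedo increase roughly cancels the drop in OLR and the upward TOA imbalance persists rather than decaying.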

Figure_2

Figure 2. a) Composite time series of temperature and energy flux variables over large-magnitude cooling decades (OLR: outgoing longwave radiation, RSW: reflected shortwave radiation, QTOA: upward-oriented net TOA energy flux, T: global surface temperature, Nino3 T: surface air temperature over the Nino3 region, sfc net up: upward-oriented net surface energy flux). The shading denotes the standard deviation of each value across all decades investigated (the two largest-magnitude cooling decades from each of 36 CMIP5 control runs). b) Mean rates-of-change and anomalies of energy flux variables and surface temperatures over the same cooling decades shown in (a). Stippling delineates the grid points where over 75% of the decades experienced the same-signed value.

Something interesting about these model-based findings is that they appear to contradict observations over the past 10-15 years. In particular, it has been suggested that we are currently in an unforced cooling situation analogous to that illustrated in Figure 2 (e.g., England et al., 2014; Trenberth and Fasullo, 2013). Our findings suggest that we should expect this unforced cooling to be enhanced by the net energy imbalance at the TOA (i.e., there should have been a decrease in the rate of climate system heat uptake over this period). However, our best inventories of total climate system heat content indicate that just the opposite has occurred: as T has been in an unforced cooling state, the rate of climate system heat uptake has increased (Trenberth and Fasullo, 2013; Balmaseda et al., 2013). This is not what the CMIP5 models typically do, but it does happen (~13% of the cooling decades investigated were associated with net positive climate system heat uptake). In these less-common decades, the export of heat below the mixed layer (QBML) is large enough to overcome the gain in total climate system energy and cause T cooling. This does seem consistent with the recent finding that an increase in Pacific trade wind strength has increased the rate of heat storage below the mixed layer (i.e., a stronger downward QBML flux) to unprecedented levels (England et al., 2014).
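To illustrate the energy bookkeeping in those less-common decades, here is a worked example with hypothetical fluxes (the numbers are invented for illustration; the sign conventions follow Figure 1):

# Hypothetical cooling decade in which the total climate system still gains energy.
# Q_TOA and Q_BML are positive upward, as in Figure 1.
q_toa = -0.5    # W m-2: negative, so 0.5 W m-2 is entering the system at the TOA
q_bml = -0.8    # W m-2: 0.8 W m-2 is being exported below the mixed layer

q_net_mixed_layer = q_bml - q_toa   # energy tendency of the land/atmosphere/mixed-layer system
system_heat_uptake = -q_toa         # energy tendency of the climate system as a whole

print(f"Mixed-layer system: {q_net_mixed_layer:+.1f} W m-2 -> surface T cools")
print(f"Whole climate system: {system_heat_uptake:+.1f} W m-2 -> total heat content rises")

In this situation the downward flux at the base of the mixed layer is large enough to cool the surface even though the system as a whole is taking up heat, which is the behavior seen in roughly 13% of the simulated cooling decades.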

This study may raise more questions than it immediately answers and we hope to learn a great deal more as we dig into the results further.

Posted in Climate Change | Leave a comment

Making a case vs. analyzing data in the climate change debate

Nate Silver recently re-launched a greatly expanded version of his fivethirtyeight blog to much fanfare. The site’s goal is to tackle a variety of questions with hard data analysis in an effort to elucidate truths that are often obscured by opinion journalism.

In its first week, fivethirtyeight published a piece by Roger Pielke Jr. (an environmental studies professor who focuses on climate impacts) arguing that climate change is not causing increased economic losses. The essential argument was that disaster losses are increasing if you look at the raw data, but the upward trend disappears once you correct for the fact that GDP is also increasing. In other words, total disaster losses have grown because we are getting richer, not because climate-related disasters have increased (this argument is based on Pielke’s own research).

This was a conclusion that stirred up quite a bit of controversy, and Nate Silver stated in an interview on the Daily Show that, because of the reaction (much of it negative), fivethirtyeight would post a rebuttal to Pielke’s article. The rebuttal was written by John Abraham, a professor of engineering and climate change blogger. Apparently, however, fivethirtyeight changed its mind when it received the rebuttal, so Abraham published it at the Huffington Post instead.

I think Abraham’s rebuttal is disappointing because it amounts to ‘making a case’ rather than analyzing data. The goal of fivethirtyeight, and of science in general, should be to dig deep into the data, question methods and assumptions, and attempt to get at the truth in a complicated world. Pielke’s original claim (that economic disaster losses are increasing because of increased wealth rather than increased disasters) seems well framed to be debated in the context of math and data analysis. I think it would be very interesting to have this debate. Abraham’s rebuttal, however, does not even attempt to do this: it presents no data analysis at all and does not directly refute Pielke’s original claim. Instead, it takes the form of a ‘persuasive essay’ in which Abraham aggregates as many studies as he can that support the general narrative that climate change is increasing disasters.

This is the strategy of political pundits and ideologues, and as scientists we should resist the temptation to use it. The problem with this strategy is that creating a well-cited narrative in support of your scientific view can be easy to do no matter what your view is. Take the following example:

Let’s say that you want to paint the picture that global warming is drastically increasing hurricane activity:

Hurricanes derive their power from warm sea surface temperatures (Emanuel et al., 2005), and the oceans have been warming because of human greenhouse gas emissions (Abraham et al., 2013). Accordingly, there was a dramatic upswing in hurricane activity at the end of the 20th century (Holland and Webster, 2007). Hurricane tracks will continue to move northward (Graff and LaCasce, 2014), which makes events like Superstorm Sandy more likely. Basic physics predicts that as we continue to increase greenhouse gases, hurricanes will get stronger (Emanuel, 1987), and climate models confirm this (Knutson and Tuleya, 2004). Additionally, hurricane damage will only be exacerbated by sea level rise (Woodruff et al., 2013).

On the other hand, let’s say that you want to paint the picture that global warming’s impact on hurricane activity is small or negative:

There has been no detectable trend in hurricane frequency over the twentieth century when you account for increased observational capabilities through time (Landsea, 2007). There is no straightforward connection between hurricane strength and sea surface temperatures (Swanson, 2008), and when we look at past records, hurricanes vary much more coherently with natural climate oscillations than with increasing greenhouse gases (Chylek and Lesins, 2008). Climate models predict that, in the future, increased wind shear (Vecchi and Soden, 2007) will reduce hurricane frequency (Knutson et al., 2008). Finally, Superstorm Sandy had a trajectory that will become less likely under global warming (Barnes et al., 2013).

Notice how both paragraphs are supported with citations to peer-reviewed research, and thus each may seem quite authoritative on its own. The problem with this type of writing (and thinking) is that it sacrifices actual data analysis in favor of making the ‘most convincing case possible’. If we want to get closer to the truth, we have to abandon this style of thinking in favor of careful analysis. We have to dig into the numbers, question the assumptions, and understand why there are conflicting results in the literature in the first place.

Posted in Climate Change | Leave a comment