Attribution Science, Extreme Weather and Why They Matter

October 2, 2014

We have a love-hate relationship with the word “new.” It’s used far too often, but without it as a preface, some of us don’t bother to pay attention to what’s coming next.

A “new task” scientists have grappled with in the last few years is trying to figure out how extreme weather is being affected by climate change.

Just how tough, and important, those efforts are turning out to be was outlined in a report published earlier this week in the Bulletin of the American Meteorological Society by scientists mostly from the National Oceanic and Atmospheric Administration.

The report’s title is “Explaining Extreme Events of 2013.” Then, as a sort of afterthought: “From a Climate Perspective.”

Sixteen extreme weather events that occurred around the globe in 2013 were selected and 22 specific studies of those events by different teams of scientists were examined.

The report implicates climate change in several of the world’s most extreme weather events last year, including heat waves in Australia, Europe, China, Japan and Korea.

But beyond those findings there’s an underlying point to be taken from the report. Attribution science has a long way to go before it can determine just how much extreme weather is being affected by climate change.

For the unfamiliar, attribution science may be best explained as that gray area that lies between the climate deniers and Al Gore.

“The science remains challenging, but the environmental intelligence it yields for decision makers, we think, is invaluable because one guide to the future is what we understand about the past,” Thomas Karl, report coauthor and director of NOAA’s National Climatic Data Center, said during a conference about the report. “There is a lot of demand out there so we think this is an important activity and we hope to continue it into the future.”

The report’s findings indicate that human-caused climate change “greatly increased the risk for the extreme heat waves,” but how human influence has affected other perils, like droughts, heavy rain events, and storms, was not so clear.

Natural variability likely played a much larger role in those latter extremes, the report’s authors assert.

To help us better understand attribution science, report coauthor Thomas Peterson, president of the World Meteorological Organization’s Commission for Climatology, spoke to Insurance Journal.

He offered the example of a baseball player on steroids. Say the player noticeably bulks up and hits 20 percent more home runs during a particular period in his career. Then say you are in the stands, and you happen to catch one of his home run baseballs during that time.

Peterson asked: “Did you catch the ball because the baseball player was on steroids?”

The same type of query can be made for climate change.

“It’s a difficult thing to pull out,” Peterson said.

Peterson and his fellow authors hope the report fosters greater interest in attribution science, he said.

With more people and resources involved in attribution science, more data – and more importantly higher quality data – can be generated, their reasoning goes.

The report covered five perils: heat, cold, heavy precipitation, drought and storms.

Most of the studies cited in the report agreed that heat events were the easiest to attribute to climate change, with long-duration heat waves in the summer and prevailing warmth for annual conditions becoming as much as 10 times more likely “due to the cumulative effects of human-induced climate change.”
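To make “10 times more likely” concrete: attribution studies typically compare an event’s estimated probability in today’s climate with its probability in a simulated climate without human influence. A rough sketch of that risk-ratio arithmetic, using made-up probabilities rather than figures from the report, might look like this:

```python
# Illustrative only: hypothetical probabilities, not values from the BAMS report.
# Attribution studies often express results as a risk ratio (how many times more
# likely an event has become) or a fraction of attributable risk (FAR).

p_natural = 0.005   # assumed yearly chance of the heat event without human influence
p_current = 0.05    # assumed yearly chance in today's climate

risk_ratio = p_current / p_natural   # "10 times more likely"
far = 1 - p_natural / p_current      # fraction of attributable risk

print(f"Risk ratio: {risk_ratio:.0f}x more likely")
print(f"Fraction of attributable risk: {far:.0%}")
```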

Storms were the least likely to be tied to climate change: “There was no clear evidence for human influence on any of the three very intense storms examined, which included a surprising winter-like storm during autumn in the Pyrenees, an extreme blizzard across the U.S. High Plains, and Cyclone ‘Christian’ that delivered damaging winds across northern Germany and southern Denmark.”

Drought, like California’s, was a mixed bag. One study found the state’s drought to be driven by climate change while another study showed the opposite.

Peterson explained that all analyses aren’t the same.

“Science isn’t monolithic,” he said. “This isn’t engineering here. You can’t come up with exactly the same result each time.”

That a particular study found no signal of global warming behind an event could mean the authors were looking at different variables, or that the data records were too short for a good comparison.

The lack of good historical data gets Tom Larsen’s goat.

Larsen, senior vice president and product architect at catastrophe modeling firm CoreLogic EQECAT, recently indicated that the world is entering a “new normal” in terms of extreme weather risks.

Larsen made the statement at the annual Chartered Property Casualty Underwriter Society meeting in Anaheim, Calif., last week as he and a panel of cat modeling experts talked about extreme weather and how historical models alone may become a thing of the past.

At the meeting Larsen called for a shift from historic modeling to more predictive risk assessment methods. Many in Larsen’s field now include conditional frequency models based on assumptions about ensuing weather patterns, he said, adding that such moves are leading to the adoption of tools like Tail Value-at-Risk (TVaR), which quantifies the expected loss beyond a given probability level.
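As a rough illustration of the concept, rather than any vendor’s actual methodology, Tail Value-at-Risk can be estimated from a set of simulated annual losses by averaging the losses that fall beyond the chosen probability level:

```python
# Minimal sketch of Tail Value-at-Risk (TVaR) from simulated losses.
# The loss figures are invented for illustration; real cat models produce
# hundreds of thousands of simulated years.
import random

random.seed(1)
# Hypothetical simulated annual portfolio losses, in millions of dollars.
annual_losses = sorted(random.expovariate(1 / 20.0) for _ in range(10_000))

def tvar(losses, level=0.99):
    """Average of the losses at or beyond the `level` quantile (e.g., the worst 1%)."""
    cutoff = int(level * len(losses))
    tail = losses[cutoff:]
    return sum(tail) / len(tail)

var_99 = annual_losses[int(0.99 * len(annual_losses))]  # 99th-percentile loss (VaR)
print(f"99% VaR:  ${var_99:,.1f}M")
print(f"99% TVaR: ${tvar(annual_losses, 0.99):,.1f}M")   # expected loss given the worst 1%
```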

Speaking today about the release of the Explaining Extreme Events of 2013 report, Larsen emphasized how scant historical records are for extreme weather.

“We really only have about 100 to 150 years of good hurricane history,” he said, adding “and we can go through all perils.”

More and more of CoreLogic’s insurer and reinsurer customers are asking for more models, and for better information on global weather catastrophe risk, than what has traditionally been offered, Larsen said.

“We are going to be offering more conditional models with these climate conditions included,” Larsen said.

Typically, when a modeler delivers what’s known as an ensemble model, it includes all possible events that could occur – Category 1-5 hurricanes spanning Texas to Maine, for example – along with how often they may occur and the probability of occurrence.

Those models tend to be attuned to what’s happened in the past and usually mirror historical models. But a conditional model proposes, for example, doubling the frequency of a particular event so someone can calculate how much damage would occur to their portfolio.

“It’s not what you’re going to see, it’s what you think you’re going to see,” Larsen said. “Like a Cat 4 running through downtown Houston and the likelihood of that.”
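A toy version of that “what if” exercise, with a hypothetical three-event set and invented frequencies rather than an actual vendor model, shows how doubling the assumed frequency of one event changes a portfolio’s expected annual loss:

```python
# Toy event-set model: each entry is (event description, annual frequency,
# expected portfolio loss in $ millions if it occurs). All numbers invented.
event_set = [
    ("Cat 1 hurricane, Gulf Coast", 0.20, 50),
    ("Cat 3 hurricane, Gulf Coast", 0.05, 400),
    ("Cat 4 hurricane, downtown Houston", 0.01, 2_000),
]

def expected_annual_loss(events):
    return sum(freq * loss for _, freq, loss in events)

baseline = expected_annual_loss(event_set)

# Conditional view: assume climate conditions double the Cat 4 frequency.
conditional = [
    (name, freq * 2 if "Cat 4" in name else freq, loss)
    for name, freq, loss in event_set
]

print(f"Baseline expected annual loss:    ${baseline:,.0f}M")
print(f"Conditional expected annual loss: ${expected_annual_loss(conditional):,.0f}M")
```

Real event sets contain many thousands of simulated events, but the mechanics are the same: adjust the assumed frequencies to reflect a climate condition, then recompute the portfolio’s loss metrics.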