A Tale of Two Californias: Managing Wildfire Risk in the Year 2030

February 6, 2020 by Chris Folkman

For most of the 20th century, the insurance industry considered wildfires to be little more than a benign nuisance. They occurred frequently but rarely resulted in more than a handful of claims, and underwriters priced for them in the same way as other attritional sources of loss like theft, breakage and sewer back-up.

Then in 1991, everything changed.

The 1991 Oakland Hills Fire ($4 billion insured loss) provided the first glimpse of the horrors that a wildfire could bring, killing 25 people and destroying 2,900 structures in less than a day.

Wildfire losses accelerated over the next 25 years with the Cedar, Old, Witch, Butte, and Valley fires collectively destroying 8,300 structures and resulting in more than $5.7 billion in insured damages.

And then came the 2017 and 2018 seasons.

Words fail to describe the extent of the devastation during this period, when 11 huge fires wiped out entire neighborhoods and cities. In the end, the toll of these two seasons stood at 135 deaths and more than $30 billion in insured damage suffered in all corners of California, from the mountain cities of the Northern Sierras, to the rolling hills of Wine Country, to the seaside towns of the Central and Southern coasts.

Now, everything has changed again.

With a warming climate, a growing population in high risk areas, and an overgrowth of burnable vegetation, wildfires aren’t going away — and we must adapt to this reality. There is a monumental amount of work to be done to improve our wildfire safety, data and analytics, emergency response, and community resilience. We must begin now.

Ten years from now, in 2030, California faces one of two futures. From the perspective of an analytics provider to the insurance industry, I present both.

The first is what I’ll call the Good California, where we rise to meet the challenges that face us today.

In this version of the future, insurers have fundamentally changed their approach to underwriting wildfire risk.

Homeowners are given premium credits for safe behaviors like clearing vegetation, cleaning gutters and screening vents (as well as penalties for the opposite). These credits and penalties are enabled by two technologies: first, computer vision algorithms that read satellite and aircraft imagery to detect these safe behaviors; second, catastrophe models which quantify their impact.
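
To make the mechanics concrete, here is a minimal Python sketch of how such credits and penalties might be applied once the imagery layer has flagged each behavior. The feature names, factor values, and base premium are illustrative assumptions, not any carrier’s actual rating plan; in practice the cat model, not a flat lookup table, would size each credit.

```python
# Illustrative credit/penalty schedule per detected behavior. A positive
# value is a discount; the behavior's absence draws the mirror-image
# penalty. These names and numbers are hypothetical.
MITIGATION_FACTORS = {
    "vegetation_cleared": 0.10,  # defensible space maintained
    "gutters_clean": 0.03,
    "vents_screened": 0.05,
}


def premium_adjustment(detected: dict) -> float:
    """Net rate adjustment from imagery-detected property features."""
    adjustment = 0.0
    for feature, credit in MITIGATION_FACTORS.items():
        adjustment += credit if detected.get(feature, False) else -credit
    return adjustment


def adjusted_premium(base_premium: float, detected: dict) -> float:
    """Apply the net credit/penalty to a cat-model-derived base premium."""
    return base_premium * (1.0 - premium_adjustment(detected))


# Example: cleared vegetation and screened vents, but clogged gutters.
features = {"vegetation_cleared": True, "gutters_clean": False, "vents_screened": True}
print(adjusted_premium(2400.0, features))  # 2400 * (1 - 0.12) = 2112.0
```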

In the good California, property data is abundant and accurate, enabling real-time underwriting decisions for most submissions. Wildfire is treated as a peak peril, and its cat modeling is done with the same scientific rigor as hurricane, earthquake and flood.

The impact of climate change – and its statistical uncertainty – is baked into the cat models so we can better understand loss outcomes as the peril evolves into the future. The models themselves are intensively validated by teams of experts and regulators. And the California Department of Insurance fully embraces the usage of data, analytics and cat models to support rate filings.
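
As a rough illustration of what “baking in” climate uncertainty can mean, the Python sketch below blends simulated annual losses across weighted warming scenarios. The scenario labels, weights, frequency multipliers, and severity parameters are all invented for illustration; a production cat model would derive them from climate and vegetation science rather than hard-coded guesses.

```python
import random

# Hypothetical climate scenarios: (label, probability weight, multiplier
# on baseline wildfire event frequency). All values are illustrative.
SCENARIOS = [
    ("low_warming", 0.3, 1.0),
    ("mid_warming", 0.5, 1.3),
    ("high_warming", 0.2, 1.7),
]


def simulate_annual_loss(freq_multiplier: float, rng: random.Random) -> float:
    """One simulated year: a crude event count times heavy-tailed severities."""
    n_events = sum(1 for _ in range(10) if rng.random() < 0.1 * freq_multiplier)
    return sum(rng.lognormvariate(15.0, 1.5) for _ in range(n_events))


def climate_blended_losses(n_years: int = 100_000, seed: int = 42) -> list:
    """Sample simulated years across scenarios in proportion to their weights."""
    rng = random.Random(seed)
    _, weights, multipliers = zip(*SCENARIOS)
    losses = []
    for _ in range(n_years):
        mult = rng.choices(multipliers, weights=weights, k=1)[0]
        losses.append(simulate_annual_loss(mult, rng))
    return losses


losses = sorted(climate_blended_losses(), reverse=True)
print(f"Mean annual loss:   ${sum(losses) / len(losses):,.0f}")
print(f"1-in-100-year loss: ${losses[len(losses) // 100]:,.0f}")
```

The point of the blend is that the tail metrics a regulator or reinsurer sees already reflect the spread of plausible climates, not a single deterministic future.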

The fire insurance market is competitive and well capitalized in the Good California. Excess and surplus lines and the California FAIR Plan play balanced roles as capacity providers for high risk homes. Homeowners are hyper-aware of fire risk, and they contribute to improving wildfire safety in partnership with the government and the insurance industry.

The Bad California, as I’ll call it, is what happens when these things fail to occur.

In this version of California, insurers continue to use traditional mapping and hazard-scoring tools to underwrite wildfire risk, incorrectly identifying high risk areas as low risk, and vice versa. Hamstrung by regulations that do not fully recognize the power of models and data in predicting risk, many insurers decline to write entire ZIP codes.

They apply “broad-brush” underwriting rules that do not adequately distinguish safe properties from dangerous ones. Fire analytics and data have advanced, but they have not been fully integrated into insurers’ operations across underwriting, portfolio management, and risk transfer.

Because of this, the fire insurance market in the Bad California is a bifurcated one, with admitted carriers avoiding fire-prone areas and E&S players stepping in with opportunistic, exorbitantly priced offerings. Fire insurance becomes largely unaffordable to the working poor in small, high risk mountain communities. For those who can afford it, it is often one of their biggest household expenditures.

Insurers’ understanding of wildfire risk trails behind their technical expertise in other peak perils such as earthquake and hurricane, and because of this, market capacity remains an ongoing challenge.

Insurers face many critical decisions over the next several years about how they understand, manage and underwrite wildfire risk.

But if history is any indication, we are up to the task. The industry faced a similar crisis after six hurricanes pummeled Florida during the 2004 and 2005 seasons; since then hurricane risk analytics and mitigation have improved substantially.

The same can be said about earthquake analytics in the wake of the Loma Prieta Earthquake (1989), terrorism modeling after the 9/11 attacks (2001), storm surge modeling following Hurricane Katrina (2005), and tsunami risk quantification after the Tohoku Earthquake (2011).

Today we must confront the difficulties of wildfire. I believe that with the right mix of legislation, community planning, and data analytics, we can make California a safer and more resilient place where fire insurance is affordably purchased and equitably priced.

I look forward to facing this challenge.

Folkman is senior director of model product management at RMS. Email: chris.folkman@rms.com; Phone: (510) 505-2500.
