Tuesday, May 16, 2017

SINKING CITIES: Not Just at Sea Level Anymore

When the Mexica people founded the city of Tenochtitlan in the fourteenth century, they built it on an island in the center of an inland lake in the fertile Valley of Mexico. Over seven centuries, the population of the city – now called Mexico City – grew to 2 million by the 1940s, before exploding to over 20 million people today in the city and surrounding metropolitan area. With the population explosion came a surging demand for water, which, since the nineteenth century, has been drawn from wells tapping the aquifer beneath the ancient lake. Today, many families in Mexico City pay dearly for the water that trucks haul from those wells to their neighborhoods. But the entire city pays for that water in another way, as its buildings shift, settle, and sink into the ground.

A House of Cards
The geology of subsidence caused by groundwater extraction is analogous to a house of cards. As this graphic from TRTWorld illustrates, when the ground is full of water, the spaces between the cards are full, the cards are supported, and the city stands on top. Groundwater extracted at a moderate pace is replaced by water filtering back into the aquifer, maintaining stability. But when groundwater extraction increases to rates that lower the water level in the aquifer, the spaces between the cards are emptied, the cards collapse, and the city sinks.

Mexico City is a prime example. With thousands of wells sucking water from beneath it, most of the city sinks at a rate of a few centimeters per year. But some areas in Mexico City have sunk faster. The Mexico City Metropolitan Cathedral has tilted, underground sewers and water pipes have broken, and the magnificent Palacio de Bellas Artes has sunk so far that its original ground floor is now the basement.

Subsidence in Non-Coastal Areas – A Global Phenomenon

The global-warming narrative of rising sea levels, and the threat to coastal cities and their millions of inhabitants, have become familiar. Also familiar is the fact that many of these same cities are sinking into the soft coastal soils they’re built on, creating a double threat. (New Orleans, for instance, was sinking at a rate of about an inch per year in 2006, and its levees were sinking even faster.) Less well known is the phenomenon of non-coastal cities subsiding as extraction of water from the aquifers beneath collapses the geology that supports them. This slow-motion catastrophe (which by geological standards is hurtling along at breakneck speed) threatens the stability of buildings and infrastructure on a sobering scale, and makes these non-coastal cities more vulnerable to the same natural disasters that threaten their coastal cousins.

Mexico’s sprawling capital, which lies a mile and a half above sea level, is not alone. Wherever surface water is scarce, and groundwater is the primary source, subsidence tends to follow. Researchers at Arizona State University have reported that parts of Phoenix, which is more than 1,000 feet above sea level, are sinking due to groundwater extraction. They predict that, as the phenomenon progresses, fissures forming in the ground will threaten canals, utility lines, water mains, storm drains, sewers and building foundations.
German geoscientists studying subsidence in Iran have noted that in Tehran, which is located almost 4,000 feet above sea level, subsidence has caused cracks in buildings, roads, and pipelines. Geotechnical studies have found that Bogotá, Colombia, at 8,700 feet, is sinking about an inch per year due to extraction of groundwater. The phenomenon has been noted in many other non-coastal cities well above sea level – including Delhi, India; São Paulo, Brazil; Las Vegas; and Riyadh, Saudi Arabia – to name a few.

Subsidence Relates to Other Risks

Physical damage to buildings and infrastructure caused by subsidence and earth movement is typically excluded from coverage under first-party property insurance policies. However, insurers should be aware that subsidence can increase the risk of physical damage caused by other perils, and monitoring subsidence trends may be one way for insurers to gauge increasing exposure risks. For instance, scientists studying subsidence in Houston have found that groundwater extraction caused the soils beneath the 6,000-acre Meyerland neighborhood to sink about 18 inches during the 1980s and '90s – more than in the surrounding areas. The storm sewers designed to drain the area sank as well, decreasing their capacity. As a result, heavy rains are more likely to pool in Meyerland, and less likely to drain away without causing flood damage. The emergence of the Meyerland basin shows that insurers who provide coverage for flood damage would be wise to monitor subsidence trends, even if their policies exclude coverage for physical damage caused by subsidence: the presence of the excluded peril may indicate an increased risk of physical damage caused by, or resulting from, a covered peril. Meyerland also provides an example of the complex causation analysis that could arise where subsidence, an excluded peril, arguably causes a covered peril – like flooding – to occur, which in turn results in physical damage.

Insurers who provide earthquake coverage should also keep tabs on subsidence trends as an indicator of potential exposure. The house-of-cards analogy above illustrates how soils saturated with groundwater are inherently more stable. Soils destabilized by groundwater extraction, on the other hand, are more likely to shift – and to shift more dramatically – when an earthquake strikes. That means that for a quake of a given magnitude, the resulting physical damage will likely be greater in areas where subsidence has occurred because subsidence is a manifestation of less stable subsurface geology. Thus, it may be possible to anticipate increases in earthquake-coverage risk exposures by monitoring subsidence caused by extraction of groundwater in earthquake-prone areas.


Broad scientific agreement that groundwater extraction causes subsidence has led to broad agreement on another point: subsidence is here to stay. The only proven way to slow or stop it is to slow or stop groundwater extraction. But population growth in arid urban areas shows no signs of slowing, and demand for water will inevitably increase. As this cycle accelerates, insurers will face increasing – and increasingly uncertain – risks. Monitoring subsidence trends in both coastal and non-coastal urban areas could be highly useful in coping with the increased risk exposures presented by subsidence.

Tuesday, May 9, 2017

First Quarter Numbers for Property Insurers Look Bleak, But is it Too Early to Call it a Trend?

The insurance industry has been able, in the past few years, to avoid a string of major catastrophes like those of the early 2000s.  One might think that property insurers have gotten off easy. Appearances can be deceiving, however, as the industry has been battered by smaller, more frequent, events.  In fact, Fox Business recently reported that the first quarter of 2017 was the most expensive quarter for U.S. property insurers in more than 20 years. The dreary numbers can be attributed to severe weather—namely tornadoes, flooding, hail, and ice storms. As of April 6, 2017, there had already been five weather and climate disaster events in the U.S. with losses exceeding $1 billion each.  The average for the most recent five years (2012–2016) is 10.6 such events per year. In total, it is estimated that insurers have paid approximately $6 billion in weather-related claims so far this year, compared to an estimated $4.5 billion as of the same time last year.

It is also worth noting that the first quarter of the year typically yields fewer catastrophe-related damages because hurricane season does not start until the second quarter. However, there were over 400 tornadoes from January through March 2017, compared to 205 during the same period in 2016. When the final tornado count for March is determined, the month will likely rank among the 10 most active Marches for tornadoes. Moreover, hail continues to have devastating economic effects on both commercial and personal lines.
With more than twice as many tornadoes as in 2016 and an increase in hail claims, insurers are bracing themselves for an increase in weather-related claims as we enter the second quarter.  According to NOAA, the contiguous U.S. had the second-wettest April in its 123-year record.  Moreover, hail is most likely to fall in the U.S. during the months of May and June. With the hail and tornado season underway and the hurricane season around the corner, weather-related claims should continue to roll in during the second quarter.
Yet, it may not be all doom and gloom for property insurers, as forecasters are calling for a below-normal hurricane season this year. In fact, only 11 named storms are predicted for the 2017 Atlantic season. Of these named storms, only four are predicted to become hurricanes, and only two are predicted to be Category 3 or higher (Category 3 hurricanes have sustained winds of at least 111 miles per hour). This is slightly below the 30-year average of 12 named storms and six hurricanes per hurricane season. If these predictions are accurate, insurers will have El Niño to thank, since warmer-than-normal ocean water temperatures in the Pacific Ocean may limit the development of storms in the Atlantic. But as we all know from past hurricanes such as Andrew, Katrina, and Sandy, it only takes one named storm to have a significant economic impact on the industry.
Insurers have no choice but to brave the storms to see how the second- and third-quarter numbers shape up. Mother Nature will always be unpredictable, and she will sometimes be violent. Although storm chasers will keep searching for the next big natural disaster, insurers should keep an eye on the smaller, more frequent, events that can be just as damaging to their loss ratios.  The first-quarter 2017 numbers have already told their cautionary tale.

Published by Jennifer Hoffman

Wednesday, May 3, 2017

The Deepwater Horizon Catastrophe – 7 Years After

I recently worked with a client on a presentation concerning the Deepwater Horizon disaster – an epic tragedy in 2010 that claimed the lives of 11 men, injured 17 others and poured 4.9 million barrels of oil into the Gulf of Mexico. The financial costs to BP have totaled about $60 billion, not counting the long-term loss of more than a quarter of BP's market capitalization. It remains the worst offshore oil spill in U.S. history.

Revisiting the details of what happened, and why, was both tragic and fascinating.  It was a reminder that despite all our technological prowess and cutting-edge this-and-that, the forces of nature are humbling. Even when those forces are in the form of long-dead dinosaurs buried 5 miles under the sea. It is also a reminder that catastrophes like this are almost never the result of a single point of failure: usually multiple safety systems, human and otherwise, fail in an event of this magnitude.

The Deepwater Horizon disaster began on April 20, 2010, at approximately 9:38 pm CDT, when the Macondo well began surging hydrocarbons into the drill riser. The reverse flow of drilling mud, gas and oil started slowly, then erupted into a geyser that ignited and burned for two days, ultimately sinking the Deepwater Horizon.

Drilling into the Past

The task of the Deepwater Horizon, a giant floating ship in the form of a drilling platform, was audacious. Owned by Transocean, the Deepwater Horizon was tasked with drilling the Macondo well, leased by BP. The drilling took place about fifty miles off the coast of Louisiana, at an ocean depth of a mile. But the prize – a vast potential reserve of oil and gas – lay much deeper, about three-and-a-half miles under the sea floor. The operators were drilling four-and-a-half miles beneath them, into an ancient world that they could never see.

The task of drilling the well was complex and tedious. It involved repeated cycles of installing lengths of drill pipe – almost five miles’ worth – to achieve the desired depth. During the drilling process, a combination of seawater and specially formulated drilling “mud” was pumped through the drill bit to remove debris and to keep the well pressure at appropriate levels to prevent hydrocarbons from flowing up into the well bore. Along the way, at various intervals, drilling was stopped and the well bore was “cemented in” with steel casing. The cementing process was critical – if the cement failed, hydrocarbons could leak into the annulus – the space between the drill pipe and the casing – or potentially into the casing itself, leading to a blowout.

A blowout preventer (BOP) supplied by Cameron was thought to be the ultimate failsafe. More than 160 feet high, the BOP sat on the sea floor and had multiple systems intended to seal the well and prevent a blowout. They included an upper annular closure, a lower annular closure and a blind shear ram. The annular closures were designed to seal around the drill pipe. The blind shear ram was a hydraulic ram of last resort – a set of cutting blades designed to cut through the steel drill pipe and seal the well entirely. The BOP even had two “dead man” switches – intended to operate the blind shear ram in the event of a loss of hydraulic power, electricity and communications. All of this technology ultimately failed.

The Blowout

At the time of the disaster, the Macondo well was in the process of being temporarily "abandoned" by the Deepwater Horizon. Having drilled the well, the Deepwater Horizon's final job was to seal it with cement and leave for its next drilling job. When BP was ready to produce oil and gas from the well, it would send in a production platform to reopen the well.

But the Macondo well had been a problem for BP and Transocean. Multiple delays and unforeseen issues had put the project more than 50 days behind schedule – at a cost of $1 million per day to BP. Workers had reportedly named it the “well from hell.” As they were preparing to leave, things got worse.

Because the well was being drilled in relatively weak geological strata, BP and its cementing contractor Halliburton elected to use a lighter "foamed" cement in the lower portions of the well. The final step was to place a plug or cap at the bottom of the well using a stronger, heavier liquid cement. Multiple investigative reports suggest that the foamed cement failed, and that the heavier liquid cement never made it to the bottom to plug the well.

Despite questionable results from a negative pressure test intended to determine whether the well was properly sealed, officials on site decided the well integrity was satisfactory, and operators began the process of removing the heavy drilling mud and replacing it with seawater. At that point, hydrocarbons began flowing up through the casing, through the BOP and up into the riser at the surface.

By the time the crew realized what was happening, the Deepwater Horizon was enveloped in a cloud of combustible hydrocarbons. A methane detection system sounded an alarm, but only shut down some of the potential ignition sources. Gas – odorless and colorless –  entered the air intakes for the massive diesel engines that powered the Deepwater Horizon. This initially caused the engines to overspeed – popping light bulbs and overpowering the electrical system – and then stop. Seconds later, explosions occurred on two of the generators, igniting the firestorm that ensued.

Multiple Points of Failure

Numerous investigations followed. Each of the major investigative reports views the events through a slightly different lens, but fundamentally there is broad agreement on the technical causes. Some of the published reports are highly critical of BP's risk assessment process and decisions made by BP on the basis of cost or time savings. BP's own comprehensive report on the incident identifies eight key findings that caused or contributed to the catastrophe, which are largely echoed elsewhere:
  1. The failure of the cement barrier to isolate the hydrocarbons
  2. The failure of check valves inside the casing to prevent the backflow of hydrocarbons into the well bore
  3. Improper interpretation of a critical negative pressure test, leading to the erroneous conclusion that well integrity had been established
  4. Failure to recognize the influx of hydrocarbons in the well until they reached the riser
  5. Inadequate well control response by the crew
  6. Diversion of the blowout to the mud gas separator resulted in gas venting onto the rig
  7. The fire and gas detection system failed to prevent ignition
  8. The BOP emergency mode did not seal the well
Each of these issues is detailed in the BP report (and others), and other investigative reports emphasize or de-emphasize particular aspects. But it is undisputed that the Deepwater Horizon catastrophe was a result of multiple failures – system, technological and human.

The Aftermath

Families lost loved ones and many others were scarred by the events. The recent Deepwater Horizon movie was based on a New York Times article that detailed some of the heroics and human toll. Seven years later, BP is still fending off lawsuits – at this point, primarily actions by institutional shareholders relating to alleged securities violations. Multiple environmental assessments have detailed the impacts on the Gulf of Mexico. The financial costs to BP and other involved parties have been staggering.

The history of human technological progress is one of spectacular advances and catastrophic failures. As we continue to race forward at breakneck speed, it is critical and humbling to reflect on events of the past -- in the hope that future catastrophes can be avoided or minimized, but with the painful awareness that we will likely have future lessons to learn.  

Published by Thomas Cook