Tag Archives: Lloyds of London

Stressing the scenario testing

Scenario and stress testing by financial regulators has become a common supervisory tool since the financial crisis. The EU, the US and the UK all now regularly stress their banks using detailed adverse scenarios. In a recent presentation, Moody’s Analytics illustrated the variation in some of the metrics in the adverse scenarios used in recent tests by regulators, as per the graphic below of the peak to trough fall in real GDP.

Figure: Banking Stress Tests

Many commentators have criticized these tests for their inconsistency and flawed methodology, while pointing out the political conflict faced by regulators with responsibility for financial stability: they cannot be seen to promote a draconian scenario for stress testing on the one hand whilst assuring markets of the stability of the system on the other.

The EU tests have particularly had a credibility problem given the political difficulties in really stressing possible scenarios (hello, a Euro break-up?). An article last year by Morris Goldstein stated:

“By refusing to include a rigorous leverage ratio test, by allowing banks to artificially inflate bank capital, by engaging in wholesale monkey business with tax deferred assets, and also by ruling out a deflation scenario, the ECB produced estimates of the aggregate capital shortfall and a country pattern of bank failures that are not believable.”

In a report from the Adam Smith Institute in July, Kevin Dowd (a vocal critic of the regulators’ approach) stated that the Bank of England’s 2014 tests were lacking in credibility and “that the Bank’s risk models are worse than useless because they give false risk comfort”. Dowd points to the US, where the annual Comprehensive Capital Analysis and Review (CCAR) tests have been supplemented by the DFAST tests mandated under Dodd-Frank (these use a more standardised approach to provide relative tests between banks). In the US, the whole process has been turned into a vast and expensive industry, with consultants (many of them ex-regulators!) making a fortune on ever-increasing compliance requirements. The end result may be that the original objectives have been somewhat lost.

According to a report from a duo of Columbia University professors, banks have learned to game the system, whereby “outcomes have become more predictable and therefore arguably less informative”. The worry here is that, to ensure a consistent application across the sector, regulators have been captured by their models and are perpetuating group think by dictating “good” and “bad” business models. Whatever about the dangers of the free market dictating optimal business models (and Lord knows there’s plenty of evidence on that subject!!), relying on regulators to do so is, well, scary.

To my way of thinking, the underlying issue here results from the systemic “too big to fail” nature of many regulated firms. Capitalism is (supposedly!) based upon punishing imprudent risk taking through the threat of bankruptcy and therefore we should be encouraging a diverse range of business models with sensible sizes that don’t, individually or in clusters, threaten financial stability.

On the merits of using stress testing for banks, Dowd quipped that “it is surely better to have no radar at all than a blind one that no-one can rely upon” and concluded that the Bank of England should, rather harshly in my view, scrap the whole process. Although I agree with many of the criticisms, I think the process does have merit. To be fair, many regulators understand the limitations of the approach. Recently, Deputy Governor Jon Cunliffe of the Bank of England admitted to the fragilities of some of their testing and stated that “a development of this approach would be to use stress testing more counter-cyclically”.

The insurance sector, particularly the non-life sector, has a longer history with stress and scenario testing. Lloyds of London has long required its syndicates to run mandatory realistic disaster scenarios (RDS), primarily focussed on known natural and man-made events. The most recent RDS are set out in the exhibit below.

Figure: Lloyds Realistic Disaster Scenarios 2015

A valid criticism of the RDS approach is that insurers know what to expect and are therefore able to game the system. Risk models, such as the commercial catastrophe models sold by firms like RMS and AIR, have proven ever adept at running historical or theoretical scenarios against today’s modern exposures to estimate losses to insurers. The difficulty comes in assigning probabilities: the historical data for known natural events is only really reliable for the past 100 years or so, and man-made events in the modern world, such as terrorism or cyber risks, are virtually impossible to predict. I previously highlighted some of the concerns (e.g. on correlation here and VaR here) about the methodology used in many models to assess insurance capital, which has now been embedded into the new European regulatory framework, Solvency II, calibrated at a 1-in-200 year level.
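In practice, the 1-in-200 calibration just means reading the 99.5th percentile off a modelled loss distribution. A minimal sketch, using a purely hypothetical lognormal loss model (the parameters are illustrative, not a calibrated portfolio):

```python
import random

random.seed(42)

# Hypothetical annual aggregate losses (in £m) from a lognormal model --
# purely illustrative parameters, not a calibrated portfolio.
losses = sorted(random.lognormvariate(3.0, 1.2) for _ in range(100_000))

# Solvency II calibrates capital at the 1-in-200 year (99.5%) level:
# the loss exceeded in only 0.5% of simulated years.
var_99_5 = losses[int(0.995 * len(losses))]
print(f"1-in-200 year loss estimate: £{var_99_5:.1f}m")
```

The hard part, of course, is not the percentile arithmetic but the credibility of the simulated distribution feeding it, which is exactly the concern raised above.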

The Prudential Regulation Authority (PRA), now part of the Bank of England, detailed a set of scenarios last month to stress test the non-life insurance sector in 2015. The detail of these tests is summarised in the exhibit below.

Figure: PRA General Insurance Stress Test 2015

Robert Childs, the chairman of the Hiscox group, raised some eyebrows by saying the PRA tests did not go far enough and called for a war-game type exercise to see “how a serious catastrophe may play out”. Childs proposed that such an exercise would give regulators the confidence to let the industry get on with dealing with the aftermath of any such catastrophe without undue fussing from the authorities.

An efficient insurance sector is important to economic growth and development: by facilitating trade and commerce through risk mitigation and dispersion, it allows firms to allocate capital more effectively to productive means. Too much “fussing” by regulators through overly conservative capital requirements, perhaps resulting from overly pessimistic stress tests, can result in economic growth being impeded by excess cost. However, given the movement globally towards larger insurers, which in my view will accelerate under Solvency II given its unrestricted credit for diversification, the regulators’ focus on financial stability and the experiences in banking mean that fussy regulation will be in vogue for some time to come.

The scenarios selected by the PRA are interesting in that the focus for known natural catastrophes is on the frequency of large events, as opposed to the emphasis on severity in the Lloyds RDS. It’s arguable that the probability of 2 major European storms in one year, or of 3 US storms in one year, is significantly more remote than the 1-in-200 probability level at which capital is set under Solvency II. One of the more interesting scenarios is the reverse stress test, whereby the firm becomes unviable. I am sure many firms will select a combination of events with an implied probability of all occurring within one year so remote as to be impossible, or select some ultra-extreme events such as the Cumbre Vieja mega-tsunami (as per this post). A lack of imagination in looking at different scenarios would be a pity, as good risk management should be open to really testing portfolios rather than running through the same old known events.
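The remoteness of a multiple-event year can be checked with a simple Poisson frequency calculation. In the sketch below, the annual frequency of a major European windstorm is an assumed illustrative figure, not a modelled estimate:

```python
import math

def prob_at_least_n(rate: float, n: int) -> float:
    """P(N >= n) for a Poisson-distributed annual event count with mean `rate`."""
    return 1.0 - sum(math.exp(-rate) * rate**k / math.factorial(k) for k in range(n))

# Assumed annual frequency of a single major European windstorm --
# an illustrative number only.
annual_rate = 0.1

p_two_in_a_year = prob_at_least_n(annual_rate, 2)
print(f"P(2+ storms in one year) = {p_two_in_a_year:.4f}")  # → 0.0047, below the 0.005 (1-in-200) level
```

Under this (assumed) frequency the two-storm year is indeed more remote than the Solvency II calibration point, though the answer is clearly sensitive to the frequency and independence assumptions.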

New scenarios are constantly being suggested by researchers. Swiss Re recently published a paper on a recurrence of the New Madrid cluster of earthquakes of 1811/1812, which they estimated could result in $300 billion of losses, of which 50% would be insured (breakdown as per the exhibit below). Swiss Re estimates the probability of such an event at 1 in 500 years, or roughly a 10% chance of occurrence within the next 50 years.
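The conversion from a return period to a probability over a horizon is straightforward, assuming independent years with a constant annual probability:

```python
def prob_within(return_period_years: float, horizon_years: float) -> float:
    """Chance of at least one occurrence within the horizon, assuming
    independent years with annual probability 1 / return_period_years."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

# Swiss Re's 1-in-500 year New Madrid estimate over a 50-year horizon:
print(f"{prob_within(500, 50):.1%}")  # → 9.5%, consistent with the "roughly 10%" quoted
```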

Figure: 1811 New Madrid Earthquakes Repeated

Another interesting scenario, developed by the University of Cambridge and Lloyds (in this report) and considered technologically possible, is a cyber attack on the US power grid. There have been a growing number of cases of hacking into power grids in the US and Europe, which make this scenario ever more real. The authors estimate the event at a 1-in-200 year probability and detail three scenarios (S1, S2, and the extreme X1), with insured losses ranging from $20 billion to $70 billion, as per the exhibit below. These figures are far greater than the probable maximum loss (PML) estimated for the sector by a March UK industry report (as per this post).

Figure: Cyber Blackout Scenario

I think it will be a very long time before any insurer willingly publishes the results of scenarios that could cause it to be in financial difficulty. I may be naive but I think that is a pity because insurance is a risk business and increased transparency could only lead to more efficient capital allocation across the sector. Everybody claiming that they can survive any foreseeable event up to a notional probability of occurrence (such as 1 in 200 years) can only lead to misplaced solace. History shows us that, in the real world, risk has a habit of surprising, and not in a good way. Imaginative stress and scenario testing, performed in an efficient and transparent way, may help to lessen the surprise. Nothing however can change the fact that the “unknown unknowns” will always remain.

Mega-Tsunami Fright Scenario

There was a nice piece in the online FT last night on the forces impacting the reinsurance sector. Lancashire, which is behaving oddly these days, was one of the firms mentioned. Lancashire looks like it’s set to drop by approximately 12% (the amount of the special dividend) when it goes ex-dividend after today, the 28th (although Yahoo has been showing it dropping by 10%-12% at the end of trading for several days now, including yesterday). If it does drop to the £5.50 level, that’s approximately 123% of diluted tangible book value. Quite a come-down from the loftier valuations of 150%-170% under previous CEO Richard Brindle!
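For reference, the price to tangible book multiple is a one-line calculation. The diluted tangible book value per share below is an assumed figure, backed out from the ~123% quoted purely for illustration:

```python
# Price to diluted tangible book value -- a quick sanity check of the
# ~123% figure quoted above.
price = 5.50          # assumed ex-dividend share price (£)
tbv_per_share = 4.47  # assumed diluted tangible book value per share (£)

p_tbv = price / tbv_per_share
print(f"Price / diluted TBV = {p_tbv:.0%}")  # → 123%
```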

Anyway, this post is not about that. A major part of modern risk management in the insurance sector today is applying real-life scenarios to risk portfolios to assess their impact. Lloyds has been doing it for years with its realistic disaster scenarios (RDS). Insurers are adept at using scenarios generated by commercial catastrophe models from firms like RMS and AIR on so-called peak zones like US hurricane or Japanese earthquake. Many non-peak scenarios are not explicitly modelled by such firms.

The horrors of the tsunamis from the 2011 Tōhoku and the 2004 Indian Ocean earthquakes have been brought home vividly in this multi-media age. The damage in human terms from the receding waters full of debris makes the realities of such events all too real. Approximately 80% of tsunamis come from earthquakes, and history is littered with examples of large destructive tsunamis resulting from them – the 1755 Great Lisbon earthquake in Portugal, the 1783 Calabrian and the 1908 Messina earthquakes in Italy, the 1896 Sanriku earthquake in Japan, the recently discovered 365 AD Mediterranean quake, the 1700 Cascadia megathrust earthquake off the west coast of the US, and the 1958 Lituya Bay quake in Alaska are but a few examples.

Volcanoes are another potential cause of mega-tsunamis, as many volcanoes are found next to the sea, notably in countries bordering the Pacific Ocean, the northern Mediterranean and the Caribbean Sea. One scenario, put forward in a 2001 paper by Steven Ward and Simon Day, is the possibility of a mega-tsunami from the collapse of an unstable volcanic ridge on La Palma in the Canary Islands, destabilised by the 1949 and 1971 eruptions of the Cumbre Vieja volcano. The threat has been dramatically brought to life by a 2013 BBC Horizon programme called “Could We Survive A Mega-Tsunami?”. Unfortunately I could not find a link to the full programme but a taster can be found here.

The documentary detailed a scenario where a future eruption causes a massive landslide of 500 km3 of rock crashing into the sea, generating multiple waves that travel across the Atlantic Ocean and devastate major cities along the US east coast, as well as parts of Africa, Europe and southern England & Ireland. The damage would be unimaginable, with over 4 million deaths and economic losses of over $800 billion. The damage to port and transport infrastructure would also leave horrible obstacles to rescue and recovery efforts.

The possibility of such a massive landslide resulting from a La Palma volcano has been disputed by many scientists. In 2006, Dutch scientists released research concluding that the south-west flank of the island was stable and unlikely to fall into the sea for at least another 10,000 years. More recent research in 2013 has shown that 8 historical landslides associated with volcanoes in the Canary Islands occurred in discrete stages, and that the likelihood of one large 500 km3 landslide is therefore extremely remote. The report states:

“This has significant implications for geohazard assessments, as multistage failures reduce the magnitude of the associated tsunami. The multistage failure mechanism reduces individual landslide volumes from up to 350 km3 to less than 100 km3. Thus although multistage failure ultimately reduce the potential landslide and tsunami threat, the landslide events may still generate significant tsunamis close to source.”

Another graph from the research shows that the timeframe over which such events should be viewed is in the thousands of years.

Figure: Historical Volcanic & Landslide Activity, Canary Islands

Whatever about the feasibility of the events dramatised in the BBC documentary, the scientists behind the latest research do highlight the difference between probability of occurrence and impact upon occurrence.

“Although the probability of a large-volume Canary Island flank collapse occurring is potentially low, this does not necessarily mean that the risk is low. Risk is dependent both on probability of occurrence and the resultant consequences of such events, namely generation of a tsunami(s). Therefore, determining landslide characteristics of past events will ultimately better inform tsunami modelling and risk assessments.”

And, after all, that’s what good risk management should be all about. Tsunamis are caused by large infrequent events so, as with all natural catastrophes, we should be wary that historical event catalogues may be a poor guide to future hazards.

Will the climate change debate now move forward?

The release of the synthesis reports by the IPCC – in summary, short and long form – earlier this month has helped to keep the climate change debate alive. I have posted (here, here, and here) on the IPCC’s 5th assessment previously. The IPCC should be applauded for trying to present their findings in different formats targeted at different audiences. Statements such as the following cannot be clearer:

“Anthropogenic greenhouse gas (GHG) emissions have increased since the pre-industrial era, driven largely by economic and population growth, and are now higher than ever. This has led to atmospheric concentrations of carbon dioxide, methane and nitrous oxide that are unprecedented in at least the last 800,000 years. Their effects, together with those of other anthropogenic drivers, have been detected throughout the climate system and are extremely likely to have been the dominant cause of the observed warming since the mid-20th century.”

The reports also try to outline a framework to manage the risk, as per the statement below.

“Adaptation and mitigation are complementary strategies for reducing and managing the risks of climate change. Substantial emissions reductions over the next few decades can reduce climate risks in the 21st century and beyond, increase prospects for effective adaptation, reduce the costs and challenges of mitigation in the longer term, and contribute to climate-resilient pathways for sustainable development.”

The IPCC estimate the costs of adaptation and mitigation needed to keep warming below the critical 2°C inflection level at a loss of global consumption of 1%-4% in 2030, or 3%-11% in 2100. Whilst acknowledging the uncertainty in their estimates, the IPCC also provide some estimates of the investment changes needed for each of the main GHG-emitting sectors involved, as the graph reproduced below shows.

Figure: IPCC Changes in Annual Investment Flows 2010-2029

The real question is whether this IPCC report will be any more successful than previous reports at instigating real action. For example, is the agreement reached today by China and the US for real, or just a nice photo opportunity for Presidents Obama and Xi?

In today’s FT, Martin Wolf has a rousing piece on the subject in which he summarises the laissez-faire forces justifying inertia on climate change action as relying on the costs argument and the (freely acknowledged) uncertainties behind the science. Wolf argues that “the ethical response is that we are the beneficiaries of the efforts of our ancestors to leave a better world than the one they inherited” but concludes that such an obligation is unlikely to overcome the inertia prevalent today.

I, maybe naively, hope for better. As Wolf points out, the costs estimated in the reports, although daunting, are less than those experienced in the developed world from the financial crisis. Nor do the costs take into account any economic benefits that a low-carbon economy may bring. Notwithstanding this, the scale of the task in changing the trajectory of the global economy is illustrated by one of the graphs from the report, as reproduced below.

Figure: IPCC Global CO2 Emissions

Although the insurance sector has a minimal impact on the debate, it is interesting to see that the UK’s Prudential Regulation Authority (PRA) recently issued a survey to the sector asking for responses on what the regulatory approach to climate change should be.

Many industry players, such as Lloyds of London, have been pro-active in stimulating debate on climate change. In May, Lloyds issued a report entitled “Catastrophic Modelling and Climate Change” with contributions from industry. In the piece from Paul Wilson of RMS in the Lloyds report, he concluded that “the influence of trends in sea surface temperatures (from climate change) are shown to be a small contributor to frequency adjustments as represented in RMS medium-term forecast” but that “the impact of changes in sea-level are shown to be more significant, with changes in Superstorm Sandy’s modelled surge losses due to sea-level rise at the Battery over the past 50-years equating to approximately a 30% increase in the ground-up surge losses from Sandy’s in New York.“ In relation to US thunderstorms, another piece in the Lloyds report, from Ionna Dima and Shane Latchman of AIR, concludes that “an increase in severe thunderstorm losses cannot readily be attributed to climate change. Certainly no individual season, such as was seen in 2011, can be blamed on climate change.”

The uncertainties associated with the estimates in the IPCC reports are well documented (I have posted on this before here and here). The Lighthill Risk Network also has a nice report on climate model uncertainty which concludes that “understanding how climate models work, are developed, and projection uncertainty should also improve climate change resilience for society.” The report highlights the need to expand geological data sets beyond the short durations of decades and centuries on which we currently base many of our climate models.

However, as Wolf says in his FT article, we must not confuse the uncertainty of outcomes with the certainty of no outcomes. On the day that man has put a robot on a comet, let’s hope the IPCC latest assessment results in an evolution of the debate and real action on the complex issue of climate change.

Follow-on comment: Oh dear, the outcome of the Philae lander may not be a good omen!!!

Smart money heading for the exits?

Private equity is rushing to the exits in London, with such sterling businesses as Poundland and Pets at Home coming to the market. PE has also exited insurance investments such as Esure, Just Retirement and Partnership, following the successful Direct Line float. It was therefore interesting to see Apollo and CVC refloat 25% of BRIT Insurance last week, after taking them off the market just 3 short years ago.

The private equity guys made out pretty good. They bought BRIT in 2011 for £890 million, restructured the business, sold the UK retail business and other renewal rights, took £550 million of dividends, and have now floated 25% of the business at a value of £960 million. To give them their due, they are now committing to a 6-month lock-up, and BRIT have indicated a shareholder-friendly dividend of £75 million plus a special dividend if results in 2014 are good.

I don’t really know BRIT that well since they were given the once-over by Apollo/CVC. Their portfolio looks like fairly standard Lloyds of London business. Although they highlight that they lead 50% of their business, I suspect that BRIT will come under pressure as the trend towards the bigger established London insurers continues. Below is a graph of tangible book value multiples, based off today’s price, against the average three-year calendar year combined ratio.

Figure: London Specialty Insurers NTA Multiples, March 2014

Another look across insurance cycles

Following on from a previous post on insurance cycles and other recent posts, I have been looking over the inter-relationships between insurance cycles in the US P&C market, the Lloyds of London market and the reinsurance market. Ideally, the comparisons should be done on an accident year basis (calendar year less prior year reserve movements), with catastrophic/large losses for 2001/2005/2011 excluded, but I don’t (yet) have sufficient historical data to make such meaningful comparisons.
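The calendar year versus accident year distinction can be sketched with a minimal combined-ratio calculation. All figures below are illustrative, not drawn from any of the markets discussed:

```python
# Illustrative combined-ratio figures (all in the same currency units).
net_earned_premium = 100.0
incurred_losses = 68.0            # current accident year losses
expenses = 30.0                   # acquisition + admin expenses
prior_year_reserve_release = 4.0  # favourable development, flattering the calendar year

# Calendar year ratio includes prior-year reserve movements;
# accident year strips them out.
calendar_year_cr = (incurred_losses - prior_year_reserve_release + expenses) / net_earned_premium
accident_year_cr = (incurred_losses + expenses) / net_earned_premium

print(f"Calendar year CR: {calendar_year_cr:.0%}")  # → 94%
print(f"Accident year CR: {accident_year_cr:.0%}")  # → 98%
```

The gap between the two ratios is exactly why excluding reserve releases, as attempted above, can flip a downward calendar-year trend into an upward underlying one.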

The first graph shows calendar year combined ratios in each of the three markets. The US P&C figures contain both consumer and commercial business and, as a result, are less volatile than the other markets. Lloyds results, by contrast, come from specialty business classes like energy, marine, credit & surety, A&H, specialty casualty, excess and surplus (E&S) lines, and reinsurance. The reinsurance ratios are those for most reinsurers as per S&P in their annual global reports. For good measure, I have also included US real interest rates to show the impact that reduced investment income is having on the trend in combined ratios across all markets. Overall, ratios have been on a downward trend since the early 1990s. However, if catastrophic losses and reserve releases are excluded, ratios have been on an upward trend since 2006 across the Lloyds and reinsurance markets. Recent rate increases in the US, such as the high single-digit rate increases in commercial property & workers comp (see the Aon Benfield January report for details on US primary rate trends), may mean that the US P&C market comes in with a combined ratio below 100% for the full 2013 year (from 102% and 106% in 2012 and 2011 respectively).

Figure: Insurance Cycle Combined Ratios

As commented on above, the US P&C ratios cover consumer and commercial exposures and don’t fully show the inter-relationship between the different business classes across that market. The graph below shows the calendar year ratios in the US across the main business classes and paints a more volatile picture than the red line above.

Figure: US Commercial Business Classes Combined Ratios