Stressing the scenario testing

Scenario and stress testing by financial regulators has become a common supervisory tool since the financial crisis. The EU, the US and the UK all now regularly stress their banks using detailed adverse scenarios. In a recent presentation, Moody’s Analytics illustrated the variation in some of the metrics in the adverse scenarios used in recent tests by regulators, as per the graphic below showing the peak-to-trough fall in real GDP.

Banking Stress Tests

Many commentators have criticized these tests for their inconsistency and flawed methodology while pointing out the political conflict faced by many regulators with responsibility for financial stability. They cannot be seen to be promoting a draconian scenario for stress testing on the one hand whilst assuring markets of the stability of the system on the other.

The EU tests have had a particular credibility problem given the political difficulty of really stressing possible scenarios (hello, a Euro break-up?). An article last year by Morris Goldstein stated:

“By refusing to include a rigorous leverage ratio test, by allowing banks to artificially inflate bank capital, by engaging in wholesale monkey business with tax deferred assets, and also by ruling out a deflation scenario, the ECB produced estimates of the aggregate capital shortfall and a country pattern of bank failures that are not believable.”

In a report from the Adam Smith Institute in July, Kevin Dowd (a vocal critic of the regulators’ approach) stated that the Bank of England’s 2014 tests were lacking in credibility and “that the Bank’s risk models are worse than useless because they give false risk comfort”. Dowd points to the US, where the annual Comprehensive Capital Analysis and Review (CCAR) tests have been supplemented by the DFAST tests mandated under Dodd-Frank (these use a more standard approach to provide relative tests between banks). In the US, the whole process has been turned into a vast and expensive industry, with consultants (many of them ex-regulators!) making a fortune on ever increasing compliance requirements. The end result may be that the original objectives have been somewhat lost.

According to a report from a duo of Columbia University professors, banks have learned to game the system whereby “outcomes have become more predictable and therefore arguably less informative”. The worry here is that, to ensure a consistent application across the sector, regulators have been captured by their models and are perpetuating group think by dictating “good” and “bad” business models. Whatever about the dangers of the free market dictating optimal business models (and Lord knows there’s plenty of evidence on that subject!!), relying on regulators to do so is, well, scary.

To my way of thinking, the underlying issue here results from the systemic “too big to fail” nature of many regulated firms. Capitalism is (supposedly!) based upon punishing imprudent risk taking through the threat of bankruptcy and therefore we should be encouraging a diverse range of business models with sensible sizes that don’t, individually or in clusters, threaten financial stability.

On the merits of using stress testing for banks, Dowd quipped that “it is surely better to have no radar at all than a blind one that no-one can rely upon” and concluded that the Bank of England should, rather harshly in my view, scrap the whole process. Although I agree with many of the criticisms, I think the process does have merit. To be fair, many regulators understand the limitations of the approach. Recently, Deputy Governor Jon Cunliffe of the Bank of England admitted the fragility of some of their testing and stated that “a development of this approach would be to use stress testing more counter-cyclically”.

The insurance sector, particularly the non-life sector, has a longer history with stress and scenario testing. Lloyd’s of London has long required its syndicates to run mandatory realistic disaster scenarios (RDS), primarily focussed on known natural and man-made events. The most recent RDS are set out in the exhibit below.

Lloyd’s Realistic Disaster Scenarios 2015

A valid criticism of the RDS approach is that insurers know what to expect and are therefore able to game the system. Risk models, such as the commercial catastrophe models sold by firms like RMS and AIR, have proven ever adept at running historical or theoretical scenarios through today’s modern exposures to get estimates of losses to insurers. The difficulty comes in assigning probabilities to known natural events, where the historical data is only really reliable for the past 100 years or so, and to man-made events in the modern world, such as terrorism or cyber risks, which are virtually impossible to predict. I previously highlighted some of the concerns on the methodology used in many models (e.g. on correlation here and VaR here) used to assess insurance capital, which have now been embedded into the new European regulatory framework, Solvency II, calibrated at a 1-in-200 year level.

The Prudential Regulation Authority (PRA), now part of the Bank of England, detailed a set of scenarios last month to stress test the non-life insurance sector in 2015. The detail of these tests is summarised in the exhibit below.

PRA General Insurance Stress Test 2015

Robert Childs, the chairman of the Hiscox group, raised some eyebrows by saying the PRA tests did not go far enough and called for a war game type exercise to see “how a serious catastrophe may play out”. Childs proposed that such an exercise would mean that regulators would have the confidence in industry to get on with dealing with the aftermath of any such catastrophe without undue fussing from the authorities.

An efficient insurance sector is important to economic growth and development by facilitating trade and commerce through risk mitigation and dispersion, thereby allowing firms to more effectively allocate capital to productive means. Too much “fussing” by regulators through overly conservative capital requirements, perhaps resulting from overly pessimistic stress tests, can impede economic growth through excess cost. However, given the movement globally towards larger insurers, which in my view will accelerate under Solvency II given its unrestricted credit for diversification, the regulator’s focus on financial stability and the experiences in banking mean that fussy regulation will be in vogue for some time to come.

The scenarios selected by the PRA are interesting in that the focus for known natural catastrophes is on the frequency of large events, as opposed to the emphasis on severity in the Lloyd’s RDS. It’s arguable that the probability of two major European storms in one year, or three US storms in one year, is significantly more remote than the 1-in-200 probability level at which capital is set under Solvency II. One of the more interesting scenarios is the reverse stress test, whereby the firm must construct a scenario that renders it unviable. I am sure many firms will select a combination of events with an implied probability of all occurring within one year so remote as to be impossible, or select ultra-extreme events such as the Cumbre Vieja mega-tsunami (as per this post). A lack of imagination in looking at different scenarios would be a pity, as good risk management should be open to really testing portfolios rather than running through the same old known events.
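As a rough sketch of why such frequency scenarios sit well beyond the 1-in-200 level, assume storms of the relevant severity arrive as a Poisson process. The 1-in-10-year return period below is purely an illustrative assumption on my part, not a PRA or Lloyd’s calibration:

```python
import math

def prob_at_least(n: int, lam: float) -> float:
    """P(N >= n) for N ~ Poisson(lam), via the complement of the CDF."""
    cdf = sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(n))
    return 1.0 - cdf

# Hypothetical calibration: a storm of the relevant severity has a
# 1-in-10-year return period, i.e. a Poisson arrival rate of 0.1 per year.
lam = 0.1
p2 = prob_at_least(2, lam)  # two or more such storms in a single year
p3 = prob_at_least(3, lam)  # three or more

print(f"P(>=2): {p2:.5f}, roughly 1 in {1 / p2:.0f} years")
print(f"P(>=3): {p3:.6f}, roughly 1 in {1 / p3:.0f} years")
```

Even with a fairly common 1-in-10-year storm, two in the same year is already a roughly 1-in-200-year outcome under these assumptions, and three in the same year is rarer by more than an order of magnitude.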

New scenarios are constantly being suggested by researchers. Swiss Re recently published a paper on a recurrence of the New Madrid cluster of earthquakes of 1811/1812, which they estimated could result in $300 billion of losses, of which 50% would be insured (breakdown as per the exhibit below). Swiss Re estimates the probability of such an event at 1 in 500 years, or roughly a 10% chance of occurrence within the next 50 years.

1811 New Madrid Earthquakes repeated
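Swiss Re's translation of a 1-in-500-year return period into "roughly a 10% chance within the next 50 years" is easy to verify, assuming independent years:

```python
def prob_within_horizon(return_period: float, years: int) -> float:
    """Chance of at least one occurrence of a 1-in-`return_period`-year
    event over a horizon of `years` years, assuming independent years."""
    annual_p = 1.0 / return_period
    return 1.0 - (1.0 - annual_p) ** years

p = prob_within_horizon(500, 50)
print(f"{p:.1%}")  # ~9.5%, i.e. the "roughly 10%" quoted
```

The same one-liner is handy for sanity-checking any return period quoted against a planning horizon.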

Another interesting scenario, developed by the University of Cambridge and Lloyd’s, is a technologically possible cyber attack on the US power grid (in this report). There have been a growing number of cases of hacking into power grids in the US and Europe, which make this scenario ever more real. The authors estimate the event at a 1-in-200-year probability and detail three scenarios (S1, S2, and the extreme X1) with insured losses ranging from $20 billion to $70 billion, as per the exhibit below. These figures are far greater than the probable maximum loss (PML) estimated for the sector by a March UK industry report (as per this post).

Cyber Blackout Scenario

I think it will be a very long time before any insurer willingly publishes the results of scenarios that could cause it to be in financial difficulty. I may be naive but I think that is a pity because insurance is a risk business and increased transparency could only lead to more efficient capital allocations across the sector. Everybody claiming that they can survive any foreseeable event up to a notional probability of occurrence (such as 1 in 200 years) can only lead to misplaced solace. History shows us that, in the real world, risk has a habit of surprising, and not in a good way. Imaginative stress and scenario testing, performed in an efficient and transparent way, may help to lessen the surprise. Nothing however can change the fact that the “unknown unknowns” will always remain.
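On the "misplaced solace" point, a back-of-the-envelope sketch shows why firm-level 1-in-200 standards say little about sector-wide outcomes. The firm counts below are illustrative, and independence is assumed, which if anything understates the clustering of failures in a real crisis:

```python
def prob_any_failure(n_firms: int, annual_failure_p: float = 0.005) -> float:
    """Chance that at least one of n_firms fails in a given year, if each
    independently meets exactly a 1-in-200 (99.5%) survival standard."""
    return 1.0 - (1.0 - annual_failure_p) ** n_firms

for n in (50, 100, 200):
    print(f"{n} firms: {prob_any_failure(n):.0%} chance of at least one failure")
```

With 100 such firms the chance of at least one failure in any given year is already close to 40%, which is rather different from the comfort a 1-in-200 label suggests.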

Chinese Web

In a previous post last December, I had a quick look at the valuations of a few Chinese internet stocks that are traded in the US, solely for curiosity’s sake. At that time, I mused that Google (GOOG) may be a better bet than any of the Chinese high-growth/high-risk plays given its valuation. The one maybe I highlighted amongst the Chinese internet stocks was Baidu (BIDU), the so-called Chinese Google. It is somewhat ironic that BIDU today fell 15% after disappointing results from higher expenses and lower revenue projections, whilst GOOG, which had a great quarter due to revenue growth and squeezed expenses, is up over 20% since its Q2 results. Just shows what I know!

Given the drama in the Chinese stock market, I had another quick look over the Chinese internet stocks to see how they are performing, as per the graph below.

Chinese Internet Stocks July 2014 to 2015

It is far too early to tell what impact the current turmoil will have on the Chinese consumer and on the Chinese internet sector (if any, given the government’s current policy of propping up the market). At this stage, it is interesting to see that it’s NetEase, primarily in the online game services sector, which has stood up the best so far, up 40% this year. That just confirms to me how far these stocks are outside my comfort zone.

Insurers keep on swinging

In a previous post, I compared the M&A action in the reinsurance and specialty insurance space to a rush for the bowl of keys in a swingers’ party. Well, the ACE/Chubb deal has brought the party to a new level where anything seems possible. The only rule now seems to be a size restriction to avoid a G-SIFI label (although MetLife and certain US stakeholders are fighting to water down those proposals for insurers).

I expanded the number of insurers in my pool for an update of the tangible book multiples (see previous post from December) as per the graphic below. As always, these figures come with a health warning in that care needs to be taken when comparing US, European and UK firms due to the differing accounting treatment (for example, I have kept the present value of future profits as a tangible item). I estimated the 2015 ROE based upon Q1 results and my view of the current market for the 2011 to 2015 average.

Reinsurers & Specialty Insurers NTA Multiples July 2015

I am not knowledgeable enough to speculate on who may be the most likely next couplings (for what it’s worth, regular readers will know I think Lancashire will be a target at some stage). This article outlines who Eamonn Flanagan at Shore Capital thinks is next, with Amlin being his top pick. What is clear is that the valuation of many players is primarily based upon their M&A potential rather than the underlying operating results given pricing in the market. Reinsurance pricing seems to have stabilised, although I suspect policy terms & conditions remain an area of concern. On the commercial insurance side, reports from market participants like Lockton (see here) and Towers Watson (see graph below) show an ever competitive market.

Commercial Lines Insurance Pricing Survey Towers Watson Q1 2015

Experience has taught me that pricing is the key to future results for insurers and, although the market is much more disciplined than in the late 1990s, I think many will be lucky to produce double-digit ROEs in the near term on an accident year basis (beware those dipping too much into the reserve pot!).

I am also nervous about the amount of unrealised gains which are inflating book values and which may reverse when interest rates rise. For example, unrealised gains make up 8%, 13% and 18% of the Hartford’s, Zurich’s and Swiss Re’s book values respectively as at Q1. So investing primarily to pick up an M&A premium seems like a mug’s game to me in the current market.
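To get a feel for how quickly such gains could reverse, a first-order duration approximation is enough. The portfolio figures below are entirely hypothetical, not those of any insurer named above:

```python
def mark_to_market_change(portfolio: float, duration: float,
                          yield_rise_bps: float) -> float:
    """First-order (duration-only) estimate of the change in a bond
    portfolio's market value for a parallel rise in yields."""
    return -portfolio * duration * yield_rise_bps / 10_000

# Hypothetical insurer: $50bn bond portfolio with a duration of 5 years,
# against $10bn of book value of which 15% ($1.5bn) is unrealised gains.
hit = mark_to_market_change(50e9, 5.0, 100)  # +100bps parallel shift
print(f"Mark-to-market change: {hit / 1e9:.1f}bn")  # i.e. a $2.5bn hit
```

On those illustrative numbers, a single 100bps rise more than wipes out the unrealised gains, and takes a 10% bite out of book value besides (ignoring convexity and any offsetting liability effects).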

M&A obviously brings considerable execution risk which may result in one plus one not equalling two. Accepting that the financial crisis hit the big guys like AIG and Hartford pretty hard, the graph below suggests that being too big may not be beautiful where average ROE (and by extension, market valuation) is the metric for beauty.

Is big beautiful in insurance

In fact, the graph above suggests that the $15-$25 billion range in terms of premiums may be the sweet spot for ROE. Staying as a specialist in the $2-$7 billion premium range may have worked in the past but, I suspect, will be harder to replicate in the future.

Exabyte Zenith

There is a sense of déjà vu when you read about the competing plans of Greg Wyler’s OneWeb and Elon Musk’s SpaceX to build networks of low earth orbit satellites to provide cheap broadband across the globe over the next few years. Memories of past failures from the late 1990s telecom bubble come to mind with these network plans: names like Iridium, Globalstar, Teledesic, and SkyBridge. Maybe, this time, the dreamers with access to billions can get it right!

You never know, there may even be a comeback for broadband over power-lines (not likely according to this article)!

I did come across the latest figures from Cisco in their “The Zettabyte Era – Trends and Analysis” piece, as previously referenced in this post. As a reminder, a gigabyte/terabyte/petabyte/exabyte/zettabyte/yottabyte is a kilobyte to the power of 3, 4, 5, 6, 7 and 8 respectively. Cisco continues to predict a tripling of global IP traffic from 2014 to 2019. The graphics below give some colour on the detail behind the predictions.
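As a quick sanity check, a tripling over the five years from 2014 to 2019 implies a compound annual growth rate of just under 25%, consistent with the segment CAGRs Cisco quotes, and the byte-unit ladder is equally easy to verify:

```python
# Implied compound annual growth rate from a tripling of traffic over 5 years
cagr = 3 ** (1 / 5) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~24.6%

# The byte-unit ladder: each unit is a kilobyte (10**3 bytes) to a rising power
units = ["gigabyte", "terabyte", "petabyte", "exabyte", "zettabyte", "yottabyte"]
for power, name in enumerate(units, start=3):
    print(f"1 {name} = 10**{3 * power} bytes")
```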

The first graphic splits traffic between consumer and business, with each further split by traffic type. Unsurprisingly, consumer video traffic dominates, driving the 24% consumer CAGR.

Global IP Traffic 2015 projections

Growth in the US, Asia and Europe is driving the impressive 29% metro CAGR whilst Asia Pacific traffic is the prime driver for long-haul growth.

Global IP Metro LongHaul Traffic 2015 projections

The split by region shows the status quo will be maintained in terms of traffic breakdown, with the Central/Eastern Europe and Middle East/Africa regions projected to have growth rates of 30%+ and 40%+ respectively, as opposed to approximately 20% in the main markets.

Global IP Traffic Geographical Split

Exabytes are reaching their zenith and by next year global IP traffic is predicted to exceed a zettabyte.