Confounding correlation

Nassim Nicholas Taleb, the dark knight or rather the black swan himself, said that “anything that relies on correlation is charlatanism”. I am currently reading the excellent “The Signal and the Noise” by Nate Silver. In Chapter 1 of the book he has a nice piece on CDOs as an example of a “catastrophic failure of prediction”, where he points to certain AAA-rated CDO tranches which were rated on an assumption of a 0.12% default rate and which eventually suffered an actual default rate of 28%, an error factor of over 200 times!

Silver cites a simplified CDO example of 5 risks, used by his friend Anil Kashyap at the University of Chicago, to demonstrate the difference in default rates if the 5 risks are assumed to be totally independent or totally dependent. It got me thinking about how such a simplified example could illustrate the impact of applied correlation assumptions. Correlations between core variables are critical to many financial models: they are commonly used in most credit models and will be a core feature of insurance internal models (which, under Solvency II, will be used to calculate a firm’s own regulatory solvency requirements).

So I set up a simple model (as all of my models generally are) of 5 risks and looked at the impact of varying the correlation between each risk from 100% to 0% (i.e. from totally dependent to totally independent). The model assumes a 20% probability of default for each risk and the results, based upon 250,000 simulations, are presented in the graph below. What it does show is that even at a high level of correlation (e.g. 90%) the impact is considerable.
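
For anyone who wants to reproduce the experiment, below is a minimal sketch in Python, assuming an equicorrelated Gaussian copula as the dependency structure (one plausible choice; other copulas behave differently in the tails, and the function name simulate_pool is purely illustrative):

```python
import numpy as np
from scipy.stats import norm

def simulate_pool(rho, n_risks=5, p_default=0.20, n_sims=250_000, seed=1):
    """Simulate the number of defaults in a pool of equicorrelated risks
    using a Gaussian copula (an assumption; other copulas differ in the tails)."""
    rng = np.random.default_rng(seed)
    corr = np.full((n_risks, n_risks), rho)
    np.fill_diagonal(corr, 1.0)
    # Draw correlated standard normals; a risk defaults if its normal
    # falls below the p_default quantile.
    z = rng.multivariate_normal(np.zeros(n_risks), corr, size=n_sims)
    return (z < norm.ppf(p_default)).sum(axis=1)

for rho in (1.0, 0.9, 0.5, 0.0):
    counts = simulate_pool(rho)
    print(f"rho={rho:.0%}: P(all 5 default) = {(counts == 5).mean():.4%}")
```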

Figure: 5 risk pool with correlations from 100% to 0%

The graph below shows the default probabilities as a percentage of the totally dependent levels (i.e. 20% for each of the 5 risks). In effect it shows the level of diversification that will result from varying correlation from 0% to 100%. It underlines how misestimating correlation can confound model results.
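
As a sanity check on the two extremes (a worked example using the same 20% per-risk probability): under total dependence either all five risks default or none do, so P(all five default) = 20%; under independence the number of defaults is Binomial(5, 0.2) and P(all five default) collapses to 0.2^5 = 0.032%. A quick check:

```python
from scipy.stats import binom

n, p = 5, 0.20
# Independent case: the number of defaults is Binomial(5, 0.2)
print(f"P(all 5 default)      = {binom.pmf(5, n, p):.4%}")   # 0.0320%
print(f"P(at least 1 default) = {binom.sf(0, n, p):.4%}")    # 67.2320%
# Totally dependent case: the pool behaves as one risk, so both probabilities are 20%
```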

Figure: Default probabilities & correlations

CaT pricing “heading for the basement”

Edward Noonan of Validus is always good copy and the Q1 conference call for Validus provided some insight into the market ahead of the important July 1 renewals. When asked by an analyst whether the catastrophe market was reaching a floor, Noonan answered that “I’m starting to think we might be heading for the basement”.

He also said “I think the truly disruptive factor in the market right now is ILS money. I made a comment that we’ve always viewed the ILS manager business behaving rationally. I can’t honestly say that (anymore with) what we’re seeing in Florida right now. I mean we have large ILS managers who are simply saying – whatever they quote we will put out a multi-hundred million dollar line at 10% less.”

I have posted many times on the impact of new capital in the ILS market, more recently on the assertion that ILS funds have a lower cost of capital. Noonan now questions whether investors in the ILS space really understand the expected loss costs as well as experienced traditional players do. Getting a yield of 5% or lower now, compared to 9% a few short years ago, for BBB- risks is highlighted as an indication that investors lack a basic understanding of what they are buying. The growing trend of including terrorism risks in catastrophe programmes is also highlighted as a sign that the new market players are mispricing risk and lack a basic understanding of issues such as a potential clash in loss definitions and wordings.
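
One way to frame Noonan’s point on yields (a hypothetical illustration; the 2% expected loss figure below is invented for the example, not taken from any actual deal): what an investor is really paid is a multiple of the expected loss, and halving the yield on the same risk halves that multiple.

```python
# Hypothetical cat layer with a 2% annual expected loss (an assumed figure)
expected_loss = 0.02

for coupon in (0.09, 0.05):
    multiple = coupon / expected_loss
    print(f"yield {coupon:.0%} -> {multiple:.1f}x expected loss")
# A 9% yield pays 4.5x the expected loss; at 5% the same risk pays only 2.5x.
```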

Validus highlight their discipline in not renewing under-priced risk and in arbitraging the market by purchasing large amounts of collateralised reinsurance and retrocession. They point to the reduction in their net risk profile by way of their declining PMLs, as the graph below of their net US wind PMLs as a percentage of net tangible assets illustrates.

This is positive provided the margins on their core portfolio don’t decrease faster than the arbitrage. For example, Validus made underwriting income in 2012 and 2013 of 6% and 17% of their respective year-end net tangible assets. The graph below also shows how much the US wind PML would be reduced if an operating profit of 12% (my approximation of a significant loss-free 2014 for Validus) were used to offset the net US wind losses. Continuing pricing reductions in the market could easily make a 12% operating profit look fanciful.
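
The offset mechanics are simple arithmetic (the 30% PML figure below is a placeholder for illustration, not Validus’s actual disclosure):

```python
# Placeholder figures: a net US wind PML of 30% of net tangible assets (assumed)
pml_pct_nta = 0.30
operating_profit_pct_nta = 0.12  # the loss-free 2014 approximation from the text

net_hit = pml_pct_nta - operating_profit_pct_nta
print(f"Effective hit to net tangible assets: {net_hit:.0%}")  # 18%
```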

Figure: Validus Net US Wind PML as % of net tangible assets

I think that firms such as Validus are playing this market rationally and in the only way you can without withdrawing from core (albeit increasingly under-priced) markets. If risk is continually under-priced over the next 12 to 24 months, questions arise about the sustainability of many existing business models. You can outrun a train moving out of a station but eventually you run out of platform!

COLT calls time

COLT announced plans this week to cut €175 million of low-margin wholesale voice business and take a €30 million restructuring charge in an attempt to address declining margins and halt its operating cash burn, issues which I highlighted in a previous post. The stock took a hit and is down about 10% on the month. Press reports, like this FT article and this Guardian article, speculate that majority shareholder Fidelity is losing patience and that the business is effectively for sale. Robert Powell at Telecom Ramblings is also speculating on potential buyers.

The graph below shows my rough estimates of the revenue and EBITDA margin (excluding restructuring charges) for 2014 and 2015, based upon COLT’s guidance (2015 is purely my guesstimate). In my opinion, the execution risk in the restructuring, given the firm’s recent history, is not matched by any potential M&A upside. This one is best watched from the sidelines. It should be interesting.

Figure: COLT Telecom 2006 to 2013 revenue & EBITDA margin, with 2014 & 2015 forecasts

IPCC Risk & Uncertainty

I haven’t had time to go into the latest WGIII IPCC report in detail (indeed I haven’t had much time recently to spend on blogging), but a quick browse through the report shows an excellent chapter on “Integrated Risk and Uncertainty Assessment of Climate Change Response Policies”, which goes through many of the key elements of current risk management theory and practice and how they can be applied to climate change.

A previous post highlighted the difficulties of making predictions given the uncertainties involved. The report highlights the “large number of uncertainties in scientific understanding of the physical sensitivity of the climate to the build‐up of GHGs” and acknowledges that these “physical uncertainties are multiplied by the many socioeconomic uncertainties that affect how societies would respond to emission control policies”. The report calls these socioeconomic uncertainties “profound” and lists examples as the development and deployment of technologies, prices for major primary energy sources, average rates of economic growth and the distribution of benefits and costs within societies, emission patterns, and a wide array of institutional factors such as whether and how countries cooperate effectively at the international level.

The IPCC gives a medium rating (50% probability) to the statement that the “current trajectory of global annual and cumulative emissions of GHGs is inconsistent with widely discussed goals of limiting global warming at 1.5 to 2 degrees Celsius above the preindustrial level”.

Included in Chapter 2 are the graphs below. According to the report, “the representative concentration pathways (RCPs) are constructed by the IPCC on the bases of plausible storylines while insuring (1) they are based on a representative set of peer reviewed scientific publications by independent groups, (2) they provide climate and atmospheric models as inputs, (3) they are harmonized to agree on a common base year, and (4) they extend to the year 2100”. The 3 scenarios (A2, A1B and B1) are multi-model global averages of surface warming (relative to 1980–1999), shown as continuations of the 20th century simulations. The shading shows the plus/minus one standard deviation range of individual model annual averages, and the orange line shows where concentrations were held constant at year 2000 values. Time permitting, the conclusions and scenarios presented in the latest report are worth finding out more about.

Figure: IPCC global surface temperature scenarios from RCPs

Although each scenario is at the mercy of the uncertainties highlighted above, the open and thoughtful way the report is presented, including its acknowledgement of the underlying weaknesses, means that neither the scenarios nor the report can be ignored. Indeed, the recent output from the IPCC will hopefully provide the basis for informed thinking on the subject in the coming years.

The report includes a reference to Kahneman and Tversky’s certainty effect, whereby people overweight outcomes they consider certain relative to outcomes that are merely probable. That implies that a 50% probability of the temperature blowing through 2 degrees Celsius may not be enough to force real action. Unfortunately the underlying scientific and socioeconomic uncertainties inherent in making forecasts on temperature change over the next 30 to 50 years may mean that the required level of certainty cannot ever be achieved (until, of course, it’s too late).
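
For those curious how this is usually formalised: Tversky and Kahneman’s cumulative prospect theory captures the certainty effect with an inverse-S probability weighting function. Using their oft-cited 1992 gains parameter of γ ≈ 0.61, a stated 50% probability is weighted as if it were roughly 42%, while only full certainty receives its full weight (applying this to climate beliefs is, of course, purely illustrative):

```python
def tk_weight(p: float, gamma: float = 0.61) -> float:
    """Tversky-Kahneman (1992) probability weighting function for gains."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.5, 0.9, 0.99, 1.0):
    print(f"stated {p:.0%} -> weighted {tk_weight(p):.0%}")
# stated 50% -> weighted 42%; only a stated 100% keeps its full weight
```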