
Anarchy in the UK

Uncertainty reigns, and the economic impacts of Brexit on the UK and on Europe have yet to become clear. A big factor in that uncertainty is the political path to Brexit. The UK political class are now trying to rally around newly agreed leadership of their respective parties (assuming Labour MPs eventually manage to get rid of their current leader) and craft policies on how to engage in the divorce negotiations.

A unique political feature of the UK is its first past the post (FPTP) electoral system. The graph below of the 2015 general election shows how the system favours the larger political parties. It also shows how parliamentary representation under FPTP can be perverse. The Scottish National Party (SNP), for example, got 4.8% of the vote but 8.6% of the members of parliament (MPs). The right-wing little Englander party UKIP, whose rise in popularity was a direct cause of the decision to hold a referendum on Brexit, got 12.6% of the vote but just 0.26% of the MPs. Despite its obvious failings, the British are fond of their antiquated FPTP system and voted to retain it by 68% in a 2011 referendum (albeit on a low voter turnout of 42%).

Figure: 2015 UK General Election Results

One lasting impact of the Brexit vote is likely to be on the make-up of British politics. Much has been written about the generational, educational and geographical disparities in the Brexit vote. A breakdown of the leave/remain vote by political party, as per the graph below, shows how the issue of the EU has caused schisms within the two largest parties. Such schisms are major contributors to the uncertainty over how the Brexit divorce settlement will go.

Figure: UK Brexit Vote Breakdown by Political Party

Currently both sides, the UK and the EU, have taken hard positions, with Conservative politicians saying restrictions on the free movement of labour are a red-line issue and the EU demanding that Article 50 be triggered and the divorce terms agreed before the future relationship can be discussed.

Let’s assume that all of the different arrangements touted in the media since the vote boil down to two basic options. The first involves access to EU markets through the European Economic Area (EEA) or the European Free Trade Association (EFTA) in exchange for some form of free movement of labour, commonly referred to as the Norway or Switzerland options. The second is a bilateral trade agreement with a skills-based immigration policy, commonly referred to as the Canadian option (although it is interesting that there is political uncertainty in Europe over how the Canadian trade deal, which has been agreed in principle, will be ratified). I have called these option 1 and option 2 respectively (commonly referred to as soft and hard Brexit).

Let’s assume that the Brexit negotiations in the near future will be conducted in a sensible rather than an emotive manner, in a climate where the economic impacts have been shown to be detrimental albeit not life-threatening, and that both sides come to realise that extreme positions are not in their interest and a workable compromise is what everybody wants. In such a scenario, I have further assumed that the vast majority (e.g. 98%) of remain voters would favour option 1, and I have judgmentally assigned preferences for each option by political party (e.g. 90% of Conservative and 75% of Labour leave voters prefer option 2). Based upon these estimates, I calculate that there would be a 56% majority of the UK electorate in favour of option 1, as per the graph below.

Figure: Brexit Options Breakdown by Political Party
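
Out of interest, the arithmetic behind this estimate can be sketched in a few lines of Python. The leave/remain splits by party below are purely illustrative placeholders of my own (only the 98% remain preference and the 90%/75% Conservative/Labour leave preferences come from the assumptions above), so the output will only roughly approximate the 56% figure in the graph.

    # Hypothetical worked example of the option 1 calculation described above.
    # Each segment: (share of electorate, probability of preferring option 1).
    # The electorate shares are illustrative placeholders, not actual data.
    segments = {
        ("Conservative", "remain"): (0.15, 0.98),
        ("Conservative", "leave"):  (0.22, 0.10),  # 90% prefer option 2 (assumed above)
        ("Labour", "remain"):       (0.20, 0.98),
        ("Labour", "leave"):        (0.10, 0.25),  # 75% prefer option 2 (assumed above)
        ("UKIP", "leave"):          (0.12, 0.05),
        ("Other", "remain"):        (0.16, 0.98),
        ("Other", "leave"):         (0.05, 0.30),
    }
    option1_share = sum(w * p for w, p in segments.values())
    print(f"Estimated support for option 1: {option1_share:.0%}")  # ~57% with these placeholders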

Now, the above thought experiment makes a lot of assumptions, most of which are likely to be well wide of reality. In particular, I suspect the lack of emotive and divisive negotiations is an assumption too far.

What the heck, let’s go one step further with these fanciful thoughts. Let’s assume the new leadership of the Conservative party adopts option 2 as official policy. Let’s also assume that the Labour party splits into old Labour, a left-wing anti-globalisation party, and a new centre-left party whose official policy is option 1. In a theoretical general election (which may be required to approve any negotiated deal), I guesstimate the result below under the unpredictable FPTP system.

Figure: Theoretical Post-Brexit General Election Result

This analysis suggests that a majority government of 52% of MPs with option 1 as their policy could be possible through a grand coalition of the new centre party (the Labour breakaway party), the Liberal Democrats and the SNP. The Conservatives and UKIP could, in this scenario, only manage 35% between them (old Labour, at 9% of MPs, wouldn’t countenance joining such a combination no matter what its views on the EU). The net result would be a dramatic shift in UK politics, with Europe as a defining issue for the future.

Yeah, right!

Back to today’s mucky and uncertain reality….

 

Follow-up: I thought I was being clever with the title of this post, and I only realised after posting it that the Economist used it in their title this week! Is there nothing original any more….

A thoroughly modern intellect

In the way only he could, Oscar Wilde highlighted the futility of trying to look at future risks when he quipped that “to expect the unexpected shows a thoroughly modern intellect”.

In a recent article called “Preparing for the unknown unknowns”, Lucy Marcus stated the following:

“Moreover, for all the risks that we can and do plan for, it is those for which we cannot prepare that can do the most damage. That is why, alongside all of the identifying, quantifying, and mitigating, we must also embrace the idea that not all change can be clearly foreseen and planned for.”

Notwithstanding the wisdom of these words, it is always interesting to see the results of the Global Risks report published each year by the World Economic Forum prior to the annual Davos meeting. The 2015 report is based upon a survey of nearly 900 experts, and the graphs below show the resulting likelihood and potential impact of 28 global risks and the interconnections between them.

Figure: Global Risks 2015

Figure: Global Risks 2015 Interconnections

Perhaps the most interesting part of the report is the schematic, reproduced below, showing how the top 5 risks in terms of impact and in terms of likelihood have changed from 2007 through to this year’s report.

Figure: Global Risks 2007 to 2015

The changing colours across the years illustrate how fickle our concerns for the future can be, how influenced they are by recent experience, and just how thoroughly modern our intellect is.

Given the name of this blog, I of course include myself in the previous sentence also.

IOSCO Report on Corporate Bonds

Staff from IOSCO issued a report in April on the global corporate bond market. Although there was nothing earth-shattering in the report, there were some interesting insights. The report highlighted four themes, as below:

  1. Corporate bond markets have become bigger, more important for the real economy, and increasingly global in nature.
  2. Corporate bond markets have begun to fill an emerging gap in bank lending and long-term financing and are showing potential for servicing SME financing needs.
  3. A search for yield is driving investment in corporate bond markets. A changing interest rate environment will create winners and losers.
  4. Secondary markets are also transforming to adapt to a new economic and regulatory environment. Understanding the nature and reasons for this transformation is key in identifying future potential systemic risk issues and opportunities for market development.

The report also highlights the uncertainty that remains over how secondary markets would behave in the event of an interest rate shock, and the $11 trillion of corporate debt (out of $50 trillion) due to mature in the next seven years.

Some interesting graphs in the report include the one below on the different characteristics of issuance pre- and post-2007.

Figure: IOSCO Pre- and Post-2007 Corporate Bond Issuance (April 2014)

Other interesting graphs highlight how corporate bonds are taking up the slack from stagnating bank credit in the US and the EU, and also the boom in bank credit in China, as below.

Figure: IOSCO Bank Credit and Corporate Bond Markets (April 2014)

And finally the graphs below show the increase in non-financial corporate bond issuance and the modest growth in high yield issuance.

Figure: IOSCO Corporate Bond Markets (April 2014)

 

Confounding correlation

Nassim Nicholas Taleb, the dark knight or rather the black swan himself, said that “anything that relies on correlation is charlatanism”. I am currently reading the excellent “The Signal and the Noise” by Nate Silver. In Chapter 1 of the book he has a nice piece on CDOs as an example of a “catastrophic failure of prediction”, where he points to certain CDO AAA tranches which were rated on an assumption of a 0.12% default rate and which eventually suffered an actual default rate of 28%, an error factor of over 200 times!

Silver cites a simplified CDO example of 5 risks used by his friend Anil Kashyap at the University of Chicago to demonstrate the difference in default rates if the 5 risks are assumed to be totally independent versus totally dependent. It got me thinking about how such a simplified example could illustrate the impact of applied correlation assumptions. Correlations between core variables are critical to many financial models; they are commonly used in most credit models and will be a core feature of insurance internal models (which, under Solvency II, will be used to calculate a firm’s own regulatory solvency requirements).

So I set up a simple model (all of my models are generally so) of 5 risks and looked at the impact of varying the correlation between each risk from 100% to 0% (i.e. from totally dependent to totally independent). The model assumes a 20% probability of default for each risk, and the results, based upon 250,000 simulations, are presented in the graph below. (For intuition: with total dependence the probability of all 5 risks defaulting is simply 20%, whereas with independence it is 0.2^5, or about 0.03%.) What the graph does show is that even at a high level of correlation (e.g. 90%) the impact is considerable.

Figure: 5 Risk Pool with Correlations from 100% to 0%

The graph below shows the default probabilities as a percentage of the totally dependent levels (i.e. 20% for each of the 5 risks). In effect, it shows the level of diversification that results from varying correlation from 0% to 100%. It underlines how misestimating correlation can confound model results.

Figure: Default Probabilities & Correlations
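
For anyone curious to replicate the experiment, below is a minimal sketch of this kind of simulation. The post doesn’t specify how the dependence was imposed, so the sketch assumes a Gaussian copula with a common pairwise correlation; the original model may well have done it differently.

    import numpy as np
    from scipy.stats import norm

    def default_distribution(rho, n_risks=5, pd=0.20, n_sims=250_000, seed=1):
        """Return P(k of n_risks default), for k = 0..n_risks, under a
        Gaussian copula with common pairwise correlation rho."""
        rng = np.random.default_rng(seed)
        # Equicorrelation matrix: 1s on the diagonal, rho everywhere else.
        corr = np.full((n_risks, n_risks), rho)
        np.fill_diagonal(corr, 1.0)
        # Correlated standard normals via a Cholesky factor of the matrix.
        z = rng.standard_normal((n_sims, n_risks)) @ np.linalg.cholesky(corr).T
        # A risk defaults when its normal falls below the pd quantile.
        defaults = (z < norm.ppf(pd)).sum(axis=1)
        return np.bincount(defaults, minlength=n_risks + 1) / n_sims

    # rho = 0.999 stands in for total dependence (a correlation of exactly
    # 1 makes the matrix singular, so the Cholesky decomposition fails).
    for rho in (0.0, 0.5, 0.9, 0.999):
        print(f"rho = {rho}:", default_distribution(rho).round(4))

At rho = 0 the probability of all 5 risks defaulting collapses to 0.2^5 (about 0.03%), while near total dependence it approaches the full 20%.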

The imperfect art of climate change modelling

The completed Group I report from the 5th Intergovernmental Panel on Climate Change (IPCC) assessment was published in January (see my previous post on the summary report from September). One of the few definite statements made in the report was that “global mean temperatures will continue to rise over the 21st century if greenhouse gas (GHG) emissions continue unabated”. How we measure the impact of such changes is therefore incredibly important. A recent article in the FT by Robin Harding, highlighting the shortcomings of the models used to assess the impact of climate change, therefore caught my attention.

The article referred to two academic papers, one by Robert Pindyck and another by Nicholas Stern, which contained damning criticism of models that combine climate and economic modelling, the so-called integrated assessment models (IAMs).

Pindyck states that “IAM based analyses of climate policy create a perception of knowledge and precision, but that perception is illusory and misleading”. Stern also criticizes IAMs, stating that “assumptions built into the economic modelling on growth, damages and risks, come close to assuming directly that the impacts and costs will be modest and close to excluding the possibility of catastrophic outcomes”.

These comments remind me of Paul Wilmott, the influential English quant, who included in his Modeller’s Hippocratic Oath the following: “I will remember that I didn’t make the world, and it doesn’t satisfy my equations” (see Quotes section of this website for more quotes on models).

In his paper, Pindyck characterises the IAMs currently in use as having 6 core components, as the graphic below illustrates.

Figure: Integrated Assessment Models

Pindyck highlights a number of the main elements of IAMs which involve a considerable amount of arbitrary choice, including climate sensitivity and the damage and social welfare (utility) functions. He cites the important feedback loops in the climate as difficult, if not impossible, to determine. Although there has been some good work in specific areas like agriculture, Pindyck is particularly critical of the damage functions, saying many are essentially made up. The final pieces, on social utility and the rate of time preference, are essentially policy parameters which are open to political forces and therefore subject to considerable variability (and that’s a polite way of putting it).

The point about damage functions is an interesting one, as these are also key determinants in the catastrophe vendor models widely used in the insurance sector. As a previous post on Florida highlighted, even these specific and commercially developed models produce varying outputs.
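
To make the criticism concrete, many IAMs reduce climate damages to a simple quadratic function of the temperature rise. The sketch below uses a purely illustrative coefficient of my own (not taken from any specific model) to show how much weight such a small, essentially arbitrary parameter carries.

    # Illustrative quadratic damage function of the general form used in
    # many IAMs: damages as a fraction of GDP for a temperature rise of
    # T degrees Celsius. The coefficient a is a made-up placeholder.
    def damage_fraction(T, a=0.0023):
        return a * T ** 2

    for T in (2, 3, 4, 6):
        print(f"{T}C of warming -> {damage_fraction(T):.1%} of GDP lost")
    # ~0.9% at 2C, ~2.1% at 3C, ~8.3% at 6C: every conclusion downstream
    # hinges on a, exactly the kind of number Pindyck says is "essentially
    # made up".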

One example of IAMs directly influencing current policymakers is their use by the Interagency Working Group (IWG), the entity that under the Obama administration determines the social cost of carbon (SCC), defined as the net present damage done by emitting a marginal ton of CO2 equivalent (CO2e) and used in regulating industries such as the petrochemical sector. Many IAMs are available (the sector even has its own journal, The Integrated Assessment Journal!) and the IWG relies on three of the oldest and most well known: the Dynamic Integrated Climate and Economy (DICE) model, the Policy Analysis of the Greenhouse Effect (PAGE) model, and the fun-sounding Climate Framework for Uncertainty, Negotiation, and Distribution (FUND) model.

The first IWG paper in 2010 included an exhibit, reproduced below, summarizing the economic impact of rising temperatures based upon the 3 models.

Figure: Climate Change & Impact on GDP (IWG SCC 2010)

To be fair to the IWG, they do highlight that “underlying the three IAMs selected for this exercise are a number of simplifying assumptions and judgments reflecting the various modelers’ best attempts to synthesize the available scientific and economic research characterizing these relationships”.

The IWG released an updated paper in 2013 in which revised SCC estimates were presented, based upon a number of amendments to the underlying models, including revisions to the damage functions and to the climate sensitivity assumptions. The effects of the changes on the average and 95th percentile SCC estimates, at varying discount rates (which are obviously key determinants of the SCC given the long-term nature of the impacts), can be clearly seen in the graph below.

Figure: Social Cost of Carbon, IWG 2010 vs 2013
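
As a toy illustration of why the discount rate dominates these estimates (my own arithmetic, not taken from the IWG papers), consider the present value of $100 of climate damage occurring a century from now at a few plausible discount rates:

    # Present value of $100 of damage in 100 years at various discount rates.
    damage, years = 100.0, 100
    for rate in (0.025, 0.03, 0.05):
        pv = damage / (1 + rate) ** years
        print(f"{rate:.1%}: ${pv:.2f}")
    # 2.5% -> $8.46, 3.0% -> $5.20, 5.0% -> $0.76: more than a tenfold
    # spread from the choice of discount rate alone.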

Given the magnitude of the SCC changes, it is not surprising that critics of the charges, including vested interests such as petrochemical lobbyists, are highlighting the uncertainty in IAMs as a counter to the charges. The climate change deniers love any opportunity to discredit the science, as they demonstrated so ably with the 4th IPCC assessment. The goal has to be to improve modelling as a risk management tool that results in sensible preventative measures. Pindyck emphasises that his criticisms should not be an excuse for inaction. He believes we should follow a risk management approach focused on the risk of catastrophe, with models updated as more information emerges, and uses the threat of nuclear oblivion during the Cold War as a parallel. He argues that “one can think of a GHG abatement policy as a form of insurance: society would be paying for a guarantee that a low-probability catastrophe will not occur (or is less likely)”. Stern too advises that our focus should be on potential extreme damage and that the economic community needs to refocus and combine current insights with “an examination and modelling of ways in which disruption and decline can occur”.

Whilst I was looking into this subject, I took the time to look over the completed 5th assessment report from the IPCC. First, it is important to stress that the IPCC acknowledge the array of uncertainties in predicting climate change. They state the obvious in that “the nonlinear and chaotic nature of the climate system imposes natural limits on the extent to which skilful predictions of climate statistics may be made”. They assert that the use of multiple scenarios and models is the best way we have for determining “a wide range of possible future evolutions of the Earth’s climate”. They also accept that “predicting socioeconomic development is arguably even more difficult than predicting the evolution of a physical system”.

The report uses a variety of terms in its findings, which I summarised in a previous post and reproduce below (broadly, “virtually certain” means a probability above 99%, “very likely” above 90%, “likely” above 66%, and “more likely than not” above 50%).

Figure: IPCC Uncertainty Terms

Under the medium-term prediction section (Chapter 11), which covers the period 2016 to 2035 relative to the reference period 1986 to 2005, notable predictions include:

  • The projected change in global mean surface air temperature will likely be in the range 0.3 to 0.7°C (medium confidence).
  • It is more likely than not that the mean global mean surface air temperature for the period 2016–2035 will be more than 1°C above the mean for 1850–1900, and very unlikely that it will be more than 1.5°C above the 1850–1900 mean (medium confidence).
  • Zonal mean precipitation will very likely increase in high and some of the mid-latitudes, and will more likely than not decrease in the subtropics. The frequency and intensity of heavy precipitation events over land will likely increase on average in the near term (this trend will not be apparent in all regions).
  • It is very likely that globally averaged surface and vertically averaged ocean temperatures will increase in the near term. It is likely that there will be increases in salinity in the tropical and (especially) subtropical Atlantic, and decreases in the western tropical Pacific over the next few decades.
  • In most land regions the frequency of warm days and warm nights will likely increase in the next decades, while that of cold days and cold nights will decrease.
  • There is low confidence in basin-scale projections of changes in the intensity and frequency of tropical cyclones (TCs) in all basins to the mid-21st century and there is low confidence in near-term projections for increased TC intensity in the North Atlantic.

The last bullet point is especially interesting for the part of the insurance sector involved in providing property catastrophe protection. I have reproduced two interesting projections below (note: no volcanic activity is assumed).

Figure: IPCC Near-Term Temperature Projections

Under the longer term projections in Chapter 12, the IPCC makes the definite statement that opened this post. It also states that it is virtually certain that, in most places, there will be more hot and fewer cold temperature extremes as global mean temperatures increase and that, in the long term, global precipitation will increase with increased global mean surface temperature.

I don’t know about you, but it seems to me a sensible course of action to take the scenarios that the IPCC is predicting with virtual certainty and apply a risk management approach to how we can prepare for or counteract such extremes, as recommended by experts like Pindyck and Stern.

The quote “it’s better to do something imperfectly than to do nothing perfectly” comes to mind. In this regard, for the sake of our children at the very least, we should embrace the imperfect art of climate change modelling and figure out how best to use such models in getting things done.