Category Archives: Insurance Market

Befuddled Lloyd’s

Lloyd’s of London always provides a fascinating insight into the London insurance market and beyond into the global specialty insurance market, as this previous post shows. Its Chairman, Bruce Carnegie-Brown, commented in the 2017 annual report that he expects “2018 to be another challenging year for Lloyd’s and the Corporation continues to refine its strategy to address evolving market conditions”. Given the bulking up of many of its competitors through M&A, which Willis recently called a reinvigoration of the “big balance sheet” reinsurance model, Lloyd’s needs to get busy sharpening its competitive edge. In a blunter message, Carnegie-Brown stressed that “the market’s 2017 results are proof, if any were needed, that business as usual is not sustainable”.

A look at the past 15 years of underwriting results gives an indicator of current market trends since the underwriting quality control unit, called the Franchise Board, was introduced at the end of 2002 after the disastrous 1990s for the 330-year-old institution.

click to enlarge

The trend of increasing non-CAT loss ratios after years of soft pricing, coupled with declining prior year reserve releases, is clear to see. That increases the pressure on the insurance sector to control expenses. To that end, Inga Beale, Lloyd’s CEO, is pushing hard on modernisation via the London Market Target Operating Model programme, stating that electronic placement will be mandated, on a phased basis, “to speed up the adoption of the market’s modernisation programme, which will digitise processes, reduce unsustainable expense ratios, and make Lloyd’s more attractive to do business with”.

The need to reduce expenses in Lloyd’s is acute given its expense ratio is around 40% compared to around 30% for most of its competitors. Management at Lloyd’s promised to “make it cheaper and easier to write business at Lloyd’s, enabling profitable growth”. Although Lloyd’s has doubled its gross premium volumes over the past 15 years, the results over varying timeframes below, particularly the reducing underwriting margins, show the importance of stressing profitable growth and expense efficiencies for the future.

click to enlarge

A peer comparison of Lloyd’s results over the past 15 years further illustrates the need for the market to modernise, as below. Although the 2017 combined ratios for some of the peer groupings have yet to be finalised and published (I will update the graph when they are), the comparison indicates that Lloyd’s has been doing worse than its reinsurance and Bermudian peers in recent years. It is suspicious to see that, alongside the big reinsurers and Bermudians, Lloyd’s included Allianz, CNA, and Zurich (and excluded Mapfre) in its competitor group from 2017. If you can’t meet your target, just change the metric behind the target!

click to enlarge

A recent report from Aon Benfield shows the breakdown of the combined ratio for their peer portfolio of specialist insurers and reinsurers from 2006 to 2017, as below.

click to enlarge

Aon Benfield Aggregate Combined Ratio 2006 to 2017

So, besides strong competitors, increasing loss ratios and heavy expense loads, what does Lloyd’s have to worry about? Well, in common with many, Lloyd’s must contend with structural changes across the industry as a result of what Willis calls, in their latest report, “the oversupply of capital” from investors in insurance linked securities (ILS) with a lower cost of capital, whereby the 2017 insured losses appear to have had “no impact upon appetite”, according to Willis.

I have posted many times, most recently here, on the impact ILS has had on property catastrophe pricing. The graph of the average multiple of coupon to expected loss on deals monitored by sector expert Artemis again illustrates the pricing trend. I have come up with another angle to tell the story, as per the graph below. I compared the Guy Carpenter rate on line (ROL) index for each year against an index of the annual change in the rolling 10-year average global catastrophe insured loss (which now stands at $66 billion for 2008-2017). Although it is somewhat unfair to compare a relative measure (the GC ROL index) against an absolute measure (change in average insured loss), it makes a point about the downward trend in property catastrophe reinsurance pricing in recent years, particularly when compared to the trend in catastrophic losses. To add potentially to the unfairness, I also included the rising volumes in the ILS sector, in an unsubtle finger point.

click to enlarge
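For anyone wanting to replicate the mechanics of that comparison, a minimal sketch is below (Python, using made-up loss figures rather than the actual data behind the graph), showing how a rolling 10-year average of annual insured catastrophe losses can be turned into an index for comparison against a relative pricing measure such as the GC ROL index.

```python
import pandas as pd

# Illustrative only: placeholder global insured catastrophe losses in $bn by
# year, not the actual data used in the graph above.
losses = pd.Series(
    [40, 55, 30, 120, 35, 45, 50, 75, 40, 60, 38, 52, 48, 58, 135],
    index=range(2003, 2018),
)

# rolling 10-year average of insured losses (e.g. the 2008-2017 window)
rolling_avg = losses.rolling(window=10).mean()

# index the rolling average to 100 at the first full 10-year window so the
# trend can be plotted against a relative measure like the GC ROL index
loss_index = 100 * rolling_avg / rolling_avg.dropna().iloc[0]

print(loss_index.dropna().round(1))
```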

Hilary Weaver, Lloyd’s CRO, recognises the danger and recently commented that “the new UK ILS regulation will, if anything, increase the already abundant supply of insurance capital” and “this is likely to mean that prices remain low for many risks, so we need to remain vigilant to ensure that the prices charged for them are proportionate to the risk”.

The impact extends beyond soft pricing and could alter Lloyd’s risk profile. The loss of high margin (albeit not as high as it once was) and low frequency/high severity business means that Lloyd’s will have to fish in an already crowded pond for less profitable and less volatile business. The combined ratios of Lloyd’s main business lines are shown below, illustrating that all, except casualty, had a rough 2017 amid competitive pressures and large losses.

As reinsurance business is commoditised further by ILS, in a prelude to an increase in machine/algorithm underwriting, Lloyd’s business will become less volatile and, as a result, less profitable. To illustrate, the lower graph below shows Lloyd’s historical weighted average combined ratio, using the 2017 business mix, versus the weighted average combined ratio excluding the reinsurance line. For 2003 to 2017, the result would be an increase in the average combined ratio from 95.8% to 96.5% and a reduction in volatility, with the standard deviation falling from 9.7% to 7%.

click to enlarge
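For what it’s worth, the sketch below (Python, with hypothetical line-level combined ratios and a stylised business mix, not the actual Lloyd’s segmental data) shows the calculation behind that comparison: a fixed-mix weighted average combined ratio, and its standard deviation over time, recalculated with the reinsurance line stripped out and the remaining weights re-scaled.

```python
import numpy as np

# Hypothetical combined ratios (%) by year for each line of business,
# 2003-2017 style data; illustrative only, not Lloyd's actual figures.
lines = {
    "reinsurance": [85, 90, 110, 80, 84, 88, 86, 92, 115, 89, 87, 86, 88, 90, 120],
    "property":    [92, 95, 105, 90, 93, 96, 94, 97, 108, 95, 94, 96, 97, 98, 112],
    "casualty":    [101, 100, 99, 98, 100, 101, 100, 99, 100, 101, 102, 101, 100, 99, 98],
}

# stylised 2017 gross premium mix (weights sum to 1)
mix = {"reinsurance": 0.35, "property": 0.30, "casualty": 0.35}

def weighted_cr(lines, mix):
    """Weighted average combined ratio by year for a fixed business mix."""
    total_weight = sum(mix.values())
    n_years = len(next(iter(lines.values())))
    return np.array([
        sum(mix[l] * lines[l][y] for l in mix) / total_weight
        for y in range(n_years)
    ])

incl_re = weighted_cr(lines, mix)
excl_re = weighted_cr(
    {k: v for k, v in lines.items() if k != "reinsurance"},
    {k: v for k, v in mix.items() if k != "reinsurance"},
)

print(f"incl. reinsurance: mean {incl_re.mean():.1f}%, stdev {incl_re.std(ddof=1):.1f}%")
print(f"excl. reinsurance: mean {excl_re.mean():.1f}%, stdev {excl_re.std(ddof=1):.1f}%")
```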

To write off Lloyd’s, however, would be a big mistake. In my view, there remains an important role for a specialist marketplace for heterogeneous risks, where diverse underwriting expertise cannot be easily replicated by machines. Lloyd’s has shown its ability in the past to evolve and adapt, although unfortunately usually only when it has no other choice. Hopefully, this legendary 330-year-old institution will get ahead of the game and dictate its own future. It will be interesting to watch.

 

Epilogue – Although this analogy has limitations, it occurs to me that the insurance sector is at a stage of evolution similar to where the betting sector was about a decade ago (my latest post on that sector is here). Traditional insurers, with over-sized expenses, operate like old traditional betting shops with paper slips and manual operations. The onset of online betting fundamentally changed the way business is transacted and, as a result, the structure of the industry. The upcoming digitalisation of the traditional insurance business will radically change the cost structure of the industry. Lloyd’s should look to the example of Betfair (see an old post on Betfair for more) as a means of digitalising the market platform and radically reducing costs.

Follow-on 28th April – Many thanks to Adam at InsuranceLinked for re-posting this post. A big welcome to new readers; I hope you will stick around and check out some other posts from this blog. I just came across this report from Oliver Wyman on the underwriter of the future that’s worth a read. They state that the “commercial and wholesale insurance marketplaces are undergoing radical change” and that they “expect that today’s low-price environment will continue for the foreseeable future, continuing to put major pressure on cost”.

Artificial Insurance

The digital transformation of existing business models is a theme of our age. Robotic process automation (RPA) is one of the many acronyms to have found its way into the terminology of businesses today. I highlighted the potential for telecoms to digitalise their business models in this post. Klaus Schwab of the World Economic Forum, in his book “The Fourth Industrial Revolution”, refers to the current era as one of “new technologies that are fusing the physical, digital and biological worlds, impacting all disciplines, economies and industries, and even challenging ideas about what it means to be human”.

The financial services business is one that is regularly touted as being ripe for transformation, with fintech being the much-hyped buzzword. I last posted here and here on fintech and insurtech, the use of technology innovations designed to squeeze savings and efficiency out of existing insurance business models.

Artificial intelligence (AI) is used as an umbrella term for everything from process automation, to robotics and to machine learning. As referred to in this post on equity markets, the Financial Stability Board (FSB) released a report called “Artificial Intelligence and Machine Learning in Financial Services” in November 2017. In relation to insurance, the FSB report highlights that “some insurance companies are actively using machine learning to improve the pricing or marketing of insurance products by incorporating real-time, highly granular data, such as online shopping behaviour or telemetrics (sensors in connected devices, such as car odometers)”. Other areas highlighted include machine learning techniques in claims processing and the preventative benefits of remote sensors connected through the internet of things. Consultants are falling over themselves to get on the bandwagon as reports from the likes of Deloitte, EY, PwC, Capgemini, and Accenture illustrate.

One of the better recent reports on the topic is this one from the reinsurer SCOR. CEO Denis Kessler states that “information is becoming a commodity, and AI will enable us to process all of it” and that “AI and data will take us into a world of ex-ante predictability and ex-post monitoring, which will change the way risks are observed, carried, realized and settled”. Kessler believes that AI will impact the insurance sector in 3 ways:

  • Reducing information asymmetry and bringing comprehensive and dynamic observability in the insurance transaction,
  • Improving efficiencies and insurance product innovation, and
  • Creating new “intrinsic” AI risks.

I found one article in the SCOR report, by Nicolas Miailhe of the Future Society at the Harvard Kennedy School, particularly interesting. Whilst talking about the overall AI market, Miailhe states that “the general consensus remains that the market is on the brink of a revolution, which will be characterized by an asymmetric global oligopoly” and that the “market is qualified as oligopolistic because of the association between the scale effects and network effects which drive concentration”. When referring to an oligopoly, Miailhe highlights two global blocs – GAFA (Google/Apple/Facebook/Amazon) and BATX (Baidu/Alibaba/Tencent/Xiaomi). In the insurance context, Miailhe states that “more often than not, this will mean that the insured must relinquish control, and at times, the ownership of data” and that “the delivery of these new services will intrude heavily on privacy”.

At a more mundane level, Miailhe highlights the difficulty for stakeholders such as auditors and regulators to understand the business models of the future which “delegate the risk-profiling process to computer systems that run software based on “black box” algorithms”. Miailhe also cautions that bias can infiltrate algorithms as “algorithms are written by people, and machine-learning algorithms adjust what they do according to people’s behaviour”.

In a statement that seems particularly relevant today in terms of the current issue around Facebook and data privacy, Miailhe warns that “the issues of auditability, certification and tension between transparency and competitive dynamics are becoming apparent and will play a key role in facilitating or hindering the dissemination of AI systems”.

Now, that’s not something you’ll hear from the usual cheerleaders.

Insurance M&A Pickup

It’s been a while since I posted on the specialty insurance sector and I hope to post some more detailed thoughts and analysis when I get the time in the coming months. M&A activity has picked up recently with the XL/AXA and AIG/Validus deals being the latest examples of big insurers bulking up through M&A. Deloitte has an interesting report out on some of the factors behind the increased activity. The graph below shows the trend of the average price to book M&A multiples for P&C insurers.

click to enlarge

As regular readers will know, my preferred metric is price to tangible book value and the exhibit below shows that the multiples on recent deals are increasing and well above the standard multiple of around 1.5X. That said, the prices are not as high as the silly prices of above 2X paid by Japanese insurers in 2015. Not yet anyway!

click to enlarge

Unless there are major synergies, either on the operating side or on the capital side (which seems to be AXA’s justification for the near 2X multiple on the XL deal), I just can’t see how a 2X multiple is justified in a mature sector. Assuming these firms can earn a 10% return on tangible book value over multiple cycles, a 2X multiple equates to 20X earnings!
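The arithmetic behind that last sentence is simple enough to sketch (Python, illustrative figures only):

```python
# Implied earnings multiple from a price-to-tangible-book multiple and an
# assumed through-the-cycle return on tangible book value (illustrative).
price_to_tangible_book = 2.0    # roughly the multiple on recent deals
return_on_tangible_book = 0.10  # assumed 10% return over the cycle

# earnings = return_on_tangible_book * tangible_book, so
# price / earnings = price_to_tangible_book / return_on_tangible_book
implied_pe = price_to_tangible_book / return_on_tangible_book
print(f"Implied P/E multiple: {implied_pe:.0f}x")  # 20x
```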

Time will tell who the next M&A target will be….

Cloudfall

More and more business is moving to the cloud and, given the concentration of providers and their interlinkages, this is creating security challenges. In the US, 15 cloud providers account for 70% of the market.

The National Institute of Standards and Technology (NIST) describes the cloud as a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.

A cloud solution is typically architected with multiple regions, where a region is a geographical location in which users can run their resources, and each region is typically made up of multiple zones. All major cloud providers have multiple regions located across the globe and within the US. For example, Rackspace has the fewest regions at 7, whereas Microsoft Azure has the most at 36.

The industry is projected to grow at a compound annual growth rate of 36% between 2014 and 2026, as per the graph below. Software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS) are the types of cloud services sold.

click to enlarge

Control of the underlying cloud infrastructure of networks, servers, operating systems, and storage is the responsibility of the cloud provider, with the user having control over the deployed applications and possibly configuration settings for the application-hosting environment.

Amazingly, however, the main responsibility for protecting corporate data in the cloud lies not with the cloud provider but with the cloud customer, unless specifically agreed otherwise. Jay Heiser of Gartner commented that “we are in a cloud security transition period in which focus is shifting from the provider to the customer” and that businesses “are learning that huge amounts of time spent trying to figure out if any particular cloud service provider is secure or not has virtually no payback”.

An organisation called the Cloud Security Alliance (CSA) has issued a report on the security threats to the cloud. These include the usual threats, such as data breaches, denial of service (DoS), advanced persistent threats (APTs) and malicious insiders. For the cloud, add in threats such as insufficient access management, insecure user interfaces (UIs) and application programming interfaces (APIs), and shared technology vulnerabilities.

Cyber security is an important issue today and many businesses, particularly larger businesses, are turning to insurance to mitigate the risks to their organisations, as the graph below on cyber insurance take-up rates shows.

click to enlarge

Lloyd’s of London recently released an interesting report called Cloud Down that estimated the business interruption costs in the US arising from the sustained loss of access to a cloud service provider. The report estimates, using a standard catastrophe modelling framework from AIR, that a cyber incident taking a top-3 cloud provider offline in the US for 3-6 days would result in ground-up loss central estimates of between $7 billion and $15 billion and insured losses of between $1.5 billion and $3 billion. By necessity, the assumptions used in the analysis are fairly crude and basic.

Given the number of bad actors in the cyber world, particularly those who may intend to cause maximum disruption, security failings around the cloud could, in my view, result in losses many multiples of those projected by Lloyd’s if several cloud providers were taken down for longer periods. And that’s scary.

Beautiful Models

It has been a while since I posted on dear old Solvency II (here). As highlighted in the previous post on potential losses, the insurance sector is perceived as having robust capital levels that mitigate against the current pricing and investment return headwinds. It is therefore interesting to look at some of the detail emerging from the new Solvency II framework in Europe, including the mandatory disclosures in the new Solvency and Financial Condition Report (SFCR).

The June 2017 Financial Stability report from EIOPA, the European insurance regulator, contains some interesting aggregate data from across the European insurance sector. The graph below shows solvency capital requirement (SCR) ratios, primarily driven by the standard formula, averaging consistently around 200% for non-life, life and composite insurers. The ratio is eligible own funds (broadly, the excess of assets over liabilities as per Solvency II valuation rules) divided by the regulatory capital requirement, as calculated by a mandated standard formula or a firm’s own internal model; for example, own funds of €4 billion against an SCR of €2 billion gives a ratio of 200%. As the risk profile of each business model would suggest, the variability around the average SCR ratio is largest for the non-life insurers, followed by life insurers, with the least volatile being the composite insurers.

click to enlarge

For some reason which I can’t completely comprehend, the EIOPA Financial Stability report highlights differences in the SCR breakdown (as per the standard formula, expressed as a % of net basic SCR) across countries, as per the graph below, presumably due to the different profiles of each country’s insurance sector.

click to enlarge
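To put the standard formula mechanics in context, below is a minimal sketch (Python, with hypothetical module SCRs and indicative correlation assumptions of the kind set out in the Solvency II delegated regulation, not any firm’s actual figures) of how the module SCRs aggregate into a basic SCR and how each module can be expressed as a percentage of it.

```python
import numpy as np

# Hypothetical module SCRs in EUR millions for a composite insurer.
modules = ["market", "counterparty", "life", "health", "non-life"]
scr = np.array([400.0, 80.0, 250.0, 60.0, 150.0])

# Correlation matrix of the type used in the standard formula aggregation
# (indicative entries; the actual values are set in the delegated regulation).
corr = np.array([
    [1.00, 0.25, 0.25, 0.25, 0.25],
    [0.25, 1.00, 0.25, 0.25, 0.50],
    [0.25, 0.25, 1.00, 0.25, 0.00],
    [0.25, 0.25, 0.25, 1.00, 0.00],
    [0.25, 0.50, 0.00, 0.00, 1.00],
])

# basic SCR = sqrt( sum_ij corr_ij * SCR_i * SCR_j )
bscr = np.sqrt(scr @ corr @ scr)

for name, value in zip(modules, scr):
    print(f"{name:>12}: {100 * value / bscr:.0f}% of basic SCR")
print(f"diversified basic SCR: {bscr:.0f} vs undiversified sum: {scr.sum():.0f}")
```

The gap between the undiversified sum and the aggregated figure is the diversification benefit, which is one reason the SCR breakdowns in these graphs are tricky to compare across countries and firms.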

A review across several SFCRs from the larger European insurers and reinsurers who use internal models to calculate their SCRs highlights the differences in their risk profiles. A health warning on any such comparison should be stressed given the different risk categories and modelling methodologies used by each firm (the varying treatment of asset credit risk or business/operational risk are good examples of the differing approaches). The graph below shows each main risk category as a percentage of the undiversified total SCR.

click to enlarge

By way of putting the internal model components in context, the graph below shows the SCR breakdown as a percentage of total assets (which obviously reflects insurance liabilities and the associated capital held against them). This comparison is also fraught with difficulty, as a (re)insurer’s total assets are not necessarily a reliable measure of extreme insurance exposure in the same way that risk-weighted assets are for banks (used as the denominator in bank capital ratios). For example, some life insurers can have low insurance-related liabilities and associated assets (e.g. for mortality related business) compared to other insurance products (e.g. most non-life exposures).

Notwithstanding that caveat, the graph below shows a marked difference between firms depending upon whether they are a reinsurer or insurer, or whether they are a life, non-life or composite insurer (other items such as retail versus commercial business, local or cross-border, specialty versus homogeneous are also factors).

click to enlarge

Initial reactions by commentators on the insurance sector to the disclosures by European insurers through SFCRs have been mixed. Some have expressed disappointment at the level and consistency of detail being disclosed. Regulators will have their hands full in ensuring that sufficiently robust standards relating to such disclosures are met.

Regulators will also have to ensure a fair and consistent approach is adopted in calculating SCRs across all European jurisdictions, particularly for those calculated using internal models, whilst avoiding the pitfall of forcing everybody to use the same assumptions and methodology. Recent reports suggest that EIOPA is looking for a greater role in approving internal models across Europe. Systemic model risk under the Basel II banking regulatory rules, published in 2004, was arguably one of the contributors to the financial crisis.

Only time will tell if Solvency II has avoided the mistakes of Basel II in the handling of such beautiful models.