Ramblin’ on Gamblin’

If you exclude investing, I am not a gambler. However, I do find the gambling sector fascinating. I have been posting on the sector for over four years now (see posts under the Gambling Sector category). As an example of an old bricks-and-mortar sector that has been revolutionised in recent years by the internet and smartphones, it is illuminating. As I said in a previous post, “this sector is haunted by regulatory risk”, and this post will run through some regulatory developments, as well as business ones.

Late in December last year, Ladbrokes Coral (LCL) agreed to a takeover by GVC, the Isle of Man consolidator that owns BWIN, Sportingbet, PartyPoker and Foxy Bingo. The smaller GVC, with 2017 revenue of €0.9 billion, structured an innovative deal for the larger LCL, with 2017 revenue of approx. £2.4 billion (I will update these figures when LCL announces its final 2017 figures in the coming days), with a sliding-scale valuation based upon the UK Government’s triennial review of the sector.

The UK regulator and the government’s adviser on the issue, the Gambling Commission, today released its advice on fixed-odds betting terminals (FOBTs), often described as the “crack cocaine” of gambling. The Gambling Commission recommended a limit of £2 for “slot” style games, called B2 slots, and “a stake limit at or below £30” for other non-slot B2 games, such as the popular roulette games. Although LCL and William Hill shares popped today, the final decision is a political one, and it is by no means certain that a limit lower than £30 will not be implemented.

Based upon the sliding scale in the LCL/GVC deal and some assumptions on retail operating cost cuts under different FOBT stake limits, the graphic below shows the potential impact upon the business of LCL, William Hill (WMH) and Paddy Power Betfair (PPB). Again, the LCL figures are extrapolated from H1 results and will be updated for the final 2017 figures.

[Graphic: potential impact of FOBT stake limits on LCL, WMH and PPB]

Based upon very rough estimates, the recommended limits could result in around 400 to 700 betting shop disposals or closures by the bigger firms, albeit these shops are likely to be the least attractive to rivals or smaller firms. These estimates do not take into account potential mitigating actions by the betting firms. Lost FOBT revenue could be made up by increased sports betting facilitated by the introduction of self-service betting terminals (SSBTs), which allow punters to gamble on new betting products.

Point of consumption (PoC) taxes have been introduced in countries such as the UK and Ireland in recent years, are now payable in South Australia, and have been announced in Western Australia. The other states in Australia are likely to introduce PoC taxes in 2019. These developments caused WMH to take a write-down on its Australian operations and sell them in March to the Canadian poker firm the Stars Group (formerly the colourful Amaya), owner of PokerStars, PokerStars Casino, BetStars, and Full Tilt. The Stars Group (TSG) also increased its ownership in the Australian operator CrownBet in March, which it intends to merge with the William Hill Australian operation. PPB was reported to have been interested in CrownBet previously but was obviously beaten on price by TSG. PPB had the exhibit below in their 2017 results presentation on the non-retail Australian market.

[Exhibit: PPB 2017 results presentation on the non-retail Australian market]

A more positive regulatory development in the coming months could be a favourable decision by the US Supreme Court on the future of the Professional and Amateur Sports Protection Act of 1992 (PASPA). This whitepaper from the Massachusetts Gaming Commission gives a good insight into the legal issues under consideration and the implications for the US sector of the different possible Supreme Court decisions, such as upholding PASPA, a narrow ruling, or a full PASPA strike-down. Other issues in the US include the terms under which individual states legislate for betting. The consultants Eilers & Krejcik opine that “a market incorporating both land-based and online sports betting products could be worth over two times a market that is restricted to land-based sports betting alone”, although they conclude that “many – perhaps even most – states will choose to delay or forgo online”. Many states may well follow Nevada’s example and require online accounts to be initiated by a land-based provider, with age and ID verification conducted on premises.

Sports betting in the US is generally low margin, with WMH reporting US gross win margins around 6%. Other headwinds for the US sector include rent-seeking participants such as the sports bodies looking for “integrity fees” (a figure of 1% of the amount staked, called the handle in the US, has been suggested) and aggressive tax policies and levels by individual states. This paper by Michelle Minton outlines some fascinating background on PASPA and argues that any legalisation of betting across the US must be pitched at a level to counter the illegal market, estimated at $120 billion per year, some 20 times the size of the current legal sector in the US.
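To put that proposed 1% integrity fee in context, a quick back-of-the-envelope calculation (my own rough sketch, using only the percentages quoted above) shows how much of a bookmaker’s gross win such a fee would consume:

```python
# Rough sketch: impact of a 1% "integrity fee" on handle for a sportsbook
# running a ~6% gross win margin. Figures are illustrative, derived only
# from the percentages quoted above.

handle = 100.0                # amount staked (the "handle"), arbitrary base
gross_win_margin = 0.06       # WMH-reported US gross win margin of ~6%
integrity_fee_rate = 0.01     # suggested 1% fee on the handle

gross_win = handle * gross_win_margin          # 6.0
integrity_fee = handle * integrity_fee_rate    # 1.0

print(f"Gross win: {gross_win:.1f}")
print(f"Integrity fee: {integrity_fee:.1f}")
print(f"Fee as share of gross win: {integrity_fee / gross_win:.0%}")   # ~17%
```

In other words, a fee of 1% of stakes would swallow roughly a sixth of gross win before any operating costs or taxes, which is why the operators view it as such a threat.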

Philip Bowcock, the CEO of William Hill, in the 2017 results call summarised the opportunity in the US as follows:

“We do not quite know how the economics will work out because, as I said, there are three ways this could potentially go. It could either go purely retail, only taking sports bets in a retail environment. It could be the Nevada model, which is retail plus a mobile app signed up for in the retail environment. Or it could go completely remote registration, which is as we have in the UK. I do not expect every state in the US to regulate, and if they do, to go for that end model. I think each one is going to be different, and that is going to decide where we are as to what the economics are going to be.”

So, those are some of the regulatory issues challenging the sector today. In terms of the historical and 2017 sportsbook margins shown below, I have spent some time revisiting my data and extracting more accurate figures, particularly in relation to historical Ladbrokes sportsbook net revenue margins. H2 sportsbook results, particularly Q4 results, were very favourable for the bookies. I estimate that the Q4 figures for PPB improved their full-year net revenue margin on their sportsbook by 120 basis points. I debated whether to adjust the 2017 figures for the Q4 results but decided against it, as the results reflect the volatility of the business and good or bad results should be left alone. It is the gambling business after all! As above, the LCL numbers are those extrapolated from H1 results, with an uplift for the favourable H2 results, and will be updated when the actual results are available.

[Graph: historical and 2017 sportsbook net revenue margins]

On the specific results from PPB, their 2017 EPS came in at £3.98, ahead of my August estimate of £3.72 for 2017 (see post here) but still behind my more optimistic March estimate of £4.14 (see post here). The lacklustre online sportsbook results are a concern (revenues up 8%, compared to 14% and 25% for WMH and LCL respectively), as are the declining online gaming revenues. Increases in PoC taxes in Australia will impact operating margins in 2019, and FX will be a tailwind in 2018. Increased IT resources and investment in promoting new products and the Paddy Power brand are the focus of management in 2018, ahead of the World Cup. Discipline on M&A, as demonstrated by walking away from a CrownBet deal, is also highlighted, as is potential firepower of £1.2 billion for opportunistic deals.

My new estimates for 2018 & 2019, after factoring in the items above, are £4.36 and £4.51 respectively, as below. These represent earnings multiples around 17 for PPB, not quite as rich as in the past, but justified given the 2017 results and the headwinds ahead. PPB must now show that it can deliver in 2018, a World Cup year, to maintain this diminished but still premium valuation.

[Exhibit: PPB earnings estimates for 2018 and 2019]

The coming months in this sector will be interesting. The fate of firms such as 888, with a market cap around £1 billion, will likely be in the mix (interesting that it fits within PPB’s budget!). WMH and 888 have tangoed in the past to no avail. Further dances by the players in this fascinating sector are highly likely.

Insurance M&A Pickup

It’s been a while since I posted on the specialty insurance sector, and I hope to post some more detailed thoughts and analysis when I get the time in the coming months. M&A activity has picked up recently, with the XL/AXA and AIG/Validus deals being the latest examples of big insurers bulking up through M&A. Deloitte has an interesting report out on some of the factors behind the increased activity. The graph below shows the trend in average price-to-book M&A multiples for P&C insurers.

[Graph: average price-to-book M&A multiples for P&C insurers]

As regular readers will know, my preferred metric is price to tangible book value, and the exhibit below shows that the multiples on recent deals are increasing and are well above the standard multiple of around 1.5X. That said, the prices are not as high as the silly prices of above 2X paid by Japanese insurers in 2015. Not yet anyway!

[Exhibit: price to tangible book multiples on recent insurance deals]

Unless there are major synergies, either on the operating side or on the capital side (which seems to be AXA’s justification for the near 2X multiple on the XL deal), I just can’t see how a 2X multiple is justified in a mature sector. Assuming these firms can earn a 10% return on tangible assets over multiple cycles, a 2X multiple equates to 20X earnings!
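The arithmetic behind that claim is straightforward; a minimal sketch, using the 2X multiple and the assumed 10% return above:

```python
# Sketch: translating a price-to-tangible-book multiple into an implied
# earnings multiple, assuming a steady return on tangible book value.

price_to_tangible_book = 2.0    # the ~2X multiple paid on recent deals
return_on_tangible = 0.10       # assumed 10% return over multiple cycles

# Earnings = tangible book x return, so P/E = (P/TBV) / return
implied_pe = price_to_tangible_book / return_on_tangible
print(f"Implied earnings multiple: {implied_pe:.0f}X")   # 20X
```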

Time will tell who the next M&A target will be….


Cloud Security Challenges

More and more business is moving to the cloud and, given the concentration of providers and their interlinkages, this is creating security challenges. In the US, 15 cloud providers account for 70% of the market.

The National Institute of Standards and Technology (NIST) describes the cloud as a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.

A cloud solution is typically architected with multiple regions, a region being a geographical location where users can run their resources, typically made up of multiple zones. All major cloud providers have multiple regions, located across the globe and within the US. For example, Rackspace has the fewest regions at 7, whereas Microsoft Azure has the most at 36.

The industry is projected to grow at a compound annual growth rate (CAGR) of 36% between 2014 and 2026, as per the graph below. Software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS) are the types of cloud services sold.

[Graph: projected growth of SaaS, PaaS and IaaS cloud services]
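Just to spell out what such a growth rate implies, a quick sketch of the compounding arithmetic (my own calculation, using only the projection quoted above):

```python
# Sketch: the growth multiple implied by a 36% CAGR between 2014 and 2026.
cagr = 0.36
years = 2026 - 2014                     # 12 years

growth_multiple = (1 + cagr) ** years
print(f"Implied growth over {years} years: {growth_multiple:.0f}x")   # ~40x
```

That is an extraordinary pace of growth for any industry, if the projection holds.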

Control of the underlying cloud infrastructure of networks, servers, operating systems, and storage is the responsibility of the cloud provider, with the user having control over the deployed applications and possibly configuration settings for the application-hosting environment.

Amazingly, however, the main responsibility for protecting corporate data in the cloud lies not with the cloud provider but with the cloud customer, unless specifically agreed otherwise. Jay Heiser of Gartner commented that “we are in a cloud security transition period in which focus is shifting from the provider to the customer” and that businesses “are learning that huge amounts of time spent trying to figure out if any particular cloud service provider is secure or not has virtually no payback”.

An organisation called the Cloud Security Alliance (CSA) has issued a report on the security threats to the cloud. These include the usual threats such as data breaches, denial of service (DoS), advanced persistent threats (APTs) and malicious insiders. For the cloud, add in threats such as insufficient access management, insecure user interfaces (UIs) and application programming interfaces (APIs), and shared technology vulnerabilities.

Cyber security is an important issue today, and many businesses, particularly larger businesses, are turning to insurance to mitigate the risks to their organisations, as the graph below on cyber insurance take-up rates shows.

[Graph: cyber insurance take-up rates]

Lloyd’s of London recently released an interesting report called Cloud Down that estimated the business interruption costs in the US arising from the sustained loss of access to a cloud service provider. Using a standard catastrophe modelling framework from AIR, the report estimates that a cyber incident taking a top 3 US cloud provider offline for 3-6 days would result in ground-up loss central estimates of $7-15 billion and insured losses of $1.5-3 billion. By necessity, the assumptions used in the analysis are fairly crude and basic.
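One striking implication of those central estimates is how little of the economic loss would be insured; a quick calculation (my own arithmetic on the figures above):

```python
# Sketch: implied insured share in the Lloyd's "Cloud Down" scenario,
# pairing the low and high central estimates quoted above.
ground_up = {"low": 7e9, "high": 15e9}   # ground-up loss estimates ($)
insured = {"low": 1.5e9, "high": 3e9}    # insured loss estimates ($)

for case in ("low", "high"):
    share = insured[case] / ground_up[case]
    print(f"{case} case: insured share of ground-up loss = {share:.0%}")
# low case: ~21%, high case: 20% -> only about a fifth of the loss is insured
```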

Given the number of bad actors in the cyber world, particularly those who may intend to cause maximum disruption, security failings around the cloud could, in my view, result in losses of many multiples of those projected by Lloyd’s if several cloud providers were taken down for longer periods. And that’s scary.

CenturyLink 2018 Preview

I don’t do this often, but as I am travelling this week I thought I’d give some predictions on the Q4 announcement from CenturyLink (CTL), due after market close this Valentine’s Day. As my last post on the topic in August stated, I am taking a wait-and-see approach on CTL, to assess whether enough progress has been made on the integration and the balance sheet to safeguard the dividend. Although CTL is down 15% since my last post, the history of LVLT has taught me that extracting costs from a business with (at best) flat-lining revenue will be a volatile road over the coming quarters and years. Add in high debt loads in a rising interest rate environment, and any investment in CTL, with a medium-term holding horizon, must be timed to perfection in this market.

The actual results for Q4 matter little, except to see revenue trends for the combined entity, particularly as, according to their last 10Q, they “expect to recognize approximately $225 million in merger-related transaction costs, including investment banker and legal fees”. They will likely kitchen-sink the quarter’s results. The key will be the guidance for 2018, which the new management team (i.e. the old LVLT CEO and CFO) have cautioned will only be an annual range for bottom-line metrics like EBITDA and free cashflow. Although many analysts know the LVLT management’s form, it may take a while for wider Wall Street to get away from top-line trends, particularly in the rural consumer area, and to measuring CTL primarily as a next generation enterprise communication provider.

At a recent investor conference, the CFO Sunit Patel gave some further colour on their targets. Capex would be set at 16% of revenues; the target for margin expansion is 5%-7% over the next 3 to 5 years, reflecting cost synergies coming into effect faster than previously indicated; and they will refocus the consumer business on higher speeds “more surgically [in terms of return on capital] in areas that have higher population densities, better socioeconomic demographics, better coexistence with businesses and where wireless infrastructure might be needed”. Based upon these targets, and assuming average LIBOR of 3% and 4% for 2018 and 2019 respectively plus flat-line annual revenue for the next 3 years (although a better revenue mix emerges), I estimate that a valuation between $20 and $25 per share is justified, albeit with a lot of execution risk around achieving Sunit’s targets.

On guidance for 2018, I am hoping for EBITDA guidance around $9.25 billion and capex of $4 billion. I would be disappointed in EBITDA guidance with a lower mid-point (as would the market in my view). I also estimate cash interest expense of $2.25-$2.5 billion on net debt of $36-$36.5 billion. Dividend costs for the year should be about $2.35 billion.
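On those numbers, a crude dividend coverage calculation (a sketch using my estimates above, ignoring cash taxes, working capital and integration costs) shows why the EBITDA guidance matters so much for the dividend:

```python
# Rough sketch of CTL 2018 free cash flow and dividend coverage,
# using the hoped-for guidance figures above. Illustrative only.

ebitda = 9.25e9                          # hoped-for 2018 EBITDA guidance
capex = 4.0e9                            # hoped-for capex guidance
cash_interest = (2.25e9 + 2.5e9) / 2     # mid-point of my interest estimate
dividends = 2.35e9                       # estimated annual dividend cost

free_cash_flow = ebitda - capex - cash_interest
coverage = free_cash_flow / dividends

print(f"Free cash flow: ${free_cash_flow / 1e9:.2f} billion")   # ~$2.88 billion
print(f"Dividend coverage: {coverage:.2f}x")                    # ~1.22x
```

On this simplified arithmetic, the dividend is covered with only modest headroom, so any disappointment on the EBITDA or capex lines quickly eats into that cushion.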

Wednesday’s result will be interesting, particularly the market’s assessment of the plausibility of management’s targets for this high dividend yielding stock. There is plenty of time for this story to unfold over the coming quarters. For CTL and their people, I hope it’s not a Valentine’s Day massacre.

Follow-up after results:

I am still going through the actual results released on the 14th of February, but my initial reaction is that the 2018 EBITDA guidance is a lot lower than I expected. Based upon a pro forma 2017 EBITDA margin of 36.1%, I had factored in a more rapid margin improvement for 2018 to get to my EBITDA figure of $9.25 billion, compared with the approximate 80 basis point improvement implied by the $8.85 billion mid-point of the 2018 guidance. In response to an analyst’s question on the 5%-7% margin improvement expected over the next 3 to 5 years, Sunit responded as follows:

On the EBITDA margin, to your question, I think, in general, we continue to expect to see the EBITDA margin expand nicely over the next 3 to 5 years. I think we said even at the time of the announcement that with synergies and everything pro forma, we should be north of 40% plus EBITDA margins over the next few years and we continue to feel quite confident and comfortable with that. So I think you will see the margin expansion in terms of the basis points that you described.
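Working backwards from the guidance mid-point gives a sense of the revenue base these margins imply (my own rough sanity check on the figures above, not the company’s disclosure):

```python
# Sketch: the 2018 revenue implied by the EBITDA guidance mid-point,
# given the pro forma 2017 margin plus the ~80bp improvement noted above.

guidance_midpoint = 8.85e9     # mid-point of 2018 EBITDA guidance
margin_2017 = 0.361            # pro forma 2017 EBITDA margin
margin_uplift = 0.008          # approximate 80 basis point improvement

implied_revenue = guidance_midpoint / (margin_2017 + margin_uplift)
print(f"Implied 2018 revenue: ${implied_revenue / 1e9:.1f} billion")   # ~$24 billion
```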

I will go through the figures (and maybe the 10K) to revise my estimates and post my conclusion in the near future.

The Bionic Invisible Hand

Technology is omnipresent. The impacts of technology on markets and market structures have been a topic of much debate recently. Some point to its influence to explain the lack of volatility in equity markets (ignoring this week’s wobble). Marko Kolanovic, a JPMorgan analyst, is reported to have estimated that a mere 10% of US equity market trading is now conducted by discretionary human traders.

The first wave of high frequency trading (HFT) brought about distortive practices by certain players, such as front running and spoofing, as detailed in Michael Lewis’s bestselling exposé Flash Boys. Now HFT firms are struggling to wring profits from the incremental millisecond, as reported in this FT article, with revenues for HFT firms trading US stocks falling below $1 billion in 2017 from over $7 billion in 2009, according to the consultancy Tabb Group. According to Doug Duquette of Vertex Analytics, “it has got to the point where the speed is so ubiquitous that there really isn’t much left to get”.

The focus now is on the impact of various rules-based automatic investment systems, ranging from exchange traded funds (ETFs) to computerised high-speed trading programs to new machine learning and artificial intelligence (AI) innovations. As Tom Watson said about HFT in 2011, these new technologies have the potential to give “Adam Smith’s invisible hand a bionic upgrade by making it better, stronger and faster like Steve Austin in the Six Million Dollar Man”.

As reported in another FT article, some experts estimate that computers are now generating around 50% to 70% of trading in equity markets, 60% of futures and more than 50% of treasuries. According to Morningstar, by year-end 2017 the total assets of actively managed funds stood at $11.4 trillion compared with $6.7 trillion for passive funds in the US.

Although the term “quant fund” covers a multitude of mutual and hedge fund strategies, funds under certain classifications are estimated to manage around $1 trillion in assets, out of total assets under management (AUM) invested in mutual funds globally of over $40 trillion. Machine learning or AI is believed to drive only a small subset of quant funds’ trades, although such systems are thought to be used as investment tools for developing strategies by an increasing number of investment professionals.

Before I delve into these issues further, I want to take a brief detour into the wonderful world of quantitative finance expert Paul Wilmott and his recent book, with David Orrell, called “The Money Formula: Dodgy Finance, Pseudo Science, and How Mathematicians Took Over the Markets”. I will try to summarise the pertinent issues highlighted by the authors through the following sequence of my favourite quotes from the book:

“If anybody can flog an already sick horse to death, it is an economist.”

“Whenever a model becomes too popular, it influences the market and therefore tends to undermine the assumptions on which it was built.”

“Real price data tend to follow something closer to a power-law distribution and are characterized by extreme events and bursts of intense volatility…which are typical of complex systems that are operating at a state known as self-organized criticality…sometimes called the edge of chaos.”

“In quantitative finance, the weakest links are the models.”

“The only half decent, yet still toy, models in finance are the lognormal random walk models for those instruments whose level we don’t care about.”

“The more apparently realistic you make a model, the less useful it often becomes, and the complexity of the equations turns the model into a black box. The key then is to keep with simple models, but make sure that the model is capturing the key dynamics of the system, and only use it within its zone of validity.”

“The economy is not a machine, it is a living, organic system, and the numbers it produces have a complicated relationship with the underlying reality.”

“Calibration is a simple way of hiding model risk, you choose the parameters so that your model superficially appears to value everything correctly when really, it’s doing no such thing.”

“When their [quants] schemes, their quantitative seizing, cratered, the central banks stepped in to fill the hole with quantitative easing.”

“Bandwagons beget bubbles, and bubbles beget crashes.”

“Today, it is the risk that has been created by high speed algorithms, all based on similar models, all racing to be the first to do the same thing.”

“We have outsourced ethical judgments to the invisible hand, or increasingly to algorithms, with the result that our own ability to make ethical decisions in economic matters has atrophied.”

According to Morningstar’s annual fund flow report, flows into US mutual funds and ETFs reached a record $684.6 billion in 2017 due to massive inflows into passive funds. Among fund categories, the biggest winners were passive US equity, international equity and taxable bond funds, each with inflows of more than $200 billion. “Indexing is no longer limited to U.S. equity and expanding into other asset classes,” according to the Morningstar report.

[Graph: 2017 US fund flows into active and passive funds]

Paul Singer of the Elliott hedge fund, known for its aggressive activism and distressed debt focus (famous for its Argentine debt battles), dramatically said that “passive investing is in danger of devouring capitalism” and called it “a blob which is destructive to the growth-creating and consensus-building prospects of free market capitalism”.

In 2016, JP Morgan’s Nikolaos Panigirtzoglou stated that “the shift towards passive funds has the potential to concentrate investments to a few large products” and “this concentration potentially increases systemic risk making markets more susceptible to the flows of a few large passive products”. He further stated that “this shift exacerbates the market uptrend creating more protracted periods of low volatility and momentum” and that “when markets eventually reverse, the correction becomes deeper and volatility rises as money flows away from passive funds back towards active managers who tend to outperform in periods of weak market performance”.

The International Organization of Securities Commissions (IOSCO), proving that regulators are always late to the party (hopefully not too late), is to broaden its analysis on the ETF sector in 2018, beyond a previous review on liquidity management, to consider whether serious market distortions might occur due to the growth of ETFs, as per this FT article. Paul Andrews, a veteran US regulator and secretary general of IOSCO, called ETFs “financial engineering at its finest”, stated that “ETFs are [now] a critical piece of market infrastructure” and that “we are on autopilot in many respects with market capitalisation-weighted ETFs”.

Artemis Capital Management, in this report highlighted in my previous post, believe that “passive investing is now just a momentum play on liquidity” and that “large capital flows into stocks occur for no reason other than the fact that they are highly liquid members of an index”. Artemis believes that “active managers serve as a volatility buffer” and that if such a buffer is withdrawn then “there is no incremental seller to control overvaluation on the way up and no incremental buyer to stop a crash on the way down”.

Algorithmic trading (automated trading, black-box trading, or simply algo-trading) is the process of using computers programmed to follow a defined set of instructions for placing a trade in order to generate profits at a speed and frequency that is impossible for a human trader.
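As a toy illustration of what a “defined set of instructions” can look like, the sketch below implements a simple moving-average crossover rule, one of the most basic rules-based strategies (purely hypothetical, and nothing like the sophisticated systems discussed here, but it shows the principle):

```python
# Toy rules-based trading algorithm: buy when the short-term average price
# crosses above the long-term average, sell on the reverse cross.
# Purely illustrative; real algo-trading systems are vastly more complex.

def moving_average(prices, window):
    return sum(prices[-window:]) / window

def crossover_signal(prices, short=5, long=20):
    """Return 'BUY', 'SELL' or 'HOLD' from a moving-average crossover."""
    if len(prices) < long + 1:
        return "HOLD"                       # not enough price history yet
    short_now = moving_average(prices, short)
    long_now = moving_average(prices, long)
    short_prev = moving_average(prices[:-1], short)
    long_prev = moving_average(prices[:-1], long)
    if short_prev <= long_prev and short_now > long_now:
        return "BUY"                        # short average crossed above long
    if short_prev >= long_prev and short_now < long_now:
        return "SELL"                       # short average crossed below long
    return "HOLD"

# Example: a V-shaped price series triggers a BUY during the recovery
prices = [100 - i for i in range(15)] + [85 + 2 * i for i in range(15)]
for t in range(21, len(prices) + 1):
    signal = crossover_signal(prices[:t])
    if signal != "HOLD":
        print(t, signal)                    # prints: 22 BUY
```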

Machine learning uses statistical techniques to infer relationships between data. The artificial intelligence “agent” does not have an algorithm to tell it which relationships it should find but infers, or learns if you like, from the data using statistical analysis to revise its hypotheses. In supervised learning, the machine is presented with examples of input data together with the desired output. The AI agent works out a relationship between the two and uses this relationship to make predictions given further input data. Supervised learning techniques, such as Bayesian regression, are useful where firms have a flow of input data and would like to make predictions.
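A minimal sketch of that supervised workflow, using scikit-learn’s Bayesian ridge regression on synthetic data (the data and the underlying relationship are entirely made up for illustration):

```python
# Supervised learning sketch: Bayesian regression on synthetic data.
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))                 # input data: 3 made-up features
true_weights = np.array([0.5, -1.2, 0.3])     # the hidden relationship
y = X @ true_weights + rng.normal(scale=0.1, size=200)   # desired output + noise

model = BayesianRidge()
model.fit(X, y)               # the "agent" infers the input-output relationship

# Make a prediction on fresh input data, with an uncertainty estimate
prediction, std = model.predict(rng.normal(size=(1, 3)), return_std=True)
print(f"Prediction: {prediction[0]:.3f} +/- {std[0]:.3f}")
```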

Unsupervised learning, in contrast, does without learning examples. The AI agent instead tries to find relationships between input data by itself. Unsupervised learning can be used for classification problems determining which data points are similar to each other. As an example of unsupervised learning, cluster analysis is a statistical technique whereby data or objects are classified into groups (clusters) that are similar to one another but different from data or objects in other clusters.

Firms like Bloomberg use cluster analysis in their liquidity assessment tool, which aims to cluster bonds with sufficiently similar behaviour so that their historical data can be shared and used to make general predictions for all bonds in that cluster. Naz Quadri of Bloomberg, with the wonderful title of head of quant engineering and research, said that “some applications of clustering were more useful than others” and that their analysis suggests “clustering is most useful, and results are more stable, when it is used with a structural market impact model”. Market impact models are widely used to minimise the effect of a firm’s own trading on market prices and are an example of machine learning in practice.
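A sketch of the clustering idea, with invented bond features (purely illustrative; Bloomberg’s actual methodology is far more involved):

```python
# Unsupervised learning sketch: grouping bonds with similar behaviour
# via k-means clustering on a few illustrative features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# Hypothetical features per bond: [duration, yield spread, daily trade count]
bonds = rng.normal(loc=[5.0, 1.5, 20.0], scale=[2.0, 0.8, 8.0], size=(500, 3))

# Scale the features so that no single one dominates the distance measure
scaled = StandardScaler().fit_transform(bonds)

clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scaled)
print(np.bincount(clusters))   # number of bonds assigned to each cluster

# Bonds in the same cluster could then share historical data to make
# general liquidity predictions, as in the Bloomberg example above.
```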

In November 2017, the Financial Stability Board (FSB) released a report called “Artificial Intelligence and Machine Learning in Financial Services”. In the report the FSB highlighted some of the current and potential use cases of AI and machine learning, as follows:

  • Financial institutions and vendors are using AI and machine learning methods to assess credit quality, to price and market insurance contracts, and to automate client interaction.
  • Institutions are optimising scarce capital with AI and machine learning techniques, as well as back-testing models and analysing the market impact of trading large positions.
  • Hedge funds, broker-dealers, and other firms are using AI and machine learning to find signals for higher (and uncorrelated) returns and optimise trading execution.
  • Both public and private sector institutions may use these technologies for regulatory compliance, surveillance, data quality assessment, and fraud detection.

The FSB report states that “applications of AI and machine learning could result in new and unexpected forms of interconnectedness” and that “the lack of interpretability or ‘auditability’ of AI and machine learning methods has the potential to contribute to macro-level risk”. Worryingly they say that “many of the models that result from the use of AI or machine learning techniques are difficult or impossible to interpret” and that “many AI and machine learning developed models are being ‘trained’ in a period of low volatility”. As such “the models may not suggest optimal actions in a significant economic downturn or in a financial crisis, or the models may not suggest appropriate management of long-term risks” and “should there be widespread use of opaque models, it would likely result in unintended consequences”.

With the increased use of machine learning and AI, we are seeing the potential rise of self-driving investment vehicles. Using self-driving cars as a metaphor, Artemis Capital highlights that “the fatal flaw is that your driving algorithm has never seen a mountain road” and that “as machines trade against each other, self-reflexivity is amplified”. Others point out that machine learning in trading may involve machine learning algorithms learning the behaviour of other machine learning algorithms, in a regressive loop, all drawing on the same data and the same methodology. 13D Research opined that “when algorithms coexist in complex systems with subjectivity and unpredictability of human behaviour, unforeseen and destabilising downsides result”.

It is said that there is nothing magical about quant strategies. Quantitative investing is an approach for implementing investment strategies in an automated (or semi-automated) way. The key seems to be data: its quality and its uniqueness. A hypothesis is developed and tested, and tested again, against various themes to identify anomalies or inefficiencies. Jim Simons of Renaissance Technologies (known as RenTec), one of the oldest and most successful quant funds, said that the “efficient market theory is correct in that there are no gross inefficiencies” but “we look at anomalies that may be small in size and brief in time. We make our forecast. Then, shortly thereafter, we re-evaluate the situation and revise our forecast and our portfolio. We do this all day long. We’re always in and out and out and in. So we’re dependent on activity to make money”. Simons emphasised that RenTec “don’t start with models” but “we start with data” and “we don’t have any preconceived notions”. They “look for things that can be replicated thousands of times”.

The recently departed RenTec co-CEO Robert Mercer [yes, the Mercer who backs Breitbart, which adds a scary political Big Brother surveillance angle to this story] has said: “RenTec gets a trillion bytes of data a day, from newspapers, AP wire, all the trades, quotes, weather reports, energy reports, government reports, all with the goal of trying to figure out what’s going to be the price of something or other at every point in the future… The information we have today is a garbled version of what the price is going to be next week. People don’t really grasp how noisy the market is. It’s very hard to find information, but it is there, and in some cases it’s been there for a long, long time. It’s very close to science’s needle in a haystack problem.”

Kumesh Aroomoogan of Accern recently said that “quant hedge funds are buying as much data as they can”. The so-called “alternative data” market was worth about $200 million in the US in 2017 and is expected to double in four years, according to research and consulting firm Tabb Group. The explosion of data that has become, and is becoming, available in this technological revolution should keep the quants busy, for a while.

However, what’s scaring me is that these incredibly clever people will inevitably end up farming through the same data sets, coming to broadly similar conclusions, and the machines that have learned each other’s secrets will all start heading for the exits at the same time, in real time, in the mother of all quant flash crashes. That sounds too much like science fiction to ever happen though, right?