Cloudfall

More and more business is moving to the cloud and, given the concentration of providers and their interlinkages, this shift is creating security challenges. In the US, 15 cloud providers account for 70% of the market.

The National Institute of Standards and Technology (NIST) describes the cloud as a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.

A cloud solution is typically architected across multiple regions, where a region is a geographical location in which users can run their resources, and each region is typically made up of multiple zones. All major cloud providers have multiple regions, located across the globe and within the US. For example, Rackspace has the fewest regions, at 7, whereas Microsoft Azure has the most, at 36.

The industry is projected to grow at a compound annual growth rate of 36% between 2014 and 2026, as per the graph below. Software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS) are the types of cloud services sold.


Control of the underlying cloud infrastructure of networks, servers, operating systems, and storage is the responsibility of the cloud provider, with the user having control over the deployed applications and possibly configuration settings for the application-hosting environment.

Amazingly, however, the main responsibility for protecting corporate data in the cloud lies not with the cloud provider but with the cloud customer, unless specifically agreed otherwise. Jay Heiser of Gartner commented that “we are in a cloud security transition period in which focus is shifting from the provider to the customer” and businesses “are learning that huge amounts of time spent trying to figure out if any particular cloud service provider is secure or not has virtually no payback”.

The Cloud Security Alliance (CSA) has issued a report on the security threats to the cloud. These include the usual threats such as data breaches, denial of service (DoS), advanced persistent threats (APTs) and malicious insiders. For the cloud, add in threats such as insufficient access management, insecure user interfaces (UIs) and application programming interfaces (APIs), and shared technology vulnerabilities.

Cyber security is an important issue today and many businesses, particularly larger ones, are turning to insurance to mitigate the risks to their organisations, as the graph below on cyber insurance take-up rates shows.


Lloyd’s of London recently released an interesting report, called Cloud Down, that estimated the business interruption costs in the US arising from the sustained loss of access to a cloud service provider. Using a standard catastrophe modelling framework from AIR, the report estimates that a cyber incident taking a top-three US cloud provider offline for 3 to 6 days would result in ground-up loss central estimates of $7-15 billion and insured losses of $1.5-3 billion. By necessity, the assumptions used in the analysis are fairly crude and basic.

Given the number of bad actors in the cyber world, particularly those who may intend to cause maximum disruption, security failings around the cloud could, in my view, result in losses of many multiples of those projected by Lloyd’s if several cloud providers were taken down for longer periods. And that’s scary.

CenturyLink 2018 Preview

I don’t do this often but as I am travelling this week I thought I’d give some predictions on the Q4 announcement from CenturyLink (CTL) due after market close this Valentine’s Day. As my last post on the topic in August stated, I am taking a wait and see approach on CTL to assess whether enough progress has been made on the integration and balance sheet to safeguard the dividend. Although CTL is down 15% since my last post, the history of LVLT has taught me that extracting costs from a business with (at best) flat-lining revenue will be a volatile road over the coming quarters and years. Add in high debt loads in an increasing interest rate environment and any investment into CTL, with a medium term holding horizon, must be timed to perfection in this market.

The actual results for Q4 matter little, except to see revenue trends for the combined entity, particularly as, according to their last 10Q, they “expect to recognize approximately $225 million in merger-related transaction costs, including investment banker and legal fees”. They will likely kitchen-sink the quarter’s results. The key will be guidance for 2018, which the new management team (e.g. the old LVLT CEO and CFO) have cautioned will only be the 2018 annual range for bottom-line metrics like EBITDA and free cashflow. Although many analysts know the LVLT management’s form, it may take a while for the wider Wall Street to move away from top-line trends, particularly in the rural consumer area, and towards measuring CTL primarily as a next-generation enterprise communication provider.

At a recent investor conference, the CFO Sunit Patel gave some further colour on their targets. Capex would be set at 16% of revenues, the target for margin expansion is 5%-7% over the next 3 to 5 years, reflecting cost synergies coming into effect faster than previously indicated, and they will refocus the consumer business on higher speeds “more surgically [in terms of return on capital] in areas that have higher population densities, better socioeconomic demographics, better coexistence with businesses and where wireless infrastructure might be needed”. Based upon these targets, and assuming average LIBOR of 3% and 4% for 2018 and 2019 respectively plus flat-lining annual revenue for the next 3 years (although a better revenue mix emerges), I estimate that a valuation between $20 and $25 per share is justified, albeit with a lot of execution risk around achieving Sunit’s targets.

On guidance for 2018, I am hoping for EBITDA guidance around $9.25 billion and capex of $4 billion. I would be disappointed in EBITDA guidance with a lower mid-point (as would the market in my view). I also estimate cash interest expense of $2.25-$2.5 billion on net debt of $36-$36.5 billion. Dividend costs for the year should be about $2.35 billion.
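To make the arithmetic behind those numbers explicit, below is a minimal back-of-envelope sketch using the mid-points of my estimates above (the zero cash-tax line is an additional assumption on my part, not company guidance):

```python
# Rough 2018 dividend coverage for CTL, using the mid-points of the
# estimates above (all figures in $ billions).
ebitda = 9.25          # hoped-for EBITDA guidance
capex = 4.00           # capital expenditure
cash_interest = 2.375  # mid-point of the $2.25-$2.5 billion range
cash_taxes = 0.0       # assumed negligible given tax losses; my assumption

free_cash_flow = ebitda - capex - cash_interest - cash_taxes
dividend = 2.35

print(f"Free cash flow estimate: ${free_cash_flow:.2f} billion")
print(f"Dividend cost:           ${dividend:.2f} billion")
print(f"Coverage:                {free_cash_flow / dividend:.2f}x")
# Roughly $2.9 billion of free cash flow against $2.35 billion of dividends,
# i.e. about 1.2x coverage before working capital and integration costs.
```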

Wednesday’s result will be interesting, particularly the market’s assessment of the plausibility of management’s targets for this high dividend yielding stock. There is plenty of time for this story to unfold over the coming quarters. For CTL and their people, I hope it’s not a Valentine’s Day massacre.

Follow-up after results:

I am still going through the actual results released on the 14th of February but my initial reaction is that the 2018 EBITDA guidance is a lot lower than I expected. Based upon a pro forma 2017 EBITDA margin of 36.1%, I had factored in a more rapid margin improvement for 2018 to get to my EBITDA figure of $9.25 billion than the approximately 80 basis points of improvement implied by the guidance mid-point of $8.85 billion. In response to an analyst’s question on the 5%-7% margin improvement expected over the next 3 to 5 years, Sunit responded as follows:

On the EBITDA margin, to your question, I think, in general, we continue to expect to see the EBITDA margin expand nicely over the next 3 to 5 years. I think we said even at the time of the announcement that with synergies and everything pro forma, we should be north of 40% plus EBITDA margins over the next few years and we continue to feel quite confident and comfortable with that. So I think you will see the margin expansion in terms of the basis points that you described.
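For reference, here is a quick sketch of the margin arithmetic behind my comparison above (the implied revenue figure is simply backed out of the guidance mid-point and is my own approximation, not a company figure):

```python
# Back-of-envelope check on the 2018 EBITDA margin implied by guidance
# versus my own higher estimate (figures in $ billions).
proforma_margin_2017 = 0.361                 # pro forma 2017 EBITDA margin
guidance_midpoint = 8.85                     # 2018 EBITDA guidance mid-point
implied_margin_2018 = proforma_margin_2017 + 0.008   # ~80bps improvement implied

implied_revenue = guidance_midpoint / implied_margin_2018
my_estimate = 9.25
my_implied_margin = my_estimate / implied_revenue

print(f"Implied 2018 revenue:          ${implied_revenue:.1f} billion")
print(f"Margin implied by my $9.25bn:  {my_implied_margin:.1%}")
print(f"Expansion versus 2017:         {(my_implied_margin - proforma_margin_2017) * 1e4:.0f}bps")
# My figure would have required roughly 250bps of margin expansion in a single
# year, against the ~80bps implied by the guidance mid-point.
```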

I will go through the figures (and maybe the 10K) to revise my estimates and post my conclusion in the near future.

The Bionic Invisible Hand

Technology is omnipresent. The impact of technology on markets and market structures has been a topic of much debate recently. Some point to its influence to explain the lack of volatility in equity markets (ignoring this week’s wobble). Marko Kolanovic, a JPMorgan analyst, has been reported to have estimated that a mere 10% of US equity market trading is now conducted by discretionary human traders.

The first wave of high frequency trading (HFT) brought about distortive practices by certain players, such as front running and spoofing, as detailed in Michael Lewis’s bestselling exposé Flash Boys. Now HFT firms are struggling to wring profits from the incremental millisecond, as reported in this FT article, with revenues for HFT firms trading US stocks falling below $1 billion in 2017 from over $7 billion in 2009, according to the consultancy Tabb Group. According to Doug Duquette of Vertex Analytics, “it has got to the point where the speed is so ubiquitous that there really isn’t much left to get”.

The focus now is on the impact of various rules-based automatic investment systems, ranging from exchange traded funds (ETFs) to computerised high-speed trading programs to new machine learning and artificial intelligence (AI) innovations. As Tom Watson said about HFT in 2011, these new technologies have the potential to give “Adam Smith’s invisible hand a bionic upgrade by making it better, stronger and faster like Steve Austin in the Six Million Dollar Man”.

As reported in another FT article, some experts estimate that computers are now generating around 50% to 70% of trading in equity markets, 60% of futures and more than 50% of treasuries. According to Morningstar, by year-end 2017 the total assets of actively managed funds stood at $11.4 trillion compared with $6.7 trillion for passive funds in the US.

Although the term “quant fund” covers a multitude of mutual and hedge fund strategies, funds falling under certain classifications are estimated to manage around $1 trillion in assets, out of total assets under management (AUM) invested in mutual funds globally of over $40 trillion. Machine learning or AI is believed to drive only a small subset of quant funds’ trades, although such systems are thought to be used as investment tools for developing strategies by an increasing number of investment professionals.

Before I delve into these issues further, I want to take a brief detour into the wonderful world of quantitative finance expert Paul Wilmott and his recent book, with David Orrell, called “The Money Formula: Dodgy Finance, Pseudo Science, and How Mathematicians Took Over the Markets”. I am going to try to summarize the pertinent issues highlighted by the authors in the following sequence of my favourite quotes from the book:

“If anybody can flog an already sick horse to death, it is an economist.”

“Whenever a model becomes too popular, it influences the market and therefore tends to undermine the assumptions on which it was built.”

“Real price data tend to follow something closer to a power-law distribution and are characterized by extreme events and bursts of intense volatility…which are typical of complex systems that are operating at a state known as self-organized criticality…sometimes called the edge of chaos.”

“In quantitative finance, the weakest links are the models.”

“The only half decent, yet still toy, models in finance are the lognormal random walk models for those instruments whose level we don’t care about.”

“The more apparently realistic you make a model, the less useful it often becomes, and the complexity of the equations turns the model into a black box. The key then is to keep with simple models, but make sure that the model is capturing the key dynamics of the system, and only use it within its zone of validity.”

“The economy is not a machine, it is a living, organic system, and the numbers it produces have a complicated relationship with the underlying reality.”

“Calibration is a simple way of hiding model risk, you choose the parameters so that your model superficially appears to value everything correctly when really, it’s doing no such thing.”

“When their [quants’] schemes – their quantitative seizing – cratered, the central banks stepped in to fill the hole with quantitative easing.”

“Bandwagons beget bubbles, and bubbles beget crashes.”

“Today, it is the risk that has been created by high speed algorithms, all based on similar models, all racing to be the first to do the same thing.”

“We have outsourced ethical judgments to the invisible hand, or increasingly to algorithms, with the result that our own ability to make ethical decisions in economic matters has atrophied.”

According to Morningstar’s annual fund flow report, flows into US mutual funds and ETFs reached a record $684.6 billion in 2017 due to massive inflows into passive funds. Among fund categories, the biggest winners were passive U.S. equity, international equity and taxable bond funds with each having inflows of more than $200 billion. “Indexing is no longer limited to U.S. equity and expanding into other asset classes” according to the Morningstar report.


Paul Singer of the hedge fund Elliott Management, known for its aggressive activism and distressed debt focus (famous for its Argentine debt battles), dramatically said that “passive investing is in danger of devouring capitalism” and called it “a blob which is destructive to the growth-creating and consensus-building prospects of free market capitalism”.

In 2016, JP Morgan’s Nikolaos Panagirtzoglou stated that “the shift towards passive funds has the potential to concentrate investments to a few large products” and “this concentration potentially increases systemic risk making markets more susceptible to the flows of a few large passive products”. He further stated that “this shift exacerbates the market uptrend creating more protracted periods of low volatility and momentum” and that “when markets eventually reverse, the correction becomes deeper and volatility rises as money flows away from passive funds back towards active managers who tend to outperform in periods of weak market performance”.

The International Organization of Securities Commissions (IOSCO), proving that regulators are always late to the party (hopefully not too late), is to broaden its analysis on the ETF sector in 2018, beyond a previous review on liquidity management, to consider whether serious market distortions might occur due to the growth of ETFs, as per this FT article. Paul Andrews, a veteran US regulator and secretary general of IOSCO, called ETFs “financial engineering at its finest”, stated that “ETFs are [now] a critical piece of market infrastructure” and that “we are on autopilot in many respects with market capitalisation-weighted ETFs”.

Artemis Capital Management, in this report highlighted in my previous post, believes that “passive investing is now just a momentum play on liquidity” and that “large capital flows into stocks occur for no reason other than the fact that they are highly liquid members of an index”. Artemis believes that “active managers serve as a volatility buffer” and that if such a buffer is withdrawn then “there is no incremental seller to control overvaluation on the way up and no incremental buyer to stop a crash on the way down”.

Algorithmic trading (automated trading, black-box trading, or simply algo-trading) is the process of using computers programmed to follow a defined set of instructions for placing a trade in order to generate profits at a speed and frequency that is impossible for a human trader.
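Purely as an illustration of what such a defined set of instructions can look like (a toy example, not any firm’s actual strategy), a rules-based signal can be as simple as a moving-average crossover:

```python
import pandas as pd

def moving_average_crossover(prices: pd.Series, fast: int = 20, slow: int = 50) -> pd.Series:
    """Toy rules-based signal: long when the fast moving average is above the
    slow one, flat otherwise. Illustrative only, not a real trading strategy."""
    fast_ma = prices.rolling(fast).mean()
    slow_ma = prices.rolling(slow).mean()
    signal = (fast_ma > slow_ma).astype(int)  # 1 = long, 0 = flat
    return signal.shift(1)                    # act on the next bar to avoid look-ahead bias

# Hypothetical usage with a daily price series loaded elsewhere:
# prices = pd.read_csv("prices.csv", index_col=0, parse_dates=True)["close"]
# positions = moving_average_crossover(prices)
```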

Machine learning uses statistical techniques to infer relationships between data. The artificial intelligence “agent” does not have an algorithm to tell it which relationships it should find but infers, or learns if you like, from the data using statistical analysis to revise its hypotheses. In supervised learning, the machine is presented with examples of input data together with the desired output. The AI agent works out a relationship between the two and uses this relationship to make predictions given further input data. Supervised learning techniques, such as Bayesian regression, are useful where firms have a flow of input data and would like to make predictions.
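A minimal sketch of that supervised set-up, using scikit-learn’s Bayesian ridge regression on made-up data (the input features here are hypothetical, chosen only to illustrate the workflow):

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(0)

# Made-up training data: two hypothetical input features (say, an order-flow
# imbalance and a momentum score) paired with a next-period return as the
# desired output.
X_train = rng.normal(size=(500, 2))
y_train = 0.3 * X_train[:, 0] - 0.1 * X_train[:, 1] + rng.normal(scale=0.05, size=500)

model = BayesianRidge()
model.fit(X_train, y_train)

# Given further input data, the fitted relationship produces predictions
# together with an uncertainty estimate.
X_new = rng.normal(size=(5, 2))
y_pred, y_std = model.predict(X_new, return_std=True)
print(y_pred, y_std)
```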

Unsupervised learning, in contrast, dispenses with labelled examples. The AI agent instead tries to find relationships between input data by itself. Unsupervised learning can be used for classification problems, determining which data points are similar to each other. As an example of unsupervised learning, cluster analysis is a statistical technique whereby data or objects are classified into groups (clusters) that are similar to one another but different from data or objects in other clusters.
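A toy sketch of cluster analysis along those lines, using k-means on made-up instrument features (nothing here reflects any vendor’s actual methodology):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Made-up feature matrix: each row describes an instrument by, say, average
# daily volume, bid-ask spread and price volatility (hypothetical features).
features = rng.normal(size=(200, 3))

# Standardise so that no single feature dominates the distance calculation.
scaled = StandardScaler().fit_transform(features)

# Group the instruments into five clusters of similar behaviour.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(scaled)
print(kmeans.labels_[:20])  # cluster assignments for the first 20 instruments
```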

Firms like Bloomberg use cluster analysis in their liquidity assessment tool, which aims to cluster bonds with sufficiently similar behaviour so that their historical data can be shared and used to make general predictions for all bonds in that cluster. Naz Quadri of Bloomberg, with the wonderful title of head of quant engineering and research, said that “some applications of clustering were more useful than others” and that their analysis suggests “clustering is most useful, and results are more stable, when it is used with a structural market impact model”. Market impact models are widely used to minimise the effect of a firm’s own trading on market prices and are an example of machine learning in practice.

In November 2017, the Financial Stability Board (FSB) released a report called “Artificial Intelligence and Machine Learning in Financial Services”. In the report the FSB highlighted some of the current and potential use cases of AI and machine learning, as follows:

  • Financial institutions and vendors are using AI and machine learning methods to assess credit quality, to price and market insurance contracts, and to automate client interaction.
  • Institutions are optimising scarce capital with AI and machine learning techniques, as well as back-testing models and analysing the market impact of trading large positions.
  • Hedge funds, broker-dealers, and other firms are using AI and machine learning to find signals for higher (and uncorrelated) returns and optimise trading execution.
  • Both public and private sector institutions may use these technologies for regulatory compliance, surveillance, data quality assessment, and fraud detection.

The FSB report states that “applications of AI and machine learning could result in new and unexpected forms of interconnectedness” and that “the lack of interpretability or ‘auditability’ of AI and machine learning methods has the potential to contribute to macro-level risk”. Worryingly they say that “many of the models that result from the use of AI or machine learning techniques are difficult or impossible to interpret” and that “many AI and machine learning developed models are being ‘trained’ in a period of low volatility”. As such “the models may not suggest optimal actions in a significant economic downturn or in a financial crisis, or the models may not suggest appropriate management of long-term risks” and “should there be widespread use of opaque models, it would likely result in unintended consequences”.

With increased use of machine learning and AI, we are seeing the potential rise of self-driving investment vehicles. Using self-driving cars as a metaphor, Artemis Capital highlights that “the fatal flaw is that your driving algorithm has never seen a mountain road” and that “as machines trade against each other, self-reflexivity is amplified”. Others point out that machine learning in trading may involve machine learning algorithms learning the behaviour of other machine learning algorithms, in a regressive loop, all drawing on the same data and the same methodology. 13D Research opined that “when algorithms coexist in complex systems with subjectivity and unpredictability of human behaviour, unforeseen and destabilising downsides result”.

It is said that there is nothing magical about quant strategies. Quantitative investing is an approach for implementing investment strategies in an automated (or semi-automated) way. The key seems to be data, its quality and its uniqueness. A hypothesis is developed and tested, and tested again, against various themes to identify anomalies or inefficiencies. Jim Simons of Renaissance Technologies (called RenTec), one of the oldest and most successful quant funds, said that the “efficient market theory is correct in that there are no gross inefficiencies” but “we look at anomalies that may be small in size and brief in time. We make our forecast. Then, shortly thereafter, we re-evaluate the situation and revise our forecast and our portfolio. We do this all day long. We’re always in and out and out and in. So we’re dependent on activity to make money”. Simons emphasised that RenTec “don’t start with models” but “we start with data” and “we don’t have any preconceived notions”. They “look for things that can be replicated thousands of times”.

The recently departed co-CEO Robert Mercer of RenTec [yes, the Mercer who backs Breitbart, which adds a scary political Big Brother surveillance angle to this story] has said: “RenTec gets a trillion bytes of data a day, from newspapers, AP wire, all the trades, quotes, weather reports, energy reports, government reports, all with the goal of trying to figure out what’s going to be the price of something or other at every point in the future… The information we have today is a garbled version of what the price is going to be next week. People don’t really grasp how noisy the market is. It’s very hard to find information, but it is there, and in some cases it’s been there for a long, long time. It’s very close to science’s needle in a haystack problem.”

Kumesh Aroomoogan of Accern recently said that “quant hedge funds are buying as much data as they can”. The so-called “alternative data” market was worth about $200 million in the US in 2017 and is expected to double in four years, according to research and consulting firm Tabb Group. The explosion of data that has and is becoming available in this technological revolution should keep the quants busy, for a while.

However, what’s scaring me is that these incredibly clever people will inevitably end up farming through the same data sets, coming to broadly similar conclusions, and the machines that have learned each other’s secrets will all start heading for the exits at the same time, in real time, in the mother of all quant flash crashes. That sounds too much like science fiction to ever happen though, right?

A frazzled Goldilocks?

Whatever measure you look at, equities in the US are overvalued, arguably in bubble territory. Investors poured record amounts into equity funds in recent weeks as the market melt-up takes hold. One of the intriguing features of the bull market over the past 18 months has been the extraordinarily low volatility. Hamish Preston of S&P Dow Jones Indices estimated that the average observed 1-month volatility in the S&P 500 in 2017 was “lower than in any other year since 1970”. To illustrate the point, the graph below shows the monthly change in the S&P 500 over recent years.
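For those who want to reproduce that kind of measure, a realized-volatility calculation can be sketched roughly as follows (the file name is a placeholder and this is not S&P Dow Jones Indices’ actual methodology):

```python
import numpy as np
import pandas as pd

# Placeholder: daily S&P 500 closing prices loaded from a local file.
prices = pd.read_csv("sp500_daily_close.csv", index_col=0, parse_dates=True)["close"]

# Daily log returns and rolling 1-month (~21 trading days) realized volatility,
# annualised with the usual square-root-of-252 convention.
log_returns = np.log(prices).diff()
vol_1m = log_returns.rolling(21).std() * np.sqrt(252)

# Average observed 1-month volatility by calendar year.
print(vol_1m.groupby(vol_1m.index.year).mean())
```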


The absence of any monthly decline since November 2016, and of any pullback greater than 2% since January 2016, is striking. “Don’t confuse lack of volatility with stability, ever” is a quote from Nassim Nicholas Taleb that seems particularly apt today.

Andrew Lapthorne of SocGen highlighted that low risk markets tend to have a big knock on effect with a “positive feedback mechanism embedded in many risk models”. In other words, the less risk is observed in the market and used as the basis for model inputs, the more risk the quant models allow investors to take! [The impact of quant models and shadow risks from passive investing and machine learning are areas I hope to explore further in a future post.]

One risk that has the potential to spoil the party in 2018 is the planned phased normalisation of monetary policy around the world after the great experimentations of recent years. The market is currently assuming that Central Banks will guarantee that Goldilocks will remain unfrazzled as they deftly steer the ship back to normality. A global “Goldilocks put” if I could plagiarize “the Greenspan put”! Or a steady move away from the existing policy that no greater an economic brain than Donald Trump summarized as being: “they’re keeping the rates down so that everything else doesn’t go down”.

The problem for Central Banks is that if inflation stays muted in the short term and monetary policy remains loose, then the asset bubbles will reach unsustainable levels and require pricking. Alternatively, any attempt at monetary policy normalization may dramatically show how Central Banks have become the primary providers of liquidity in capital markets and that even modest tightening could result in dangerous imbalances within the now structurally dependent system.

Many analysts (and the number is surprisingly large) have been warning for some time about the impact of QE flows tightening in 2018. These warnings have been totally ignored by the market, as the lack of volatility illustrates. For example, in June 2017, Citi’s Matt King projected future Central Bank liquidity flows and warned that a “significant unbalancing is coming”. In November 2017, Deutsche Bank’s Alan Ruskin commented that “2018 will see the world’s most important Central Bank balance sheets shift from a 12 month expansion of more than $2 trillion, to a broadly flat position by the end of 2018, assuming the Fed and ECB act according to expectations”. The projections Deutsche Bank produced are below.


Andrew Norelli of JP Morgan Asset Management, in a piece called “Stock, Flow or Impulse?”, stated that “It’s still central bank balance sheets, and specifically the flow of global quantitative easing (QE) that is maintaining the buoyancy in financial asset prices”. JP Morgan’s projections for the central banks of the top 4 developed countries are below.


Lance Roberts of RealInvestmentAdvice.com produced an interesting graph specifically relating to the Fed’s balance sheet, as below. Caution should be taken with any upward trending metric when compared to the S&P500 in recent years!


Of course, we have been at pre-taper junctions many times before and every previous jitter has been met with soothing words from Central Banks and more liquidity creation. This time though it feels different. It has to be different. Otherwise Central Bankers risk being viewed as emperors without clothes.

The views of commentators differ widely on this topic. Most of the business media talking heads are wildly positive (as they always are) on the Goldilocks status quo. John Mauldin of MauldinEconomics.com believes the number one risk factor in the US is Fed overreach and too much tightening. Bank of America Merrill Lynch chief investment strategist Michael Hartnett fears a 1987/1994/1998-style flash crash within the next three months, caused by a withdrawal of central bank support as interest rates rise.

Christopher Cole of Artemis Capital Management, in a wonderful report called “Volatility and the Alchemy of Risk”, pulls no punches about the impact of global central banks having pumped $15 trillion of cheap money stimulus into capital markets since 2009. Cole comments that “amid this mania for investment, the stock market has begun self-cannibalizing” and draws upon the image of the ouroboros, an ancient Greek symbol of a snake eating its own tail. Cole estimates that 40% of EPS growth and 30% of US equity gains since 2009 have come as a direct result of the financial engineering of stock buybacks. Higher interest rates, according to Cole, will be needed to combat the higher inflation that will result from this liquidity bonanza and will cut off the supply for the annual $800 billion of share buybacks. Cole also points to the impact on the high yield corporate debt market and the overall impact on corporate defaults.

Another interesting report, from a specific investment strategy perspective, comes from Fasanara Capital’s Francesco Filia and is cheerfully entitled “Fragile Markets On The Edge of Chaos”. As economies transition from peak QE to quantitative tightening, Filia “expect[s] markets to face their first real crash test in 10 years” and argues that “only then will we know what is real and what is not in today’s markets, only then will we be able to assess how sustainable is the global synchronized GDP growth spurred by global synchronized monetary printing”. I like the graphic below from the report.


I found the reaction to the Trump administration’s misstep on dollar strength interesting this week. Aditya Bhave and Ethan Harris, economists at Bank of America, said of the episode that “the Fed will see the weak dollar as a sign of easy financial conditions and a green light to keep tightening monetary policy”. ECB President Mario Draghi was not happy about the weak dollar statement as it would complicate Europe’s quantitative tightening plans. It was also interesting to hear Benoit Coeure, a hawkish member of the ECB executive board, saying this week that “it’s happening at different paces across the region, but we are moving to the point where we see wages going up”.

I think many of the Central Banks in developed countries are running out of wriggle room and the markets have yet to fully digest that reality. I fear that Goldilocks is about to get frazzled.

A Riskier World?

This year’s Davos gathering is likely to be dominated by Donald Trump’s presence. I look forward to seeing him barge past other political and industry leaders to get his prime photo opportunity. As US equity markets continue to make all-time highs in unrelenting fashion, it is scary to see the melt-up market being cheered on by the vivacious talking heads.

Ahead of Davos, the latest World Economic Forum report on global risks was released today. 59% of the contributors to the annual global risks survey point to an increase in risks in 2018, with environmental and cybersecurity risks continuing their trend of growing prominence, as can be seen below.


Undoubtedly, environmental risks are the biggest generational challenge we face and it is hard to argue with the statement that “we have been pushing our planet to the brink and the damage is becoming increasingly clear”. That said, what is also striking about these assessments (and it’s important to remember that they are not predictions) is how the economic risks (light blue squares) have, in the opinion of the contributors, receded as top risks in recent years. The report does state that although the “headline economic indicators suggest the world is finally getting back on track after the global crisis that erupted 10 years ago” there are “continuing underlying concerns”. Amongst these concerns, the report highlights “potentially unsustainable asset prices, with the world now eight years into a bull run; elevated indebtedness, particularly in China; and continuing strains in the global financial system”.

A short article in the report entitled “Cognitive Bias and Risk Management” by Michele Wucker caught my attention. The article included the following:

Risk management starts with identifying and estimating the probability and impact of a given threat. We can then decide whether a risk falls within our tolerance limits and how to react to reduce the risk or at least our exposure to it. Time and again, however, individuals and organizations stumble during this process—for example, failing to respond to obvious but neglected high-impact “grey rhino” risks while scrambling to identify “black swan” events that, by definition, are not predictable.

and

One of the most pervasive cognitive blinders is the availability bias, which leads decision-makers to rely on examples and evidence that come immediately to mind. This draws people’s attention to emotionally salient events ahead of objectively more likely and impactful events.

I do wonder about cognitive blinders and grey rhinos for the year ahead.