
The Bionic Invisible Hand

Technology is omnipresent. Its impact on markets and market structures has been a topic of much debate recently. Some point to its influence to explain the lack of volatility in equity markets (this week’s wobble notwithstanding). Marko Kolanovic, a JPMorgan analyst, is reported to have estimated that a mere 10% of US equity market trading is now conducted by discretionary human traders.

The first wave of high frequency trading (HFT) brought distortive practices by certain players, such as front running and spoofing, as detailed in Michael Lewis’s bestselling exposé Flash Boys. Now HFT firms are struggling to wring profits from the incremental millisecond, as reported in this FT article, with revenues for HFT firms trading US stocks falling below $1 billion in 2017 from over $7 billion in 2009, according to the consultancy Tabb Group. According to Doug Duquette of Vertex Analytics, “it has got to the point where the speed is so ubiquitous that there really isn’t much left to get”.

The focus now is on the impact of various rules-based automatic investment systems, ranging from exchange traded funds (ETFs) to computerised high-speed trading programs to new machine learning and artificial intelligence (AI) innovations. As Tom Watson said about HFT in 2011, these new technologies have the potential to give “Adam Smith’s invisible hand a bionic upgrade by making it better, stronger and faster like Steve Austin in the Six Million Dollar Man”.

As reported in another FT article, some experts estimate that computers are now generating around 50% to 70% of trading in equity markets, 60% of futures and more than 50% of treasuries. According to Morningstar, in the US the total assets of actively managed funds stood at $11.4 trillion at year-end 2017, compared with $6.7 trillion for passive funds.

Although the term “quant fund” covers a multitude of mutual and hedge fund strategies, funds under certain classifications are estimated to manage around $1 trillion in assets, out of total assets under management (AUM) invested in mutual funds globally of over $40 trillion. Machine learning or AI is believed to drive only a small subset of quant funds’ trades, although an increasing number of investment professionals are thought to use such systems as tools for developing investment strategies.

Before I delve into these issues further, I want to take a brief detour into the wonderful world of quantitative finance expert Paul Wilmott and his recent book, with David Orrell, called “The Money Formula: Dodgy Finance, Pseudo Science, and How Mathematicians Took Over the Markets”. I am going to try to summarize the pertinent issues highlighted by the authors in the following sequence of my favourite quotes from the book:

“If anybody can flog an already sick horse to death, it is an economist.”

“Whenever a model becomes too popular, it influences the market and therefore tends to undermine the assumptions on which it was built.”

“Real price data tend to follow something closer to a power-law distribution and are characterized by extreme events and bursts of intense volatility…which are typical of complex systems that are operating at a state known as self-organized criticality…sometimes called the edge of chaos.”

“In quantitative finance, the weakest links are the models.”

“The only half decent, yet still toy, models in finance are the lognormal random walk models for those instruments whose level we don’t care about.”

“The more apparently realistic you make a model, the less useful it often becomes, and the complexity of the equations turns the model into a black box. The key then is to keep with simple models, but make sure that the model is capturing the key dynamics of the system, and only use it within its zone of validity.”

“The economy is not a machine, it is a living, organic system, and the numbers it produces have a complicated relationship with the underlying reality.”

“Calibration is a simple way of hiding model risk, you choose the parameters so that your model superficially appears to value everything correctly when really, it’s doing no such thing.”

“When their [the quants’] schemes – their quantitative seizing – cratered, the central banks stepped in to fill the hole with quantitative easing.”

“Bandwagons beget bubbles, and bubbles beget crashes.”

“Today, it is the risk that has been created by high speed algorithms, all based on similar models, all racing to be the first to do the same thing.”

“We have outsourced ethical judgments to the invisible hand, or increasingly to algorithms, with the result that our own ability to make ethical decisions in economic matters has atrophied.”

According to Morningstar’s annual fund flow report, flows into US mutual funds and ETFs reached a record $684.6 billion in 2017 due to massive inflows into passive funds. Among fund categories, the biggest winners were passive U.S. equity, international equity and taxable bond funds, each with inflows of more than $200 billion. “Indexing is no longer limited to U.S. equity and is expanding into other asset classes” according to the Morningstar report.


Paul Singer of the Elliott hedge fund, known for its aggressive activism and distressed debt focus (famously including its Argentine debt battles), dramatically said that “passive investing is in danger of devouring capitalism” and called it “a blob which is destructive to the growth-creating and consensus-building prospects of free market capitalism”.

In 2016, JP Morgan’s Nikolaos Panigirtzoglou stated that “the shift towards passive funds has the potential to concentrate investments to a few large products” and that “this concentration potentially increases systemic risk making markets more susceptible to the flows of a few large passive products”. He further stated that “this shift exacerbates the market uptrend creating more protracted periods of low volatility and momentum” and that “when markets eventually reverse, the correction becomes deeper and volatility rises as money flows away from passive funds back towards active managers who tend to outperform in periods of weak market performance”.

The International Organization of Securities Commissions (IOSCO), proving that regulators are always late to the party (hopefully not too late), is to broaden its analysis of the ETF sector in 2018, beyond a previous review of liquidity management, to consider whether serious market distortions might occur due to the growth of ETFs, as per this FT article. Paul Andrews, a veteran US regulator and secretary general of IOSCO, called ETFs “financial engineering at its finest” and stated that “ETFs are [now] a critical piece of market infrastructure” and that “we are on autopilot in many respects with market capitalisation-weighted ETFs”.

Artemis Capital Management, in this report highlighted in my previous post, believe that “passive investing is now just a momentum play on liquidity” and that “large capital flows into stocks occur for no reason other than the fact that they are highly liquid members of an index”. Artemis believes that “active managers serve as a volatility buffer” and that if such a buffer is withdrawn then “there is no incremental seller to control overvaluation on the way up and no incremental buyer to stop a crash on the way down”.

Algorithmic trading (automated trading, black-box trading, or simply algo-trading) is the process of using computers programmed to follow a defined set of instructions for placing a trade in order to generate profits at a speed and frequency that is impossible for a human trader.
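To make the definition concrete, below is a deliberately minimal Python sketch of one of the simplest rules-based strategies, a moving average crossover, using made-up parameters and simulated prices. Real algorithmic systems are vastly faster and more sophisticated, but the principle of mechanically executing a defined instruction set is the same.

```python
import numpy as np
import pandas as pd

def crossover_signals(prices: pd.Series, fast: int = 20, slow: int = 50) -> pd.Series:
    """Hold (signal 1) while the fast moving average is above the slow one."""
    fast_ma = prices.rolling(fast).mean()
    slow_ma = prices.rolling(slow).mean()
    return (fast_ma > slow_ma).astype(int)

# Hypothetical random-walk prices, purely for illustration
rng = np.random.default_rng(42)
prices = pd.Series(100 + rng.normal(0, 1, 500).cumsum())
signals = crossover_signals(prices)

# Act on yesterday's signal to avoid look-ahead bias
strategy_returns = signals.shift(1) * prices.pct_change()
print(f"Toy cumulative return: {(1 + strategy_returns.fillna(0)).prod() - 1:.2%}")
```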

Machine learning uses statistical techniques to infer relationships between data. The artificial intelligence “agent” does not have an algorithm to tell it which relationships it should find but infers, or learns if you like, from the data using statistical analysis to revise its hypotheses. In supervised learning, the machine is presented with examples of input data together with the desired output. The AI agent works out a relationship between the two and uses this relationship to make predictions given further input data. Supervised learning techniques, such as Bayesian regression, are useful where firms have a flow of input data and would like to make predictions.
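As a minimal illustration of supervised learning (and not any particular firm’s approach), the Python sketch below fits a Bayesian regression to fabricated input/output pairs and then makes predictions, with uncertainty estimates, on unseen inputs. The factor structure and all numbers are invented.

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

# Fabricated example: predict a return from two made-up factor exposures
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))                                # input data
y = 0.5 * X[:, 0] - 0.2 * X[:, 1] + rng.normal(0, 0.1, 500)  # desired output

model = BayesianRidge()
model.fit(X[:400], y[:400])            # infer the input/output relationship
pred, std = model.predict(X[400:], return_std=True)  # predictions + uncertainty
print(f"First out-of-sample prediction: {pred[0]:.3f} +/- {std[0]:.3f}")
```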

Unsupervised learning, in contrast, does without learning examples. The AI agent instead tries to find relationships between input data by itself. Unsupervised learning can be used for classification problems determining which data points are similar to each other. As an example of unsupervised learning, cluster analysis is a statistical technique whereby data or objects are classified into groups (clusters) that are similar to one another but different from data or objects in other clusters.
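Continuing the toy Python examples, the sketch below clusters fabricated bond-like data with k-means: no desired outputs are supplied, yet the algorithm recovers the two groups from the structure of the inputs alone.

```python
import numpy as np
from sklearn.cluster import KMeans

# Fabricated features for 100 bonds: [daily volatility, log trade count]
rng = np.random.default_rng(1)
liquid = rng.normal(loc=[0.05, 9.0], scale=[0.01, 0.5], size=(50, 2))
illiquid = rng.normal(loc=[0.20, 2.0], scale=[0.05, 0.5], size=(50, 2))
features = np.vstack([liquid, illiquid])

# No labels are supplied; k-means groups similar bonds by itself
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
print(kmeans.labels_[:5], kmeans.labels_[-5:])  # cluster members share a label
```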

Firms like Bloomberg use cluster analysis in their liquidity assessment tool, which aims to cluster bonds with sufficiently similar behaviour so that their historical data can be shared and used to make general predictions for all bonds in that cluster. Naz Quadri of Bloomberg, with the wonderful title of head of quant engineering and research, said that “some applications of clustering were more useful than others” and that their analysis suggests “clustering is most useful, and results are more stable, when it is used with a structural market impact model”. Market impact models are widely used to minimise the effect of a firm’s own trading on market prices and are an example of machine learning in practice.
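Bloomberg’s actual model is not public, but a widely cited stylised form of market impact is the square-root law, in which the expected cost scales with volatility and the square root of the order’s share of daily volume. The sketch below uses entirely hypothetical numbers.

```python
import math

def sqrt_impact(order_size: float, daily_volume: float,
                daily_vol: float, y: float = 0.8) -> float:
    """Stylised square-root impact: cost, as a fraction of price, grows with
    volatility and the square root of the order's share of daily volume."""
    return y * daily_vol * math.sqrt(order_size / daily_volume)

# Hypothetical: selling 5% of typical daily volume at 1.2% daily volatility
print(f"Estimated impact: {sqrt_impact(50_000, 1_000_000, 0.012):.4%}")  # ~0.21%
```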

In November 2017, the Financial Stability Board (FSB) released a report called “Artificial Intelligence and Machine Learning in Financial Services”. In the report the FSB highlighted some of the current and potential use cases of AI and machine learning, as follows:

  • Financial institutions and vendors are using AI and machine learning methods to assess credit quality, to price and market insurance contracts, and to automate client interaction.
  • Institutions are optimising scarce capital with AI and machine learning techniques, as well as back-testing models and analysing the market impact of trading large positions.
  • Hedge funds, broker-dealers, and other firms are using AI and machine learning to find signals for higher (and uncorrelated) returns and optimise trading execution.
  • Both public and private sector institutions may use these technologies for regulatory compliance, surveillance, data quality assessment, and fraud detection.

The FSB report states that “applications of AI and machine learning could result in new and unexpected forms of interconnectedness” and that “the lack of interpretability or ‘auditability’ of AI and machine learning methods has the potential to contribute to macro-level risk”. Worryingly they say that “many of the models that result from the use of AI or machine learning techniques are difficult or impossible to interpret” and that “many AI and machine learning developed models are being ‘trained’ in a period of low volatility”. As such “the models may not suggest optimal actions in a significant economic downturn or in a financial crisis, or the models may not suggest appropriate management of long-term risks” and “should there be widespread use of opaque models, it would likely result in unintended consequences”.

With the increased use of machine learning and AI, we are seeing the potential rise of self-driving investment vehicles. Using self-driving cars as a metaphor, Artemis Capital highlights that “the fatal flaw is that your driving algorithm has never seen a mountain road” and that “as machines trade against each other, self-reflexivity is amplified”. Others point out that machine learning in trading may involve machine learning algorithms learning the behaviour of other machine learning algorithms, in a regressive loop, all drawing on the same data and the same methodology. 13D Research opined that “when algorithms coexist in complex systems with subjectivity and unpredictability of human behaviour, unforeseen and destabilising downsides result”.

It is said that there is nothing magical about quant strategies. Quantitative investing is an approach for implementing investment strategies in an automated (or semi-automated) way. The key seems to be data: its quality and its uniqueness. A hypothesis is developed and then tested repeatedly against various themes to identify anomalies or inefficiencies. Jim Simons of Renaissance Technologies (known as RenTec), one of the oldest and most successful quant funds, said that the “efficient market theory is correct in that there are no gross inefficiencies” but “we look at anomalies that may be small in size and brief in time. We make our forecast. Then, shortly thereafter, we re-evaluate the situation and revise our forecast and our portfolio. We do this all day long. We’re always in and out and out and in. So we’re dependent on activity to make money”. Simons emphasised that RenTec “don’t start with models” but “we start with data” and “we don’t have any preconceived notions”. They “look for things that can be replicated thousands of times”.
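This is emphatically not RenTec’s methodology, but the Python sketch below illustrates the statistical point behind “replicated thousands of times”: a simulated edge of 5 basis points per trade, buried in 1% noise, only becomes distinguishable from chance across thousands of repetitions.

```python
import numpy as np
from scipy import stats

# Simulated anomaly: a 5 basis point average edge buried in 1% noise
rng = np.random.default_rng(7)
signal_returns = rng.normal(loc=0.0005, scale=0.01, size=5000)  # 5,000 trades

# Is the mean return distinguishable from zero?
t_stat, p_value = stats.ttest_1samp(signal_returns, popmean=0.0)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# With far fewer trades, the same tiny edge would vanish into the noise.
```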

The recently departed co-CEO of RenTec, Robert Mercer [yes, the Mercer who backs Breitbart, which adds a scary political Big Brother surveillance angle to this story], has said: “RenTec gets a trillion bytes of data a day, from newspapers, AP wire, all the trades, quotes, weather reports, energy reports, government reports, all with the goal of trying to figure out what’s going to be the price of something or other at every point in the future… The information we have today is a garbled version of what the price is going to be next week. People don’t really grasp how noisy the market is. It’s very hard to find information, but it is there, and in some cases it’s been there for a long long time. It’s very close to science’s needle in a haystack problem.”

Kumesh Aroomoogan of Accern recently said that “quant hedge funds are buying as much data as they can”. The so-called “alternative data” market was worth about $200 million in the US in 2017 and is expected to double in four years, according to research and consulting firm Tabb Group. The explosion of data that has become, and is becoming, available in this technological revolution should keep the quants busy, for a while.

However, what scares me is that these incredibly clever people will inevitably end up farming through the same data sets, coming to broadly similar conclusions, and the machines that have learned each other’s secrets will all start heading for the exits at the same time, in real time, in the mother of all quant flash crashes. That sounds too much like science fiction to ever happen though, right?

Then again, always look on the bright side……

To recap on the bear case for the US equity market, the factors highlighted are a high valuation, as measured by the cyclically adjusted PE ratio (CAPE), and a level of corporate earnings that looks unsustainable in a historical context. I have tried to capture these arguments in the graph below.

[Figure: 50 year S&P500 PE, CAPE, real interest rates, and corporate profits as a % of GDP]

Currently, the S&P500 PE and the Shiller PE/CAPE are approx 10% and 30% respectively above their averages over the past 50 years.
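For reference, the CAPE calculation itself is simple: the current price divided by the trailing ten-year average of inflation-adjusted earnings. The Python sketch below uses entirely hypothetical real EPS figures.

```python
import pandas as pd

def shiller_cape(price: float, real_eps: pd.Series) -> float:
    """CAPE = current price / trailing 10-year mean of inflation-adjusted EPS."""
    return price / real_eps.tail(10).mean()

# Hypothetical annual real EPS for the index over the last decade
real_eps = pd.Series([62, 58, 66, 71, 55, 74, 80, 85, 90, 95])
print(f"CAPE at an index level of 1,780: {shiller_cape(1780.0, real_eps):.1f}x")
```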

On earnings, Andrew Lapthorne of SocGen, in an August report entitled “To ignore CAPE is to deny mean reversion”, concluded that “mean-reversion in earnings, though sometimes delayed, is as undeniable as the economic cycle itself. That peak profits typically accompany peak valuations only reinforces the point. When earnings revert back to mean (and below), the valuation will also collapse.” The graphic below from that report highlights the point.

[Figure: SocGen, mean-reverting profits]

The ever bullish Jeremy Siegel, in a recent conference presentation, again outlined the arguments he raised in the August FT article (see the Shiller versus Siegel on CAPE post). The fifth edition of his popular book “Stocks for the Long Run” is out in December. Essentially, he argues that CAPE is too pessimistic: accounting changes since 1990 distort historical earnings, and the profile of S&P500 earnings has changed, with bigger contributions from foreign earnings and less leveraged balance sheets explaining the higher corporate margins.

Siegel contends that after-tax profits published in the National Income and Product Accounts (NIPA) are not distorted by the large write-downs from the likes of AOL and AIG. The divergence between NIPA and S&P reported earnings through historical downturns suggests that historical S&P reported earnings are unreliable, as the graph below shows.

[Figure: NIPA versus S&P reported earnings]

However, even using NIPA data, a graphic from JP Morgan in late October shows that the S&P500 is currently approx 20% above its 50 year average.

[Figure: S&P500 CAPE with NIPA earnings]

Siegel even proposed that the current comparison should be against the long term average PE (1954 to 2013) of 19, including only years where interest rates were below 8% (the 8.2% 50 year average interest rate used in the first graph of this post is, incidentally, only slightly higher).

The ever insightful Cliff Asness, founder of AQR Capital Management, counters such analysis with the recent comment below.

Does it seem to anyone else but me that the critics have a reason to exclude everything that might make one say stocks are expensive, and instead pick time periods for comparisons and methods of measurement that will always (adapting on the fly) say stocks are fair or cheap?

However, nothing is ever black and white in the real world. The rise in corporate net margins has been real, as another recent graphic, this time from Goldman Sachs, shows.

[Figure: Goldman Sachs, S&P500 net margin]

Earnings from foreign subsidiaries have increased, and S&P500 earnings as a percentage of global GDP paint a more stable picture. Also, leverage is low compared to historical levels (104% debt to equity for the S&P500 compared to a 20 year average of 170%) and cash as a percentage of current assets is high relative to history (approx 28%). Although there are signs that corporate leverage rates are on the rise again, future interest rate rises should not have as big an impact on corporate margins as they have had historically.

JP Morgan, in another October bulletin, showed the breakdown of EPS growth in the S&P500 since 2010, as reproduced below, which clearly indicates a revenue and margin slowdown.

[Figure: JP Morgan, S&P500 EPS annual growth breakdown, October 2013]

David Bianco of Deutsche Bank has recently come up with a fascinating graphic that I have been looking at agog over the past few days (reproduced below). It shows the breakdown of S&P500 returns between earnings growth, dividends and PE multiple expansion.

[Figure: Deutsche Bank, S&P500 growth breakdown]

Bianco, who has a 2014 end target of 1850 and a 2015 end target of 2000 for the S&P500, concluded that 75% of the S&P500 rise in 2013 came from PE expansion and that “this is the largest [valuation multiple] contribution to market return since 1998. Before assuming further [multiple] expansion we think it is important that investors be confident in healthy EPS growth next year. Hence, we encourage frequent re-examination of the capex and loan outlook upon new data points.”
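Bianco’s decomposition follows from the identity price = PE × EPS, so a price return splits, multiplicatively, into earnings growth and multiple expansion, with dividends added on top. A Python sketch with hypothetical, roughly 2013-flavoured inputs:

```python
def decompose_return(eps_growth: float, pe_change: float, div_yield: float) -> dict:
    """Split an index's one-year total return into earnings growth and
    PE multiple expansion (the two compound), plus dividends."""
    price_return = (1 + eps_growth) * (1 + pe_change) - 1
    return {
        "price return": price_return,
        "share from PE expansion": pe_change / price_return,
        "total return": price_return + div_yield,
    }

# Hypothetical inputs: 6% EPS growth, 18% PE re-rating, 2% dividend yield
print(decompose_return(0.06, 0.18, 0.02))
# Here PE expansion accounts for roughly 70-75% of the price return
```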

David Kostin of Goldman Sachs, who has a 2,100 S&P500 end-2015 target, stated that “multiple expansion was the key U.S. equity market story of 2013. In contrast the 2014 equity return will depend on earnings and money flow rather than further valuation re-rating.”

Even well known pessimists like David Rosenberg and Nouriel Roubini are positive, albeit cautious. Dr Doom has a 2014 target for the S&P500 of 1900 (range 1650 to 1950), although he gives the US equity market an overall neutral rating. Rosenberg, who describes the current rally as “the mother of all liquidity rallies”, cites the US economy’s robustness over the past year as a sign that 2014 should see a further strengthening of the US economy.

So clearly future growth in the S&P500 will depend upon earnings, and that will depend upon the economy and interest rates. Although I am still trying to get my head around a fascinating article from 2005 that shows a negative correlation between equity returns and GDP growth, that brings me back to the macro-economic situation.

I know this post was meant to represent the positive side of the current arguments but, as my current bear instincts can’t be easily dispelled, I have to conclude the post with the comments from Larry Summers at an IMF conference earlier this month that the US may be stuck in a “secular stagnation” and that the lesson from the crisis is “it’s not over until it is over, and that is surely not right now”.

Size of notional CDS market from 2001 to 2012

Sometime during early 2007, I recall having a conversation with a friend who was fretting about the dangers behind the exponential growth in the unregulated credit default swap (CDS) market. His concerns centred on the explosion of rampant speculation in the market by way of “naked” CDS trades (as opposed to covered CDS, where the purchaser has an interest in the underlying instrument). The notional CDS market size was then estimated to be considerably larger than the whole of the global bond market (sovereign, municipal, corporate, mortgage and ABS). At the time, I didn’t appreciate what the growth in the CDS market meant. Obviously, the financial crisis dramatically demonstrated the impact!

More recently the London Whale episode at JP Morgan has again highlighted the thin line between the use of CDS for hedging and for speculation. Last week I tried to find a graph that illustrated what had happened to the size of the notional CDS market since the crisis and had to dig through data from the International Swaps and Derivatives Association (ISDA) to come up with the graph below.

[Figure: Size of notional CDS market, 2001 to 2012]

Comparing the size of the notional CDS market to the size of the bond market is a flawed metric, as the notional CDS figures are made up of buyers and sellers (in roughly equal measure) and many CDS can relate to the same underlying bond. Net CDS exposures are estimated to be only a few percent of the overall market today, although that comparison ignores the not inconsiderable counterparty risk. Notwithstanding the validity of the comparison, the CDS market of $25 trillion as at the end of 2012 is still considerable compared to the approximately $100 trillion global bond market today. The dramatic changes in the size of both the CDS market (downward) and the bond market (upward) directly reflect the macroeconomic shifts resulting from the financial crisis.

The financial industry lobbied hard to ensure that CDS would not be treated as insurance under the Dodd-Frank reforms, although standardized CDS are being moved to clearing houses under the regulations, with approximately 10% of notional CDS cleared in 2012 according to the ISDA. The other initiative to reduce systemic risk is portfolio compression exercises across the OTC swap market, whereby existing trades are terminated and restructured in exchange for replacement trades with smaller notional sizes.
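Real compression runs are multilateral exercises across many dealers and are far more involved, but the toy Python sketch below captures the core idea: offsetting trades between the same pair of parties on the same reference entity are torn up and replaced with a single smaller trade. All names and numbers are invented.

```python
from collections import defaultdict

def compress(trades):
    """Toy compression: net offsetting protection trades between the same
    pair of counterparties on the same reference entity."""
    net = defaultdict(float)
    for buyer, seller, reference, notional in trades:
        key = (*sorted((buyer, seller)), reference)
        sign = 1.0 if buyer == min(buyer, seller) else -1.0  # orient consistently
        net[key] += sign * notional
    # Replacement trades carry only the net notional; fully offset pairs vanish
    return {k: v for k, v in net.items() if v != 0.0}

trades = [
    ("A", "B", "XYZ Corp", 100.0),  # A buys protection from B
    ("B", "A", "XYZ Corp", 60.0),   # B buys protection from A (offsetting)
    ("A", "C", "XYZ Corp", 50.0),   # different pair, not nettable here
]
print(compress(trades))
# {('A', 'B', 'XYZ Corp'): 40.0, ('A', 'C', 'XYZ Corp'): 50.0}
# Gross notional of 210 compresses to a net 90.
```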

Although the industry argues that naked CDS increase the liquidity of the market and aid price discovery, the academic research on the topic is mixed. In Europe, naked CDS on sovereign bonds were banned as a result of the volatility suffered by Greece during the Euro wobbles. The regulatory push of OTC markets to clearing houses possibly raises a new systemic risk: the concentration of credit risk in the clearing houses themselves! Another unintended consequence of the Dodd-Frank and Basel III regulatory changes is the futurization of swaps, as outlined in Robert Litan’s fascinating article.

Anyway, before I say something silly on a subject I know little about, I just wanted to share the graph above. I had thought that the specialty insurance sector, particularly the property catastrophe reinsurers, may be suited to a variation on a capital structure arbitrage type trade, particularly as many such insurers are increasingly using sub-debt and hybrid instruments in their capital structures (a trend Solvency II is likely to increase), as a recent announcement by Twelve Capital illustrates. I wasn’t primarily focussed on a negative correlation type trade (e.g. long equity/short debt) but more on a way of hedging tail risk on particular natural catastrophe peak zones (e.g. by purchasing CDS on the debt of an insurer overexposed to a particular zone). Unfortunately, CDS are not available on these mid sized firms (they are on larger firms like Swiss Re and Munich Re) and, even if they were, they would not be available to a small time investor like me!

Lessons not learnt and voices unheard

There have been some interesting articles published over the past week or so to mark the five year anniversary of the Lehman collapse.

Hank Paulson remembered the events of that chaotic time in a BusinessWeek interview. He concluded that, despite having had a hand in increasing the size of US banks like JP Morgan and Bank of America (currently the 2nd and 3rd largest global banks by tier 1 capital), “too big to fail is an unacceptable phenomenon”. He also highlighted the risk of incoherence amongst the numerous US and global regulators, and that “more still needs to be done with the shadow-banking markets, which I define to be the money-market funds and the so-called repo market, which supplies wholesale funding to banks”.

Another player on the regulatory side, the former chairman of the UK FSA Adair Turner, continued to develop his thoughts on what lessons need to be learnt from the crisis in the article “The Failure of Free Market Finance”, available on the Project Syndicate website. Turner has been talking about these issues in Sweden and London this week (essentially following on from his February paper “Debt, Money and Mephistopheles: How Do We Get Out Of This Mess?”). He argues that there are two key issues which need to be addressed to avert future instability.

The first is how to continue to deleverage and reduce both private and public debt. Turner believes that “some combination of debt restructuring and permanent debt monetization (quantitative easing that is never reversed) will in some countries be unavoidable and appropriate”. He says that realistic actions need to be taken, such as writing off Greek debt and restructuring Japanese debt. The two graphs below show where we stood in terms of private debt in a number of jurisdictions as at the end of 2012, and show that reductions in private debt in many developed countries have been offset by increases in public debt over recent years.

[Figure: Domestic credit to private sector, 1960 to 2012]

[Figure: Public and private debt as % of GDP: OECD, US, Japan, Euro Zone]

The second issue that Turner highlights is the need for global measures to ensure we all live in a less credit fuelled world in the future. He states that “what is required is a wide-ranging policy response that combines more powerful countercyclical capital tools than currently planned under Basel 3, the restoration of quantitative reserve requirements to advanced-country central banks’ policy toolkits, and direct borrower constraints, such as maximum loan-to-income or loan-to-value limits, in residential and commercial real-estate lending”.

Turner is arguing for powerful actions. He admits that they effectively mean “a rejection of the pre-crisis orthodoxy that free markets are as valuable in finance as they are in other economic sectors”. I do not see an appetite for such radical actions amongst the political classes, nor a consensus amongst policy makers that such a rejection is required. Indeed, debt provision outside of the traditional banking system by way of new distribution channels, such as peer-to-peer lending, is an interesting development (see the Economist article “Filling the Bank Shaped Hole”).

Indeed, the current frothiness in the equity markets, itself a direct result of the on-going (and, if the market’s response to the Fed’s decisions this week is anything to go by, never ending) loose monetary policy, is showing no signs of abating. Market gurus such as Buffett and Icahn have both come out this week and said the markets are looking overvalued. My post on a possible pullback in September is looking ever more unlikely as the month develops (the S&P 500 is up 4% so far this month!).

Maybe, just maybe, the 5th anniversary of Lehman’s collapse will allow some of the voices on the need for fundamental structural change in the way we run our economies to be heard. Unfortunately, I doubt it.