
The Bionic Invisible Hand

Technology is omnipresent. The impact of technology on markets and market structures has been a topic of much debate recently. Some point to its influence to explain the lack of volatility in equity markets (ignoring this week’s wobble). Marko Kolanovic, a JPMorgan analyst, is reported to have estimated that a mere 10% of US equity market trading is now conducted by discretionary human traders.

The first wave of high frequency trading (HFT) brought about distortive practices by certain players, such as front running and spoofing, as detailed in Michael Lewis’s bestselling exposé Flash Boys. Now HFT firms are struggling to wring profits from the incremental millisecond, as reported in this FT article, with revenues for HFT firms trading US stocks falling below $1 billion in 2017 from over $7 billion in 2009, according to the consultancy Tabb Group. According to Doug Duquette of Vertex Analytics, “it has got to the point where the speed is so ubiquitous that there really isn’t much left to get”.

The focus now is on the impact of various rules-based automatic investment systems, ranging from exchange traded funds (ETFs) to computerised high-speed trading programs to new machine learning and artificial intelligence (AI) innovations. As Tom Watson said about HFT in 2011, these new technologies have the potential to give “Adam Smith’s invisible hand a bionic upgrade by making it better, stronger and faster like Steve Austin in the Six Million Dollar Man”.

As reported in another FT article, some experts estimate that computers are now generating around 50% to 70% of trading in equity markets, 60% of futures and more than 50% of treasuries. According to Morningstar, by year-end 2017 the total assets of actively managed funds in the US stood at $11.4 trillion, compared with $6.7 trillion for passive funds.

Although the term “quant fund” covers a multitude of mutual and hedge fund strategies, under certain classifications quant funds are estimated to manage around $1 trillion in assets, out of total assets under management (AUM) invested in mutual funds globally of over $40 trillion. It is believed that machine learning or AI only drives a small subset of quant funds’ trades, although such systems are thought to be used as investment tools for developing strategies by an increasing number of investment professionals.

Before I delve into these issues further, I want to take a brief detour into the wonderful world of quantitative finance expert Paul Wilmott and his recent book, with David Orrell, called “The Money Formula: Dodgy Finance, Pseudo Science, and How Mathematicians Took Over the Markets”. I am going to try to summarize the pertinent issues highlighted by the authors in the following sequence of my favourite quotes from the book:

“If anybody can flog an already sick horse to death, it is an economist.”

“Whenever a model becomes too popular, it influences the market and therefore tends to undermine the assumptions on which it was built.”

“Real price data tend to follow something closer to a power-law distribution and are characterized by extreme events and bursts of intense volatility…which are typical of complex systems that are operating at a state known as self-organized criticality…sometimes called the edge of chaos.”

“In quantitative finance, the weakest links are the models.”

“The only half decent, yet still toy, models in finance are the lognormal random walk models for those instruments whose level we don’t care about.”

“The more apparently realistic you make a model, the less useful it often becomes, and the complexity of the equations turns the model into a black box. The key then is to keep with simple models, but make sure that the model is capturing the key dynamics of the system, and only use it within its zone of validity.”

“The economy is not a machine, it is a living, organic system, and the numbers it produces have a complicated relationship with the underlying reality.”

“Calibration is a simple way of hiding model risk, you choose the parameters so that your model superficially appears to value everything correctly when really, it’s doing no such thing.”

“When their [the quants’] schemes – their quantitative seizing – cratered, the central banks stepped in to fill the hole with quantitative easing.”

“Bandwagons beget bubbles, and bubbles beget crashes.”

“Today, it is the risk that has been created by high speed algorithms, all based on similar models, all racing to be the first to do the same thing.”

“We have outsourced ethical judgments to the invisible hand, or increasingly to algorithms, with the result that our own ability to make ethical decisions in economic matters has atrophied.”

According to Morningstar’s annual fund flow report, flows into US mutual funds and ETFs reached a record $684.6 billion in 2017 due to massive inflows into passive funds. Among fund categories, the biggest winners were passive U.S. equity, international equity and taxable bond funds with each having inflows of more than $200 billion. “Indexing is no longer limited to U.S. equity and expanding into other asset classes” according to the Morningstar report.


Paul Singer of the Elliott hedge fund, known for its aggressive activism and distressed debt focus (famous for its Argentine debt battles), dramatically said that “passive investing is in danger of devouring capitalism” and called it “a blob which is destructive to the growth-creating and consensus-building prospects of free market capitalism”.

In 2016, JP Morgan’s Nikolaos Panigirtzoglou stated that “the shift towards passive funds has the potential to concentrate investments to a few large products” and “this concentration potentially increases systemic risk making markets more susceptible to the flows of a few large passive products”. He further stated that “this shift exacerbates the market uptrend creating more protracted periods of low volatility and momentum” and that “when markets eventually reverse, the correction becomes deeper and volatility rises as money flows away from passive funds back towards active managers who tend to outperform in periods of weak market performance”.

The International Organization of Securities Commissions (IOSCO), proving that regulators are always late to the party (hopefully not too late), is to broaden its analysis on the ETF sector in 2018, beyond a previous review on liquidity management, to consider whether serious market distortions might occur due to the growth of ETFs, as per this FT article. Paul Andrews, a veteran US regulator and secretary general of IOSCO, called ETFs “financial engineering at its finest”, stated that “ETFs are [now] a critical piece of market infrastructure” and that “we are on autopilot in many respects with market capitalisation-weighted ETFs”.

Artemis Capital Management, in this report highlighted in my previous post, believes that “passive investing is now just a momentum play on liquidity” and that “large capital flows into stocks occur for no reason other than the fact that they are highly liquid members of an index”. Artemis believes that “active managers serve as a volatility buffer” and that if such a buffer is withdrawn then “there is no incremental seller to control overvaluation on the way up and no incremental buyer to stop a crash on the way down”.

Algorithmic trading (automated trading, black-box trading, or simply algo-trading) is the process of using computers programmed to follow a defined set of instructions for placing a trade in order to generate profits at a speed and frequency that is impossible for a human trader.
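
To make the idea concrete, below is a minimal, purely illustrative sketch in Python of what such a “defined set of instructions” might look like: a toy moving-average crossover rule producing buy/sell signals from a price series. The window lengths and prices are invented and no real firm’s strategy is this simple.

```python
# Toy illustration of a rules-based ("algo") trading signal.
# All prices and parameters below are invented for illustration only.

def moving_average(prices, window):
    """Average of the last `window` prices."""
    return sum(prices[-window:]) / window

def crossover_signal(prices, fast=5, slow=20):
    """'BUY' when the fast average crosses above the slow one,
    'SELL' when it crosses below, otherwise 'HOLD'."""
    if len(prices) < slow + 1:
        return "HOLD"
    fast_now, slow_now = moving_average(prices, fast), moving_average(prices, slow)
    fast_prev, slow_prev = moving_average(prices[:-1], fast), moving_average(prices[:-1], slow)
    if fast_prev <= slow_prev and fast_now > slow_now:
        return "BUY"
    if fast_prev >= slow_prev and fast_now < slow_now:
        return "SELL"
    return "HOLD"

# Hypothetical price history for a single instrument
prices = [100, 101, 99, 98, 97, 99, 102, 104, 103, 105, 106,
          107, 105, 104, 106, 108, 110, 109, 111, 112, 113]
print(crossover_signal(prices))
```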

Machine learning uses statistical techniques to infer relationships between data. The artificial intelligence “agent” does not have an algorithm to tell it which relationships it should find but infers, or learns if you like, from the data using statistical analysis to revise its hypotheses. In supervised learning, the machine is presented with examples of input data together with the desired output. The AI agent works out a relationship between the two and uses this relationship to make predictions given further input data. Supervised learning techniques, such as Bayesian regression, are useful where firms have a flow of input data and would like to make predictions.
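
As a minimal sketch of the supervised case, the snippet below fits a Bayesian ridge regression (one flavour of Bayesian regression, here via scikit-learn) to made-up input/output pairs and then makes predictions, with an uncertainty estimate, on further inputs. The data and the model choice are assumptions for illustration only.

```python
# Minimal supervised learning sketch: learn a mapping from labelled
# examples, then predict on unseen inputs. Data is synthetic.
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                   # input data (e.g. three signals)
true_weights = np.array([0.5, -0.2, 0.1])       # the "hidden" relationship
y = X @ true_weights + rng.normal(scale=0.05, size=200)   # desired output

model = BayesianRidge()
model.fit(X, y)                                 # the agent infers the relationship

X_new = rng.normal(size=(5, 3))                 # further input data
pred, std = model.predict(X_new, return_std=True)
print(pred)                                     # point forecasts
print(std)                                      # uncertainty around each forecast
```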

Unsupervised learning, in contrast, does without learning examples. The AI agent instead tries to find relationships between input data by itself. Unsupervised learning can be used for problems such as determining which data points are similar to each other. As an example of unsupervised learning, cluster analysis is a statistical technique whereby data or objects are classified into groups (clusters) that are similar to one another but different from data or objects in other clusters.

Firms like Bloomberg use cluster analysis in their liquidity assessment tool which aims to cluster bonds with sufficiently similar behaviour so their historical data can be shared and used to make general predictions for all bonds in that cluster. Naz Quadri of Bloomberg, with the wonderful title of head of quant engineering and research, said that “some applications of clustering were more useful than others” and that their analysis suggests “clustering is most useful, and results are more stable, when it is used with a structural market impact model”. Market impact models are widely used to minimise the effect of a firm’s own trading on market prices and are an example of machine learning in practice.
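
To be clear, Bloomberg’s actual methodology is proprietary; the snippet below is just a generic k-means clustering sketch on invented bond features (duration, spread, trading frequency) to show the mechanics of grouping similar instruments so that sparse ones can borrow history from their cluster peers.

```python
# Generic unsupervised clustering sketch (k-means) on synthetic "bond"
# features. Illustrative only and unrelated to any vendor's actual tool.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical features per bond: duration (years), spread (bps), trades per day
bonds = np.column_stack([
    rng.uniform(1, 30, size=300),
    rng.uniform(20, 600, size=300),
    rng.poisson(5, size=300),
])

features = StandardScaler().fit_transform(bonds)    # put features on one scale
labels = KMeans(n_clusters=4, n_init=10, random_state=1).fit_predict(features)

# Bonds sharing a label are treated as behaving similarly, so thinly traded
# instruments can "borrow" history from their cluster peers.
print(np.bincount(labels))
```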

In November 2017, the Financial Stability Board (FSB) released a report called “Artificial Intelligence and Machine Learning in Financial Services”. In the report the FSB highlighted some of the current and potential use cases of AI and machine learning, as follows:

  • Financial institutions and vendors are using AI and machine learning methods to assess credit quality, to price and market insurance contracts, and to automate client interaction.
  • Institutions are optimising scarce capital with AI and machine learning techniques, as well as back-testing models and analysing the market impact of trading large positions.
  • Hedge funds, broker-dealers, and other firms are using AI and machine learning to find signals for higher (and uncorrelated) returns and optimise trading execution.
  • Both public and private sector institutions may use these technologies for regulatory compliance, surveillance, data quality assessment, and fraud detection.

The FSB report states that “applications of AI and machine learning could result in new and unexpected forms of interconnectedness” and that “the lack of interpretability or ‘auditability’ of AI and machine learning methods has the potential to contribute to macro-level risk”. Worryingly they say that “many of the models that result from the use of AI or machine learning techniques are difficult or impossible to interpret” and that “many AI and machine learning developed models are being ‘trained’ in a period of low volatility”. As such “the models may not suggest optimal actions in a significant economic downturn or in a financial crisis, or the models may not suggest appropriate management of long-term risks” and “should there be widespread use of opaque models, it would likely result in unintended consequences”.

With increased use of machine learning and AI, we are seeing the potential rise of self-driving investment vehicles. Using self-driving cars as a metaphor, Artemis Capital highlights that “the fatal flaw is that your driving algorithm has never seen a mountain road” and that “as machines trade against each other, self-reflexivity is amplified”. Others point out that machine learning in trading may involve machine learning algorithms learning the behaviour of other machine learning algorithms, in a regressive loop, all drawing on the same data and the same methodology. 13D Research opined that “when algorithms coexist in complex systems with subjectivity and unpredictability of human behaviour, unforeseen and destabilising downsides result”.

It is said that there is nothing magical about quant strategies. Quantitative investing is an approach for implementing investment strategies in an automated (or semi-automated) way. The key seems to be data, its quality and its uniqueness. A hypothesis is developed and tested and tested again against various themes to identify anomalies or inefficiencies. Jim Simons of Renaissance Technologies (called RenTec), one of the oldest and most successful quant funds, said that the “efficient market theory is correct in that there are no gross inefficiencies” but “we look at anomalies that may be small in size and brief in time. We make our forecast. Then, shortly thereafter, we re-evaluate the situation and revise our forecast and our portfolio. We do this all day long. We’re always in and out and out and in. So we’re dependent on activity to make money”. Simons emphasised that RenTec “don’t start with models” but “we start with data” and “we don’t have any preconceived notions”. They “look for things that can be replicated thousands of times”.
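
As a toy illustration of that hypothesis-test-revise loop, the sketch below checks an invented anomaly (does a daily fall of more than 2% tend to be followed by a positive next-day return?) against simulated returns. The anomaly, the threshold and the data are all made up; it only shows the shape of the exercise.

```python
# Toy "anomaly test": after a daily fall of more than 2%, is the average
# next-day return positive? The returns here are simulated, not real data.
import numpy as np

rng = np.random.default_rng(42)
returns = rng.normal(loc=0.0003, scale=0.01, size=5000)   # fake daily returns

signal_days = np.where(returns[:-1] < -0.02)[0]           # big down days
next_day = returns[signal_days + 1]                       # the return that follows

print(f"occurrences: {len(next_day)}")
print(f"average next-day return: {next_day.mean():.4%}")
```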

Robert Mercer, the recently departed co-CEO of RenTec [yes, the Mercer who backs Breitbart, which adds a scary political Big Brother surveillance angle to this story], has said “RenTec gets a trillion bytes of data a day, from newspapers, AP wire, all the trades, quotes, weather reports, energy reports, government reports, all with the goal of trying to figure out what’s going to be the price of something or other at every point in the future… The information we have today is a garbled version of what the price is going to be next week. People don’t really grasp how noisy the market is. It’s very hard to find information, but it is there, and in some cases it’s been there for a long long time. It’s very close to science’s needle in a haystack problem”.

Kumesh Aroomoogan of Accern recently said that “quant hedge funds are buying as much data as they can”. The so-called “alternative data” market was worth about $200 million in the US in 2017 and is expected to double in four years, according to research and consulting firm Tabb Group. The explosion of data that has become, and is becoming, available in this technological revolution should keep the quants busy, for a while.

However, what’s scaring me is that these incredibly clever people will inevitably end up farming through the same data sets, coming to broadly similar conclusions, and the machines that have learned each other’s secrets will all start heading for the exits at the same time, in real time, in the mother of all quant flash crashes. That sounds too much like science fiction to ever happen though, right?

A frazzled Goldilocks?

Whatever measure you look at, equities in the US are overvalued, arguably in bubble territory. Investors have poured record amounts into equity funds in recent weeks as the market melt-up takes hold. One of the intriguing features of the bull market over the past 18 months has been the extraordinarily low volatility. Hamish Preston of S&P Dow Jones Indices estimated that the average observed 1-month volatility in the S&P 500 in 2017 was “lower than in any other year since 1970”. To illustrate the point, the graph below shows the monthly change in the S&P500 over recent years.


The lack of any move below 0% since November 2016, or of any pullback greater than 2% since January 2016, is striking. “Don’t confuse lack of volatility with stability, ever” is a quote from Nassim Nicholas Taleb that seems particularly apt today.

Andrew Lapthorne of SocGen highlighted that low risk markets tend to have a big knock-on effect with a “positive feedback mechanism embedded in many risk models”. In other words, the less risk is observed in the market and used as the basis for model inputs, the more risk the quant models allow investors to take! [The impact of quant models and shadow risks from passive investing and machine learning are areas I hope to explore further in a future post.]
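
A simplified way to see that feedback is a volatility-targeting position sizer: leverage is set roughly as target volatility divided by recently observed volatility, so quieter markets mechanically produce bigger positions. The numbers below are invented purely to show the arithmetic, not any firm’s actual model.

```python
# Illustrative volatility-targeting arithmetic (not any firm's actual model).
# Leverage scales inversely with observed volatility, so low realised vol
# feeds straight back into larger positions.
target_vol = 0.10                       # hypothetical 10% annualised vol target
max_leverage = 4.0                      # hypothetical cap

for realised_vol in (0.20, 0.10, 0.05, 0.025):
    leverage = min(target_vol / realised_vol, max_leverage)
    print(f"realised vol {realised_vol:.1%} -> leverage {leverage:.1f}x")
```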

One risk that has the potential to spoil the party in 2018 is the planned phased normalisation of monetary policy around the world after the great experimentations of recent years. The market is currently assuming that Central Banks will guarantee that Goldilocks will remain unfrazzled as they deftly steer the ship back to normality. A global “Goldilocks put”, if I may plagiarise “the Greenspan put”! Or a steady move away from the existing policy that no greater an economic brain than Donald Trump summarised as: “they’re keeping the rates down so that everything else doesn’t go down”.

The problem for Central Banks is that if inflation stays muted in the short term and monetary policy remains loose, then the asset bubbles will reach unsustainable levels and require pricking. Alternatively, any attempt at monetary policy normalisation may dramatically show how Central Banks have become the primary providers of liquidity in capital markets, and that even modest tightening could result in dangerous imbalances within the now structurally dependent system.

Many analysts (and the number is surprisingly large) have been warning for some time about the impact of QE flows tightening in 2018. These warnings have been totally ignored by the market, as the lack of volatility illustrates. For example, in June 2017, Citi’s Matt King projected future Central Bank liquidity flows and warned that a “significant unbalancing is coming”. In November 2017, Deutsche Bank’s Alan Ruskin commented that “2018 will see the world’s most important Central Bank balance sheets shift from a 12 month expansion of more than $2 trillion, to a broadly flat position by the end of 2018, assuming the Fed and ECB act according to expectations”. The projections Deutsche Bank produced are below.


Andrew Norelli of JP Morgan Asset Management, in a piece called “Stock, Flow or Impulse?”, stated that “It’s still central bank balance sheets, and specifically the flow of global quantitative easing (QE) that is maintaining the buoyancy in financial asset prices”. JP Morgan’s projections for the top four developed economies are below.


Lance Roberts of RealInvestmentAdvice.com produced an interesting graph specifically relating to the Fed’s balance sheet, as below. Caution should be taken with any upward trending metric when compared to the S&P500 in recent years!


Of course, we have been at pre-taper junctions many times before and every previous jitter has been met with soothing words from Central Banks and more liquidity creation. This time though it feels different. It has to be different. Or Central Bankers risk being viewed as emperors without clothes.

The views of commentators differ widely on this topic. Most of the business media talking heads are wildly positive (as they always are) on the Goldilocks status quo. John Mauldin of MauldinEconomics.com believes the number one risk factor in the US is Fed overreach and too much tightening. Bank of America Merrill Lynch chief investment strategist Michael Hartnett, fears a 1987/1994/1998-style flash crash within the next three months caused by a withdrawal of central bank support as interest rates rise.

Christopher Cole of Artemis Capital Management, in a wonderful report called “Volatility and the Alchemy of Risk”, pulls no punches about the impact of global central banks having pumped $15 trillion in cheap money stimulus into capital markets since 2009. Cole comments that “amid this mania for investment, the stock market has begun self-cannibalizing” and draws upon the image of the ouroboros, an ancient Greek symbol of a snake eating its own tail. Cole estimates that 40% of EPS growth and 30% of US equity gains since 2009 have been a direct result of financial engineering in the form of stock buybacks. Higher interest rates, according to Cole, will be needed to combat the higher inflation that will result from this liquidity bonanza and will cut off the supply for the annual $800 billion of share buybacks. Cole also points to the impact on the high yield corporate debt market and the overall impact on corporate defaults.
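
To see how buybacks flatter EPS even when underlying earnings are flat, here is a stylised calculation with invented numbers: a company earning $100m on 100m shares that retires 5% of its shares each year.

```python
# Stylised illustration of buybacks boosting EPS with flat underlying
# earnings. All figures are invented.
earnings = 100.0          # $m of net income, held flat
shares = 100.0            # million shares outstanding
buyback_rate = 0.05       # retire 5% of shares per year (hypothetical)

for year in range(6):
    eps = earnings / shares
    print(f"year {year}: shares {shares:6.1f}m, EPS ${eps:.3f}")
    shares *= (1 - buyback_rate)
# EPS grows roughly 29% over five years with no growth in earnings at all.
```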

Another interesting report, from a specific investment strategy perspective, comes from Fasanara Capital’s Francesco Filia and is cheerfully entitled “Fragile Markets On The Edge of Chaos”. As economies transition from peak QE to quantitative tightening, Filia “expect[s] markets to face their first real crash test in 10 years” and says that “only then will we know what is real and what is not in today’s markets, only then will we be able to assess how sustainable is the global synchronized GDP growth spurred by global synchronized monetary printing”. I like the graphic below from the report.


I found the reaction to the Trump administration’s misstep on dollar strength interesting this week. Aditya Bhave and Ethan Harris, economists at Bank of America, said of the episode that “the Fed will see the weak dollar as a sign of easy financial conditions and a green light to keep tightening monetary policy”. ECB President Mario Draghi was not happy about the weak dollar statement as that would complicate Europe’s quantitative tightening plans. It was also interesting to hear Benoit Coeure, a hawkish member of the ECB executive board, saying this week that “it’s happening at different paces across the region, but we are moving to the point where we see wages going up”.

I think many of the Central Banks in developed countries are running out of wriggle room and the markets have yet to fully digest that reality. I fear that Goldilocks is about to get frazzled.

Telecoms’ troubles

The telecom industry is in a funk. S&P recently said that their “global 2017 base-case forecast is for flat revenues” and other analysts are predicting little growth in traditional telecoms’ top lines over the coming years across most developed markets. This recent post shows that wireless revenue for the largest US firms has basically flatlined, with growth of only 1% from 2015 to 2016. Cord cutting in favour of wireless has long been a feature of incumbent wireline firms, but now wireless carriers’ lunch is increasingly being eaten by disruptive new players such as Facebook’s Messenger, Apple’s FaceTime, Google’s Hangouts, Skype, Tencent’s QQ and WeChat, and WhatsApp. These competitors are called over the top (OTT) providers and they use IP networks to provide communications (e.g. voice & SMS), content (e.g. video) and cloud-based (e.g. compute and storage) offerings. The telecom industry is walking a fine line between enabling these competitors and protecting its traditional businesses.

The graph below from a recent TeleGeography report provides an illustration of what has happened in the international long-distance business.


A recent McKinsey article predicts that in an aggressive scenario the share of messaging, fixed voice, and mobile voice revenue provided by OTT players could be within the ranges as per the graph below by 2018.


Before the rapid rise of the OTT players, it was expected that telecoms could recover the loss of revenue from traditional services through increased data traffic over IP networks. Global IP traffic has exploded from 26 exabytes per annum in 2005 to 1.2 zettabytes in 2016 and is projected to grow, by the latest Cisco estimates here, at a CAGR of 24% to 2021. See this previous post on the ever-expanding metrics used for IP traffic (for reference, a gigabyte/terabyte/petabyte/exabyte/zettabyte/yottabyte is 1,000 to the power of 3, 4, 5, 6, 7 and 8 bytes respectively).
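
For anyone whose eyes glaze over at the prefixes, the short sketch below spells them out and applies the quoted 24% CAGR to the 1.2 zettabytes of 2016 traffic. The extrapolated figures are my own back-of-the-envelope numbers, not Cisco’s published forecasts.

```python
# Decimal byte prefixes plus a rough CAGR extrapolation of global IP traffic.
# The 1.2 ZB starting point and 24% CAGR are from the post; the yearly
# figures below are back-of-the-envelope, not Cisco's own forecasts.
prefixes = ["kilo", "mega", "giga", "tera", "peta", "exa", "zetta", "yotta"]
for power, name in enumerate(prefixes, start=1):
    print(f"1 {name}byte = 1,000^{power} bytes = 10^{3 * power} bytes")

traffic_zb = 1.2          # zettabytes per annum in 2016
cagr = 0.24
for year in range(2016, 2022):
    print(f"{year}: ~{traffic_zb:.1f} ZB per annum")
    traffic_zb *= 1 + cagr
```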

According to the 2017 OTT Video Services Study conducted by Level 3 Communications, viewership of OTT video services, including Netflix, Hulu and Amazon Prime, will overtake traditional broadcast TV within the next five years, impacting cable firms and traditional telecoms’ TV services alike. With OTT players eating telecoms’ lunch, Ovum estimates a drop of a third in spending on traditional communication services over the next ten years.

Telecom and cable operators have long complained of unfair treatment given their investments in upgrading networks to handle the vast increase in data created by the very OTT players that are cannibalizing their revenue. For example, Netflix is estimated to consume as much as a third of total network bandwidth in the U.S. during peak times. Notwithstanding their growth, it’s important to see these OTT players as customers of the traditional telecoms as well as competitors, and increasingly telecoms are coming to understand that they need to change and digitalise their business models to embrace new opportunities. The graphic below, not to scale, on changing usage trends illustrates the changing demands for telecoms as we enter the so called “digital lifestyle era”.


The hype around the internet of things (IoT) is getting deafening. Just last week, IDC predicted that “by 2021, global IoT spending is expected to total nearly $1.4 trillion as organizations continue to invest in the hardware, software, services, and connectivity that enable the IoT”.

Bain & Co argue strongly in this article in February that telecoms, particularly those who have taken digital transformation seriously in their own operating models, are “uniquely qualified to facilitate the delivery of IoT solutions”. The reasons cited include their experience of delivering scale connectivity solutions, of managing extensive directories and the life cycles of millions of devices, and their strong position developing and managing analytics at the edge of the network across a range of industries and uses.

Upgrading networks to 5G is seen as necessary to enable the IoT age, and the hype around 5G has increased along with the IoT hype and the growth in the smartphone ecosystem. But 5G is still in a development stage and technological standards need to be finalised. S&P commented that “we don’t expect large scale commercial 5G rollout until 2020”.

So what can telecoms do in the interim about declining fundamentals? The answer is for telecoms to rationalise and digitalize their business. A recent McKinsey IT benchmarking study of 80 telecom companies worldwide found that top performers had removed redundant platforms, automated core processes, and consolidated overlapping capabilities. New technologies such as software-defined networks (SDN) and network-function virtualization (NFV) mean telecoms can radically reshape their operating models. Analytics can be used to determine smarter capital spending, machine learning can be used to increase efficiency and avoid overloads, back offices can be automated, and customer support can be digitalized. This McKinsey article claims that mobile operators could double their operating cashflow through digital transformation.
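
As a hedged illustration of the “machine learning to avoid overloads” point, the toy sketch below fits a regression to synthetic hourly traffic and flags hours where the forecast load approaches an assumed capacity. Real operator tooling is of course far richer; every figure and threshold here is invented.

```python
# Toy sketch: forecast hourly network load and flag potential overloads.
# Synthetic data and thresholds only; real carrier systems are far richer.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(7)
hours = np.arange(24 * 60)                       # 60 days of hourly observations
hour_of_day = hours % 24
# Fake daily traffic pattern peaking mid-day, plus noise
load = 40 + 30 * np.sin((hour_of_day - 6) / 24 * 2 * np.pi) + rng.normal(0, 5, hours.size)

model = GradientBoostingRegressor().fit(hour_of_day.reshape(-1, 1), load)

capacity = 75.0                                  # hypothetical capacity units
forecast = model.predict(np.arange(24).reshape(-1, 1))   # next day's hourly profile
for h, f in enumerate(forecast):
    if f > 0.9 * capacity:
        print(f"hour {h:02d}: forecast load {f:.0f} is near capacity {capacity:.0f}")
```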

However, not all telecoms are made the same and some do not have a culture that readily embraces transformation. McKinsey say that “experience shows that telcos have historically only found success in transversal products (for example, security, IoT, and cloud services for regional small and medium-size segments)” and that in other areas, “telcos have developed great ideas but have failed to successfully execute them”.

Another article from Bain & Co argues that only “one out of eight providers could be considered capital effective, meaning that they have gained at least 1 percentage point of market share each year over the past five years without having spent significantly more than their fair share of capital to do so”. As can be seen below, the rest of the sector is either caught in an efficiency trap (e.g. spending less capital than competitors but not gaining market share) or is just wasteful with its capex spend.


So, although there are many challenges for this sector, there are also many opportunities. As with every enterprise in this digital age, it will be those firms that can execute at scale that are likely to be the big winners. Pure telecommunications companies could become extinct, or so radically altered in focus and diversity of operations that telecoms as a term may become redundant. Content production could be mixed with delivery to make joint content communication giants. Or IT services such as security, cloud services, analytics, automation and machine learning could be combined with next generation intelligent networks. Who knows! One thing is for sure though: the successful firms will be the ones with management teams that can execute a clear strategy profitably in a fast changing competitive sector.

Pimping the Peers (Part 1)

Fintech is a currently much-hyped term that covers an array of new financial technologies. It includes technology providers of financial services, new payment technologies, mobile money and currencies like bitcoin, robo-advisers, crowd funding and peer to peer (P2P) lending. Blockchain is another technology that is being hyped, with multiple potential uses. I posted briefly on the growth in P2P lending and crowd-funding before (here and here) and it’s the former that is primarily the focus of this post.

Citigroup recently released an interesting report on the digital disruption impact of fintech on banking which covers many of the topics above. The report claims that $19 billion was invested in fintech firms in 2015, with the majority focussed on the payments area. In terms of the new entrants into the provision of credit space, the report highlights that over 70% of fintech investments to date have been in the personal and SME business segments.

In the US, Lending Club and Prosper are two of the oldest and more established firms in the marketplace lending sector with a focus on consumer lending. Although each is growing rapidly and originated loans in the multiple billions in 2015, the firms have been having a rough time of late, with rates being increased to counter poor credit trends. Public firms have suffered from the overall negative sentiment on banks in this low/negative interest rate environment. Lending Club, which went public in late 2014, is down about 70% since then, whilst Prosper went for institutional investment instead of an IPO last year. In fact, the P2P element of the model has been usurped as most of the investors are now institutional yield seekers such as hedge funds, insurers and increasingly traditional banks. JP Morgan invested heavily in another US firm called OnDeck, an online lending platform for small businesses, late in 2015. As a result, marketplace lending is now the preferred term for the P2P lenders as the “peer” element has faded.

Just like other disruptive models in the technology age (eBay and Airbnb are examples), these models initially promised a future different from the past, the so-called democratisation impact of technology, but have now started to resemble new technology-enabled distribution platforms with capital provided by already established players in their sectors. Time and time again, digital disruption has eroded distribution costs across many industries. The graphic from the Citi report below on the digital disruption impact on different industries is interesting.

(Chart: Digital Disruption)

Marketplace lending is still small relative to traditional banking and only accounts for less than 1% of loans outstanding in the UK and the US (and even in China where its growth has been the most impressive at approx 3% of retail loans). Despite its tiny size, as with any new financial innovation, concerns are ever-present about the consequences of change for traditional markets.

Prosper had to radically change its underwriting process after a shaky start. One of their executives was recently quoted as saying that they “will soon be on our sixth risk model”. Marrying new technology with quality credit underwriting expertise (ignoring the differing cultures of each discipline) is a key challenge for these fledgling upstarts. An executive at Kreditech, a German start-up, claimed that they are “a tech company who happens to be doing lending”. Critics point to the development of the sector in a benign default environment with low interest rates, where borrowers can easily refinance and the churning of loans is prevalent. Adair Turner, the ex-FSA regulator, recently stirred up the new industry with the widely reported comment that “the losses which will emerge from peer-to-peer lending over the next five to 10 years will make the bankers look like lending geniuses”. A split of the 2014 loan portfolio of Lending Club in the Citi report, as below, illustrates the concern.

(Chart: Lending Club Loan By Type)

Another executive from the US firm SoFi, which is focused on student loans, claims that the industry is well aware of the limitations of credit underwriting driven solely by technology, with the comment that “my daughter could come up with an underwriting model based upon which band you like and it would work fine right now”. Some of the newer technology firms make grand claims involving superior analytics which, combined with techniques like behavioural economics and machine learning, they contend will be able to sniff out superior credit risks.
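
For what a machine-learnt credit score looks like in principle (and only in principle), the sketch below fits a logistic regression to synthetic borrower features and scores a hypothetical applicant. It is not any platform’s actual underwriting model and the features and coefficients are invented.

```python
# Minimal credit-scoring sketch: logistic regression on synthetic borrower
# features. Purely illustrative; not any lender's actual underwriting model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 2000
income = rng.normal(50, 15, n)              # hypothetical features
debt_ratio = rng.uniform(0, 1, n)
credit_history = rng.integers(0, 25, n)     # years of history

# Synthetic "true" default process, invented for illustration
logit = -2.0 + 3.0 * debt_ratio - 0.03 * income - 0.05 * credit_history
defaulted = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([income, debt_ratio, credit_history])
model = LogisticRegression(max_iter=1000).fit(X, defaulted)

applicant = np.array([[45.0, 0.6, 4.0]])    # a hypothetical new applicant
print(f"estimated default probability: {model.predict_proba(applicant)[0, 1]:.1%}")
```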

The real disruptive impact that may occur is that these newer technology-driven firms will, as Antony Jenkins, the former CEO of Barclays, commented, “compel banks to significantly automate their business”. The Citigroup report has interesting statistics on the traditional banking model, as per the graphs below. 60% to 70% of employees in retail banking, the largest profit segment for European and US banks, are supposedly doing manual processing which can be replaced by automation.

(Chart: Banking Sector Forecasts, Citi GPS)

Another factor driving the need to automate the banks is the cyber security weaknesses in patching multiple legacy systems together. According to the Citigroup report, “the US banks on average appear to be about 5 years behind Europe who are in turn about a decade behind Nordic banks”. Within Europe, it is interesting to look at the trends in bank employee figures in the largest markets, as per the graph below. France in particular looks to be out of step with other countries.

(Chart: European Bank Employees)

Regulators are also starting to pay attention. Just this week, after a number of scams involving online lenders, the Chinese central bank instigated a crackdown and constituted a multi-agency task force. In the US, there could be a case heard by the Supreme Court which may create significant issues for many online lenders. The Office of the Comptroller of the Currency recently issued a white paper to solicit industry views on how such new business models should be regulated. John Williams of the San Francisco Federal Reserve gave a speech at a recent marketplace lending conference which included the lucid point that “as a matter of principle, if it walks like a duck and quacks like a duck, it should be regulated like a duck”.

In the UK, regulators have taken a gentler approach whereby the new lending business models apply for Financial Conduct Authority authorisation under the 36H regulations, which are less stringent than the regimes which apply to more established activities, such as collective investment schemes. The FCA also launched “Project Innovate” last year where new businesses work together with the FCA on their products in a sandbox environment.

Back in 2013, in this post, I asked the question whether financial innovation always ends in lower risk premia. In the reinsurance sector, the answer to that question is yes in relation to insurance linked securities (ILS), as this recent post on current pricing shows. It has occurred to me that the new collateralised ILS structures are not dissimilar in methodology to the 100% reserve banks, under the so-called Chicago plan, which economists such as Irving Fisher, Henry Simons and Milton Friedman proposed in the 1930s and 1940s. I have previously posted on my difficulty in understanding how the fully collateralised insurance model can possibly accept lower risk premia than the “fractional” business models of traditional insurers (as per this post). The reduced costs of the ILS model or the uncorrelated diversification for investors cannot fully compensate for the higher capital required, in my view. I suspect that the reason is hiding behind a dilution of underwriting standards and/or leverage being used by investors to juice their returns. ILS capital is now estimated to make up 12% of overall reinsurance capital and its influence on pricing across the sector has been considerable. In Part 2 of this post, I will look into some of the newer marketplace insurance models being developed (it also needs a slick acronym – InsurTech).
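
A crude, assumption-heavy illustration of why the lower risk premia puzzle me: if a collateralised vehicle must post the full limit as capital while a traditional reinsurer backs the same cover with only a fraction of the limit, then for the same premium the collateralised investor’s return on capital is mechanically lower, absent leverage or different risk selection. The figures below are invented solely to show the arithmetic.

```python
# Crude comparison of fully collateralised vs "fractional" capital models.
# All figures are invented; expenses, investment income and tail risk ignored.
limit = 100.0                      # $m of cover written
rate_on_line = 0.08                # premium as a share of limit (hypothetical)
premium = limit * rate_on_line

collateralised_capital = limit             # full limit posted as collateral
fractional_capital = 0.35 * limit          # hypothetical rated balance sheet capital

print(f"premium: ${premium:.1f}m")
print(f"collateralised return on capital: {premium / collateralised_capital:.1%}")
print(f"fractional return on capital:     {premium / fractional_capital:.1%}")
# Before any losses: 8.0% vs ~22.9%. The collateralised model needs higher
# pricing, investor-level leverage or different risk selection to compete.
```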

Marketplace lending is based upon the same fully capitalised idea as ILS and 100% reserve banks. As can be seen from the Citigroup exhibits, there is plenty of room to compete with the existing banks on costs, although nobody, not yet anyway, is claiming that such models have a lower cost of capital than the fractional reserve banks. It is important not to exaggerate the impact of new models like marketplace lending on the banking sector given their currently immaterial size. The impact of technology on distribution channels and on credit underwriting is likely to be of greater significance.

The indirect impact of financial innovation on underwriting standards prior to the crisis is a lesson that we must learn. To paraphrase an old underwriting adage, we should not let the sweet smell of shiny new technology distract us from the stink of risk, particularly where such risk involves irrational human behaviour. The now infamous IMF report in 2006 which stated that financial innovation had “increased the resilience of the financial system” cannot be forgotten.

I am currently reading a book called “Between Debt and the Devil” by the aforementioned Adair Turner, in which he argues that private credit creation, if left solely to the free market under our existing frameworks, will overfund secured lending on existing real estate (which by its nature is finite), creating unproductive volatility and financial instability as oversupply meets physical constraints. Turner’s book covers many of the same topics and themes as Martin Wolf’s book (see this post). Turner concludes that we need to embrace policies which actively encourage a less credit intensive economy.

It is interesting to see that the contribution of the financial sector to the economy has not reduced significantly since the crisis, as the graph on the US GDP mix below illustrates. The financialization of modern society does not seem to have abated much. Indeed, the contribution of the financials to the value of the S&P500 has not decreased materially since the crisis either (as can be seen in the graph in this post).

(Chart: US GDP Breakdown 1947 to 2014)

Innovation which makes business more efficient is a feature of the creative destruction capitalist system which has increased productivity and wealth across generations. However, financial innovation which results in changes to the structure of markets, particularly concerning banking and credit creation, has to be carefully considered and monitored. John Kay, in a recent FT piece, articulated the dangers of our interconnected financial world elegantly, as follows:

Vertical chains of intermediation, which channel funds directly from savers to the uses of capital, can break without inflicting much collateral damage. When intermediation is predominantly horizontal, with intermediaries mostly trading with each other, any failure cascades through the system.

When trying to understand the potential impacts of innovations like new technology driven underwriting, I like to go back to an exhibit I created a few years ago to illustrate how financial systems have been impacted at times of supposed innovation in the past.

(Chart: Quote Money Train)

Change is inevitable and advances in technology cannot, nor should they, be restrained. Human behaviour, unfortunately, doesn’t change all that much, and therefore how technological advances in the financial sector could impact stability needs to be ever present in our thoughts. That is particularly important today when global economies face such transformational questions over the future of credit creation and money.