Tag Archives: Nvidia

Flying High

As the S&P 500 grapples with the 2,800 mark, it has achieved an impressive 12% gain year to date. Given the near 17 forward PE, a pause or a pull-back whilst macro events like Brexit and the US-China trade talks are resolved is a possibility. I thought it would be worthwhile looking at some of the high flyers in the market to search for value.

I selected a group of 12 stocks that have increased by 25% on average since the beginning of the year. The list is dominated by business software firms squarely in the SaaS, cloud and AI hype: ServiceNow (NOW), Workday (WDAY), Tableau Software (DATA), Splunk (SPLK), Adobe (ADBE), Salesforce (CRM), Palo Alto Networks (PANW) and the smaller Altair Engineering (ALTR). Others included in my sample are Square (SQ), Paypal (PYPL), VMware (VMW) and my old friend Nvidia (NVDA).

Using data from Yahoo Finance, I compared each firm’s valuation, based upon today’s close, by setting their 2019 projected PE against their PEG, calculated using projected EPS growth for the next 3 years. The results are below.


These are not cheap stocks (a PEG at or below 1 is considered undervalued). As per this FT article, the CEO of ServiceNow, John Donahoe, summed up the market’s love of some of these stocks by saying “investors value, first and foremost, growth”. By any measure, “value” in that quote is an understatement. I have never been good at playing hyped stocks; I just can’t get my head around these valuations. I do think it indicates that the market has got ahead of itself in its love of growth. I am going to focus on the two most “reasonably” valued stocks on a PEG basis in the graph above – Nvidia and Altair – by running my own numbers (I always distrust consensus figures).
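For anyone wanting to replicate the exercise, the PEG calculation is simple arithmetic: forward PE divided by expected annual EPS growth in percent. A minimal sketch, with purely illustrative inputs rather than the actual consensus figures behind my chart:

```python
# PEG = forward PE / expected annual EPS growth (in %).
# A PEG at or below 1 is conventionally viewed as undervalued.

def peg(forward_pe: float, eps_growth_pct: float) -> float:
    """Price/earnings-to-growth ratio."""
    return forward_pe / eps_growth_pct

# Illustrative inputs only (not the consensus figures used above):
# a stock on 40x 2019 earnings growing EPS at 25% a year carries a PEG of 1.6.
print(round(peg(40, 25), 2))  # 1.6
```

The same function applied across the 12 names, with each firm’s projected PE and 3-year EPS growth, reproduces the scatter of PE against PEG in the chart.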

I have posted on my journey with Nvidia previously, most recently here in November after their first revenue warning. Amazingly, even after a second big revenue warning in January from ongoing inventory and crypto-mining headwinds, the stock recovered from the $130s into the $150s before trading into the $160s in recent weeks following the Mellanox merger announcement. NVDA purchased Mellanox, an admired data centre equipment maker, at 25 times 2018 earnings (which seems reasonable given Mellanox is growing revenues at 25%).

NVDA’s recent quarterly results were worrying not only for the near 50% sequential decline in gaming but also for the 14% sequential decline in its data centre business, its second largest segment and one that had been growing strongly. Despite management’s assertion that the gaming segment’s quarterly run rate is $1.4 billion (Q4 was below $1 billion), I am struggling to match analyst revenue estimates for FY2020 and FY2021. The most optimistic figures that I can get to (pre-Mellanox), assuming the crypto-mining boom is removed from the trend, are $10.3 billion and $12.8 billion for FY2020 and FY2021, 8% and 4% less than the consensus (pre-Mellanox), as below.


Based upon management’s guidance on expenses (it is impressive that nearly 9,500 of their 13,300 employees are engaged in R&D), on the Mellanox deal closing in calendar year Q3 2019, and on 15 million shares repurchased each year, my estimates for EPS for FY2020 and FY2021 are $5.00 and $7.77 respectively (this FY2020 EPS figure is below analyst estimates, which exclude any Mellanox contribution). At today’s share price, that’s a PE of 33 and 21 for FY2020 and FY2021 respectively. That may look reasonable enough, given the valuations above, for a combined business that will likely grow at 20%+ in the years thereafter. However, NVDA is a firm that has just missed its quarterly numbers by over 30% and it should be treated with a degree of “show me the money”. I think the consensus figures for FY2020 on NVDA are too optimistic, so I shall watch NVDA’s progress with interest from the sidelines.
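The implied multiples are straightforward to check. A quick sketch, assuming a share price of around $165 (my approximation from the mid-$160s trading range referenced above, not an exact quote):

```python
# Implied forward PE = share price / estimated EPS.
price = 165.00  # assumed: NVDA trading in the mid-$160s
my_eps_estimates = {"FY2020": 5.00, "FY2021": 7.77}  # my estimates from the text

for year, eps in my_eps_estimates.items():
    print(year, round(price / eps))  # FY2020 ~33, FY2021 ~21
```

A dollar or two either way on the share price moves the FY2020 multiple by less than one point, so the conclusion is not sensitive to the exact price assumed.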

Altair Engineering (ALTR) is not the usual hyped firm. ALTR provides an integrated suite of multi-disciplinary computer aided engineering software that optimizes design performance across various disciplines, and recently purchased an AI firm called Datawatch. ALTR is led by the impressive James Scapa and has built a highly specialised platform with significant growth potential. The revenue projections for the firm, including Datawatch and another acquisition, SimSolid, with 2018 and prior on an ASC 605 basis and 2019 on an ASC 606 basis, are below. The reason for the relatively flat quarter-on-quarter revenue is the conversion of the Datawatch business to a SaaS basis and its integration into the Altair platforms.


For 2019 through 2021, my estimates for EPS are $0.62, $0.81 and $1.17 respectively (the 2019 and 2020 figures are over 10% higher than consensus). At the current share price of $38.32, that’s PE ratios of 63, 47, and 33. A rich valuation indeed. And therein lies the problem with high growth stocks. ALTR is a fantastic firm but its valuation is not. Another one for the watchlist.
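The same price-over-EPS arithmetic applies here. A sketch using the figures from the text (any small difference from the ratios quoted above comes down to rounding of the EPS inputs):

```python
# Forward PE at the current share price for each year's estimated EPS.
price = 38.32
my_eps_estimates = {2019: 0.62, 2020: 0.81, 2021: 1.17}  # my estimates

ratios = {year: round(price / eps) for year, eps in my_eps_estimates.items()}
# Prints roughly {2019: 62, 2020: 47, 2021: 33}; the unrounded EPS inputs
# behind the text's figures nudge the 2019 multiple to 63.
print(ratios)
```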

A naughty or nice 2019?

They say that if you keep making the same prediction, at some stage it will come true. Well, my post a year ago on the return of volatility in 2018 eventually proved prescient (I made the same prediction for 2017!). Besides the equity markets (multiple posts, with the latest one here), the non-company-specific topics covered in this blog in 2018 ranged from the telecom sector (here), insurance (here, here, and here), and climate change (here and here), to my own favourite posts on artificial intelligence (here, here and here).

The most popular post this year (by far, thanks to a repost by InsuranceLinked) was on the Lloyd’s of London market (here) and I again undertake to try to post more on insurance-specific topics in 2019. My company-specific posts in 2018 centred on CenturyLink (CTL), Apple (AAPL), PaddyPowerBetfair (PPB.L), and Nvidia (NVDA). Given that I am now on the sidelines on all these names, except CTL, until their operating results justify my estimate of fair value and the market direction is clearer, I hope to widen the range of firms I post on in 2019, time permitting. Although this blog is primarily a means of trying to clarify my own thoughts on various topics by way of a public diary of sorts, it is gratifying to see that I got the highest number of views and visitors in 2018. I am most grateful to you, dear reader, for that.

In terms of predictions for the 2019 equity markets, the graph below shows the latest targets from market analysts. Given the volatility in Q4 2018, it is unsurprising that the range of estimates for 2019 is wider than previously. At the beginning of 2018, the consensus EPS estimate for the S&P500 was $146.00 with an average multiple just below 20. Current 2018 estimates of $157.00 result in a multiple of 16 on the year-end S&P500 number. The drop from 20 to 16 illustrates the level of uncertainty in the current market.


For 2019, the consensus EPS estimate is (currently) $171.00, with an average 2019 year-end target of 2,900 implying a 17 multiple. Given that this EPS estimate of 9% growth includes sectors such as energy, with an assumed healthy 10% EPS growth projection despite the oil price drop, it’s probable that this EPS estimate will come down during the upcoming earnings season as firms err on the conservative side in their 2019 projections.
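The implied multiple is just the index target divided by the consensus EPS, as a quick check confirms:

```python
# Implied forward multiple = index year-end target / consensus EPS estimate.
consensus_eps_2019 = 171.00
avg_target_2019 = 2900

multiple = avg_target_2019 / consensus_eps_2019
print(round(multiple, 1))  # ~17.0
```

Running the same division with a lower EPS figure shows the sensitivity: if the consensus comes down toward $165, the 2,900 target would require a multiple closer to 17.6.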

The bears point to building pressures on top-line growth and on record profit margins. The golden boy of the moment, Michael Wilson of Morgan Stanley, calls the current 2019 EPS estimates “lofty”. The bulls point to the newly established (as of last Friday) Powell Put and the likely resolution of the US-China trade spat (because both sides need it). I am still dubious about a significant or timely relaxation of global quantitative tightening and don’t feel particularly inclined to bet money on the Orange One’s negotiating prowess with China. My guess is that the Chinese will give enough for a fudge but not enough to satisfy Trump’s narcissistic need (and political need?) for a visible outright victory. The NAFTA negotiations and his stance on the Wall show that outcomes bear little relation to the rhetoric of the man. These issues will be the story of 2019. Plus Brexit, of course (or, as I suspect, the lack thereof).

Until we get further insight from the Q4 earnings calls, my current base assumption of 4% EPS growth to $164 with a multiple of 15 to 16 implies the S&P500 will be range bound around current levels of 2,400 – 2,600. Hopefully with fewer big moves up or down!

Historically, a non-recessionary bear market lasts 7 months on average, according to Ed Clissold of Ned Davis Research (see their 2019 report here). According to Bank of America, since 1950 the S&P 500 has endured 11 retreats of 12% or more in prolonged bull markets, with these corrections lasting 8 months on average. The exhibit below suggests that such corrections take only 5 months to recover from trough back to peak.


To get a feel for the possible direction of the S&P500 over 2019, I looked at the historical path of the index over the 300 trading days after a peak for 4 non-recessionary and 4 recessionary periods (remember, recessions are usually declared after they have begun), as below.

Note: These graphs have been subsequently updated for the S&P500 close to the 18th January 2019. 

[Chart: S&P500 Q4 2018 drop compared to 4 non-recession drops, in 1962, 1987, 1998 & 2015 (updated)]

 

[Chart: S&P500 Q4 2018 drop compared to 4 recession drops, in 1957, 1974, 1990 & 2000 (updated)]

 

I will leave it to you, dear reader, to decide which path represents the most likely one for 2019. It is interesting that the 1957 track most closely matches the moves to date (Ed: as per the date of the post, obviously not after that date!) but history rarely rhymes exactly. I have no idea whether 2019 will be naughty or nice for equity investors. I can predict with 100% certainty that it will not be dull…

Given that Bridgewater’s Pure Alpha fund has reportedly returned an impressive 14.6% for 2018 net of fees, I will leave the last word to Ray Dalio, who has featured regularly in this blog in 2018, as per his recent article (which I highly recommend):

Typically at this phase of the short-term debt cycle (which is where we are now), the prices of the hottest stocks and other equity-like assets that do well when growth is strong (e.g., private equity and real estate) decline and corporate credit spreads and credit risks start to rise. Typically, that happens in the areas that have had the biggest debt growth, especially if that happens in the largely unregulated shadow banking system (i.e., the non-bank lending system). In the last cycle, it was in the mortgage debt market. In this cycle, it has been in corporate and government debt markets.

When the cracks start to appear, both those problems that one can anticipate and those that one can’t start to appear, so it is especially important to identify them quickly and stay one step ahead of them.

So, it appears to me that we are in the late stages of both the short-term and long-term debt cycles. In other words, a) we are in the late-cycle phase of the short-term debt cycle when profit and earnings growth are still strong and the tightening of credit is causing asset prices to decline, and b) we are in the late-cycle phase of the long-term debt cycle when asset prices and economies are sensitive to tightenings and when central banks don’t have much power to ease credit.

A very happy and healthy 2019 to all.

Clearly wrong

Back at the end of July, in this post on artificial intelligence (AI), I highlighted a few technology stocks related to AI that might be worth looking at in a market downturn. I named Nvidia (NVDA), Google/Alphabet (GOOG) and Baidu (BIDU). Well, I followed through on two of these calls at the end of October and bought into GOOGL and NVDA. I am still just too nervous about investing in a Chinese firm like BIDU given the geopolitical and trade tensions. I am reasonably happy about the GOOGL trade but, after their awful results last night, I quickly got out of NVDA this morning, taking a 17% hit.

Last quarter CEO Jensen Huang said the following:

A lot of gamers at night, they could — while they’re sleeping, they could do some mining. And so, do they buy it for mining or did they buy it for gaming, it’s kind of hard to say. And some miners were unable to buy our OEM products, and so they jumped on to the market to buy it from retail, and that probably happened a great deal as well. And that all happened in the last — the previous several quarters, probably starting from late Q3, Q4, Q1, and very little last quarter, and we’re projecting no crypto-mining going forward.

Last night, they guided their Q4 gaming revenue down sequentially by a massive $600 million, about a third, to clear inventory of their mid-range Pascal GPU chips and warned that the crypto hangover could take a few quarters to clear. CEO Jensen Huang said “we were surprised, obviously. I mean, we’re surprised by it, as anybody else. The crypto hangover lasted longer than we expected.” That was some surprise!!

All the bull analyst calls on NVDA have been shown up badly here. Goldman Sachs, who only recently put the stock on their high conviction list, quickly withdrew it with the comment that they were “clearly wrong”! My back-of-the-envelope calculations suggest that the 2019 and 2020 consensus EPS estimates of $7.00 and $8.00 pre last night’s Q3 results could come down by 15% and 20% respectively. Many analysts are only taking their price targets down to the mid-to-low $200s. With the stock now trading around the $160s, I could see it going lower, possibly into the $120s if this horrible market continues. And that’s why I just admitted defeat and got out.
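That back-of-the-envelope can be sketched explicitly (the 15% and 20% haircuts are my own rough revisions, not analyst figures):

```python
# Rough haircut to the pre-warning consensus EPS estimates.
consensus_eps = {"2019": 7.00, "2020": 8.00}   # pre-Q3-results consensus
my_haircuts = {"2019": 0.15, "2020": 0.20}     # my estimated downward revisions

revised = {year: round(eps * (1 - my_haircuts[year]), 2)
           for year, eps in consensus_eps.items()}
print(revised)  # {'2019': 5.95, '2020': 6.4}
```

Revised EPS in that region makes the mid-$200s price targets look hard to defend on any reasonable multiple.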

All bad trades, like this NVDA one, teach you something. For me, it’s: don’t get caught up in the hype about a strong secular trend like AI, particularly as we are clearly late in the market cycle. NVDA is a remarkable firm and its positioning in non-gaming markets like data centres and auto, as well as the potential of its new Turing gaming chips, means that it could well be a star of the future. But I really don’t understand the semiconductor market, and investing in a market you really don’t understand means you have to be extremely careful. Risk management and the sizing of positions are critical. So, don’t get caught up in hype (here is an outrageous example of AI hype on Micron).

Strangely, I find it a psychological relief to sell a losing position: it means I don’t have to be reminded of the mistake every time I look at my portfolio and I can be more unemotional about ever considering re-entering the stock. I don’t think I will have to consider NVDA again for several quarters!

Lesson learned. Be careful out there.

Heterogeneous Future

It seems like wherever you look these days there are references to the transformational power of artificial intelligence (AI), including cognitive computing and machine learning (ML), on businesses and our future. A previous post on AI and insurance referred to some of the changes ongoing in financial services in relation to core business processes and costs. This industry article highlights how machine learning (specifically multi-objective genetic algorithms) can be used in portfolio optimization by (re)insurers. To further my understanding of the topic, I recently bought a copy of a new book called “Advances in Financial Machine Learning” by Marcos Lopez de Prado, although I suspect I will be out of my depth on the technical elements of the book. Other posts on this blog (such as this one) on the telecom sector refer to the impact intelligent networks are having on telecom business models. One example is the efficiencies CenturyLink (CTL) have shown in their capital expenditure allocation processes from using AI, and this got me thinking about the competitive impact such technology will have on costs across numerous traditional industries.

AI is a complex topic and, in its broadest context, it covers computer systems that can sense their environment, think, in some cases learn, and take applicable actions according to their objectives. To illustrate the complexity of the topic, neural networks are a subset of machine learning techniques. Essentially, they are AI systems based on simulating connected “neural units” loosely modelling the way that neurons interact in the brain. Neural networks need large data sets to be “trained”, and the number of layers of simulated interconnected neurons, which often number in their millions, determines how “deep” the learning can be. Before I embarrass myself by demonstrating how little I know about the technicalities of this topic, it’s safe to say that AI as referred to in this post encompasses the broadest definition, unless a referenced report or article specifically narrows the definition to a subset of the broader definition and is highlighted as such.

According to IDC (here), “interest and awareness of AI is at a fever pitch” and global spending on AI systems is projected to grow from approximately $20 billion this year to $50 billion in 2021. David Schubmehl of IDC stated that “by 2019, 40% of digital transformation initiatives will use AI services and by 2021, 75% of enterprise applications will use AI”. By the end of this year, retail will be the largest spender on AI, followed by banking, discrete manufacturing, and healthcare. Retail AI use cases include automated customer service agents, expert shopping advisors and product recommendations, and merchandising for omni-channel operations. Banking AI use cases include automated threat intelligence and prevention systems, fraud analysis and investigation, and program advisors and recommendation systems. Discrete manufacturing AI use cases include automated preventative maintenance and quality management investigation and recommendation systems. Improved diagnosis and treatment systems are a key focus in healthcare.

In this April 2018 report, McKinsey highlights numerous use cases, concluding that “AI can most often be adopted and create value where other analytics methods and techniques are also creating value”. McKinsey emphasises that “abundant volumes of rich data from images, audio, and video, and large-scale text are the essential starting point and lifeblood of creating value with AI”. McKinsey’s AI focus in the report is particularly in relation to deep learning techniques such as feedforward neural networks, recurrent neural networks, and convolutional neural networks.

Examples highlighted by McKinsey include a European trucking company that reduced fuel costs by 15 percent by using AI to optimize the routing of delivery traffic, an airline that uses AI to predict congestion and weather-related problems in order to avoid costly cancellations, and a travel company that increased ancillary revenue by 10-15% using a recommender system algorithm trained on product and customer data to offer additional services. Other specific areas highlighted by McKinsey are captured in the following paragraph:

“AI’s ability to conduct preventive maintenance and field force scheduling, as well as optimizing production and assembly processes, means that it also has considerable application possibilities and value potential across sectors including advanced electronics and semiconductors, automotive and assembly, chemicals, basic materials, transportation and logistics, oil and gas, pharmaceuticals and medical products, aerospace and defense, agriculture, and consumer packaged goods. In advanced electronics and semiconductors, for example, harnessing data to adjust production and supply-chain operations can minimize spending on utilities and raw materials, cutting overall production costs by 5 to 10 percent in our use cases.”

McKinsey calculated the value potential of AI from neural networks across numerous sectors, as per the graph below, amounting to $3.5 to $5.8 trillion. Value potential is defined as both increased profits for companies and lower prices or higher-quality products and services captured by customers, based on the 2016 global economy. They did not estimate the value potential of creating entirely new product or service categories, such as autonomous driving.


McKinsey identified several challenges and limitations with applying AI techniques, as follows:

  • Making effective use of neural networks requires labelled training data sets, and therefore data quality is a key issue. Ironically, machine learning often requires large amounts of manual effort in “teaching” machines to learn. The experience of Microsoft with their chatbot Tay in 2016 illustrates the shortcomings of learning from bad data!
  • Obtaining data sets that are sufficiently large and comprehensive to be used for comprehensive training is also an issue. According to the authors of the book “Deep Learning”, a supervised deep-learning algorithm will generally achieve acceptable performance with around 5,000 labelled examples per category and will match or exceed human level performance when trained with a data set containing at least 10 million labelled examples.
  • Explaining the results from large and complex models in terms of existing practices and regulatory frameworks is another issue. Product certifications in health care, automotive, chemicals, aerospace industries and regulations in the financial services sector can be an obstacle if processes and outcomes are not clearly explainable and auditable. Some nascent approaches to increasing model transparency, including local-interpretable-model-agnostic explanations (LIME), may help resolve this explanation challenge.
  • AI models continue to have difficulty carrying their experience from one set of circumstances to another, i.e. generalising their learning. That means companies must commit resources to train new models for similar use cases. Transfer learning, in which an AI model is trained to accomplish a certain task and then quickly applies that learning to a similar but distinct activity, is one area of focus in response to this issue.
  • Finally, one area that has been the subject of focus is the risk of bias in data and algorithms. As bias is part of the human condition, it is engrained in our behaviour and historical data. This article in the New Scientist highlights five examples.

In 2016, Accenture estimated that US GDP could be $8.3 trillion higher in 2035 because of AI, doubling growth rates largely due to AI-induced productivity gains. More recently, in February this year, PwC published an extensive report on the macro-economic impact of AI, projecting in its baseline scenario that global GDP will be 14% higher due to AI, with the US and China benefiting the most. Using a Spatial Computable General Equilibrium Model (SCGE) of the global economy, PwC quantifies the total economic impact (as measured by GDP) of AI on the global economy via both productivity gains and consumption-side product enhancements over the period 2017-2030. The impact on the seven regions modelled by 2030 can be seen below.


PwC estimates that the economic impact of AI will be driven by productivity gains from businesses automating processes as well as augmenting their existing labour force with AI technologies (assisted, autonomous and augmented intelligence) and by increased consumer demand resulting from the availability of personalised and/or higher-quality AI-enhanced products and services.

In terms of sectors, PwC estimates that the services industry encompassing health, education, public services and recreation stands to gain the most, with retail and wholesale trade as well as accommodation and food services also expected to see a large boost. Transport and logistics as well as financial and professional services will also see significant but smaller GDP gains by 2030 because of AI, although PwC estimates that the financial services sector gains relatively quickly in the short term. Unsurprisingly, PwC finds that capital-intensive industries have the greatest productivity gains from AI uptake and specifically highlights the Technology, Media and Telecommunications (TMT) sector as having substantial marginal productivity gains from adopting replacement and augmenting AI. The sectoral gains estimated by PwC by 2030 are shown below.


A key element of these new processes is the computing capability needed to process the volumes of data that underlie AI. This recent article in the FT highlighted how the postulated demise of Moore’s law after its 50-year run is impacting the micro-chip sector. Mike Mayberry of Intel commented that “the future is more heterogeneous” when referring to the need for the chip industry to optimise chip design for specific tasks. DARPA, the US defence department’s research arm, has allocated $1.5 billion in research grants for the chips of the future, such as chip architectures that combine both power and flexibility using reprogrammable “software-defined hardware”. This increase in focus from the US is a direct counter to China’s plans to develop its intellectual and technical abilities in semiconductors over the coming years, beyond simple manufacturing.

One of the current leaders in specialised chip design is Nvidia (NVDA), who developed software-led chips for video cards in the gaming sector through their graphics processing unit (GPU). The GPU accelerates applications running on standard central processing units (CPU) by offloading some of the compute-intensive and time-consuming portions of the code whilst the rest of the application still runs on the CPU. The chips developed by NVDA for gamers have proven ideal for handling the huge volumes of data needed to train the deep learning systems used in AI. The exhibit below from NVDA illustrates how they assert that new processors such as the GPU can overcome the slowdown in capability from the density limitation of Moore’s Law.


NVDA, whose stock is up over 400% in the past 24 months, has been a darling of the stock market in recent years and reported strong financial figures for their quarter to end April, as shown below. Their quarterly figures to the end of July are eagerly expected next month. NVDA has been range bound in recent months, with the trade war often cited as a concern given that their products are sold approximately 20%, 20%, and 30% into supply chains in China, other Asia Pacific countries, and Taiwan respectively.


Although seen as the current leader, NVDA is not alone in this space. AMD recently reported strong Q1 2018 results, with revenues up 40%, and has a range of specialised chip designs to compete in the datacentre, auto, and machine learning sectors. AMD’s improved results also reduce risk on their balance sheet, with leverage decreasing from 4.6X to 3.4X and projected to decline further. AMD’s stock is up approximately 70% year to date. AMD’s 7-nanometer product launch planned for later this year also compares favourably against Intel’s delayed release date of 2019 for its 10-nanometer chips.

Intel has historically rolled out a new generation of computer chips every two years, enabling chips that were consistently more powerful than their predecessors even as the cost of that computing power fell. But as Intel has run up against the limits of physics, it has reverted to making upgrades to its aging 14nm process node, which it says performs 70% better than when initially released four years ago. Despite advances by NVDA and AMD in data centres, Intel chips still dominate. In relation to the AI market, Intel is focused on an approach called the field-programmable gate array (FPGA), an integrated circuit designed to be configured by a customer or a designer after manufacturing. This approach of domain-specific architectures is seen as an important trend in the sector for the future.

Another interesting development is Google’s (GOOG) recently reported move to commercially sell, through its cloud-computing service, the big-data chip design that it has been using internally for some time. Known as the tensor processing unit (TPU), the chip was specifically developed by GOOG for neural network machine learning and is an AI accelerator application-specific integrated circuit (ASIC). For example, in Google Photos an individual TPU can process over 100 million photos a day. What GOOG does with this technology will be an interesting development to watch.

Given the need for access to large labelled data sets and significant computing infrastructure, the large internet firms like Google, Facebook (FB), Microsoft (MSFT), Amazon (AMZN) and Chinese firms like Baidu (BIDU) and Tencent (TCEHY) are natural leaders in using and commercialising AI. Other firms highlighted by analysts as riding the AI wave include Xilinx (XLNX), a developer of high-performance FPGAs, Yext (YEXT), who specialise in managing digital information relevant to specific brands, and Twilio (TWLO), a specialist in voice and text communication analysis. YEXT and TWLO are loss making. All of these stocks, possibly excluding the Chinese ones, are trading at lofty valuations. If the current wobbles on the stock market do lead to a significant fall in technology valuations, the stocks on my watchlist will be NVDA, BIDU and GOOG. I’d ignore the one-trick ponies, particularly the loss-making ones! Specifically, Google is one I have been trying to get into for years at a sensible value, and I will watch NVDA’s results next month with keen interest as they have consistently beaten estimates in recent quarters. Now, if only the market would fall from its current heights to allow for a sensible entry point… maybe enabled by algorithmic trading or a massive trend move by the passives!