How to play the artificial intelligence boom

Tech companies are investing heavily, but it is becoming clear who will win out
April 27, 2023

A general-purpose technology is one that impacts the whole economy. The invention of the steam engine, for example, changed what people consumed, how they travelled and where they lived. Centuries later, after a variety of modern technologies have had significant impacts of their own, today it is artificial intelligence (AI) for which the biggest promises are being made.

Strange as it may sound, there are potential parallels, in terms of both efficiency gains and investment manias, between steam power and AI. In 1698, Thomas Savery patented the original steam pump, but it wasn’t until James Watt patented the improved Boulton-Watt engine in 1769 that steam started to transform the economy.

Factories had previously been powered by horses, wind or water, which imposed physical restrictions on where they could be located. Steam power removed those restrictions, allowing factories to cluster in towns and cities. That clustering, combined with the invention of the train, allowed the efficient transfer of materials, goods and ideas between hubs. A flywheel effect was set in motion. Between 1800 and 1900, UK gross domestic product (GDP) per capita rose 111 per cent, from £2,331 to £4,930 (figures adjusted to 2013 prices).

With the advent of the first commercial railway in 1830, public interest soared. Private firms were launched promising to build tracks all over the country, adverts for railways were plastered over newspapers and, in 1845, parliament authorised 3,000 miles of railway, about as much as in the 15 previous years combined. The British railway share price index more than doubled between 1844 and 1846, much as AI-related shares are now taking off.

But like any investment bubble, it ultimately burst. Supply had outstripped demand. Investors, including Charles Darwin and Charlotte Brontë, were left holding the bag. “The original price of shares in this railway was £50. At one time rose to £120; for some years gave a dividend of 10 per cent; they are now down to £20,” wrote Brontë, the author of Jane Eyre.

AI investment mania

At the end of last year, the generative AI chatbot ChatGPT brought an exciting new technology into the public eye. Within months, the service, owned by Microsoft-backed OpenAI, had more than 100mn users, making it the fastest-growing consumer software product ever made.

As with the steam engine, there is an intuitive sense that AI will change the world. But it is not yet obvious exactly how. Text-to-image generators are being used by advertisers, video game designers and film production teams. Chatbots are being integrated into search engines, while almost all enterprise software companies are trying to integrate AI into their product in some shape or form. This is likely to be just the start.

There is now a vast number of companies promising to use AI to revolutionise their businesses. At the end of January, Buzzfeed’s (US:BZFD) share price jumped over 200 per cent after it announced it would be using AI to write articles. Those gains proved particularly short-lived, disappearing within a week. Last week, it closed its news division and laid off 15 per cent of its workforce.

A lot of money is already chasing the promise of AI, but it is not always first movers that end up flourishing. That said, big companies tend to have an advantage.

In the past two years, big tech companies have invested heavily in computing, and in GPUs in particular. Last year, Amazon (US:AMZN) spent $64bn on capital expenditure (capex), up 59 per cent from 2020. Over the same period, Meta’s (US:META) capex was up 108 per cent, Microsoft's (US:MSFT) 55 per cent and Alphabet's (US:GOOGL) 41 per cent, while Oracle (US:ORCL) boosted its capex by 188 per cent. Given the diverse nature of these companies, computing will only account for a portion of this spending. But there is no doubt the tech giants are dominating the market: consultancy TrendForce estimates that in 2022, 66 per cent of all AI chips were acquired by Microsoft, Google, Meta or Amazon, giving these companies an effective oligopoly on computing power. By contrast, Apple (US:AAPL) has shown no sign of joining the AI race.

There are other ways in which big tech is asserting its dominance beyond expenditure. Most AI companies don't have the cash to build their own servers, meaning they have to rent computing power from the tech giants to run their models. As part of Microsoft's $10bn (£8.05bn) investment in OpenAI, announced earlier this year, Microsoft gave the company access to its supercomputer. Google Cloud has a similar deal with Anthropic, and Amazon has just announced a partnership with Stability AI.

Picks and shovels

But the company that has benefited most from the AI investment hype so far is graphics processing unit (GPU) designer Nvidia (US:NVDA). Its share price is up 89 per cent year-to-date, meaning it is currently the seventh most valuable company in the world.

GPUs allow faster computing because they perform many calculations simultaneously rather than one after another, an approach known as parallel computing and one well suited to the matrix calculations at the heart of machine learning. To process this much information, GPUs need hundreds of 'cores', as opposed to the handful that sit on central processing units (CPUs). Nvidia’s GPUs were initially used for gaming, but they are equally suited to running machine learning algorithms. If supercomputers are the railways of this technological revolution, then GPUs are the steam engine.
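For readers curious why this workload parallelises so well, a toy Python sketch illustrates the point (an illustration only, a world away from real GPU kernels): each row of a matrix product can be computed independently of every other row, so the work can be spread across many cores at once. A thread pool stands in here for the thousands of tiny cores on a GPU.

```python
from concurrent.futures import ThreadPoolExecutor

def matmul_row(row, b):
    # One output row: dot-product `row` against every column of b
    return [sum(r * b[k][j] for k, r in enumerate(row))
            for j in range(len(b[0]))]

def matmul_parallel(a, b, workers=4):
    # Each output row is independent, so rows can be computed in parallel
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda row: matmul_row(row, b), a))

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
print(matmul_parallel(a, b))  # [[19, 22], [43, 50]]
```

The same independence is what lets a GPU, or a cluster of 10,000 of them, attack one enormous matrix multiplication all at once.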

In simple terms, the more GPUs that can be strung together, the more computing power there is. The largest computers have around 10,000 GPUs, and each of the newest Nvidia H100 GPUs costs about $25,000, meaning a single such computer could cost as much as $250mn.
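The arithmetic is simple enough to sketch, using the round figures above (a back-of-envelope check, not anyone's actual pricing):

```python
# Rough cluster cost: ~10,000 GPUs at a reported $25,000 per H100-class card
gpus_per_computer = 10_000
price_per_gpu = 25_000  # dollars

total_cost = gpus_per_computer * price_per_gpu
print(f"${total_cost / 1_000_000:,.0f}mn")  # $250mn
```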

The huge investment in computing power means Nvidia’s largest revenue segment is now data centres. Last year, the company made $27bn in revenue, 56 per cent of which came from data centres, 34 per cent from gaming and 6 per cent from visualisation. Nvidia's strength is that it doesn’t just design GPUs. In 2019, it acquired Mellanox, which makes the InfiniBand cables used to connect servers to one another, for $6.9bn. On top of this, Nvidia created Cuda, the programming platform needed to write software that runs on its GPUs. Once coders learn it, they are effectively locked into the Nvidia ecosystem.

Nvidia’s value has shot up to the extent that its forward price/earnings ratio stands at 60 times. Most analysts expect the run to continue, even at these levels: 19 of the 31 analysts covering the company still rate it a 'buy'.

Here, too, big tech rivals are attempting to muscle in. Nvidia has had a big head start in parallel computing, but the rising costs of its GPUs are pushing the most well-known tech players to design their own chips. Alphabet already uses tensor processing units (TPUs) in its own computers. At the start of April, Google released a research paper saying its supercomputer, which strings together 4,000 of its TPUs, is 1.7 times faster than an equivalent machine running on Nvidia A100 GPUs, and 1.9 times more efficient.

These results should be taken with a pinch of salt given the speed of development in the industry. Nvidia says its own new H100 GPU is several times faster than the A100 against which Google was comparing its own offering. 

Microsoft is also developing its own parallel computing chip code-named “Athena”, according to tech website The Information. Amazon, another of Nvidia’s customers, already uses its own Trainium chips as well as Nvidia’s GPUs to train its machine-learning models.

Search engine battle

Another example of this increasingly heated competition is search. Google has dominated search for a long time, but the AI age could shift the balance. Generative AI may be able to answer queries more quickly and accurately than traditional search engines, many of which are arguably less effective than they once were at producing the desired results. In April, The New York Times reported that Samsung was considering replacing Google with Microsoft Bing as the default search engine on its mobile devices. Microsoft stole a march on Google when it swiftly integrated ChatGPT’s underlying model into Bing in the hope of taking market share.

Microsoft chief executive Satya Nadella is newly optimistic in his attempts to dethrone Google. “From now on, the gross margin of search is going to drop forever,” he said in an interview with the Financial Times in February. “For us it is incremental. For Google it is not, they have to defend it all”.

Google has long been at the forefront of AI research. Its researchers pioneered the machine learning techniques behind Google voice search and Apple's Siri, and its 2017 paper ‘Attention Is All You Need’ introduced the transformer architecture, which requires less computational power to train, is a much better fit for modern machine learning hardware and underpins all of today's generative AI models. Google was the pioneer in this space, but recent events have gone against it. Two years ago, Alphabet unveiled its own conversational model, LaMDA, the basis for the Bard chatbot that will soon be integrated into its search engine. It should have had first-mover advantage, but it was caught out by the timing of ChatGPT’s release.

In March, Alphabet launched Bard in the UK and US. Amid suggestions that the service was less advanced, it did not gain the kind of traction ChatGPT enjoyed at its launch. The perception is that Alphabet is lagging Microsoft, and not just when it comes to AI. The two companies trade on 2025 price/earnings (PE) ratios of 14 and 23, respectively. Alphabet is being dragged down by the advertising market, which accounts for 70 per cent of its revenue. However, given its market-leading position in AI computing and software, its share price looks affordable: the company currently trades at a 24 per cent discount to Jefferies’ $130 target price.
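That discount implies a share price of just under $99, assuming the discount is measured against the target price (a definitional assumption for this quick check):

```python
# Back out the implied share price from the stated discount to target
target_price = 130.0  # Jefferies target, per the figures above
discount = 0.24       # stated discount to that target

implied_price = target_price * (1 - discount)
print(f"${implied_price:.2f}")  # $98.80
```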

Enterprise customers

So far, Amazon has stayed out of the spotlight. However, as owner of the largest cloud computing business in the world, it will play a big part in AI's future. Rather than going after consumers in the way Google and Microsoft’s search engines are, Amazon is aiming to attract enterprise customers. It has signed partnerships with Stability AI and Anthropic that will give its Amazon Web Services customers access to their products.

Stability AI is a London-based company whose text-to-image generator reduces the cost of art generation and has uses in industries such as advertising and video game design. Anthropic makes a text generator, with one product specifically focused on the legal sector. There are other big tech connections, too: Anthropic was founded by former employees of OpenAI, and Google recently acquired a 10 per cent stake in the business for $300mn.

While consumer products such as ChatGPT have caught the headlines, a lot of the value of AI is anticipated to be generated by integrating it into enterprise software. If AI is to be a productivity-enhancing product in the way that its advocates hope, it is this area in which the benefits will emerge.

Other tech giants have their own plans. Microsoft has announced it will enhance Office 365 with generative AI tools built into Excel, Word and PowerPoint, while Alphabet will do something similar with Google Workspace. Salesforce (US:CRM) has launched Einstein, a service that automates marketing copy and images, as well as generating sales leads and writing emails.

One rival is taking a different approach. Meta is the only big tech company to have made its large language model open source, publishing LLaMA (Large Language Model Meta AI). In a blog post, Meta said it wanted to “enable others in the research community who don’t have access to large amounts of infrastructure to study these models, further democratizing access in this important, fast-changing field”.

The huge costs associated with developing these models have concentrated the most advanced technology in the hands of a few companies. By publishing LLaMA, which uses fewer parameters than the model behind ChatGPT, Meta is giving other companies a better chance of competing with the likes of Google's and Microsoft’s chatbots.

Meta isn’t concerned about increased chatbot competition because it doesn’t compete in the search engine market. Instead, it is focusing on its advertising customers, planning to use AI to boost user engagement and increase returns for advertisers. The company has more than 3bn monthly active users, giving it a huge data set to train on. “All of our AI capacity is going towards ads, [social media] feeds and [Instagram] reels and where we have been able to deploy GPU clusters at scale,” said chief financial officer Susan Li last month.

There is a belief that Meta’s social media products are becoming less popular, but the most recent data does not bear this out. The ratio of daily active users to monthly active users has risen from 65.8 per cent in 2021 to 67.5 per cent last quarter. “Meta’s mega investments in AI hardware and software are improving content and ads serves on the platform, increasing engagement and revenue,” says Jefferies analyst Brent Thill.

This isn’t coming cheap, though. Meta has acknowledged it is behind the pack in terms of GPU investment and is spending to catch up. Capex as a percentage of revenue rose to 27 per cent in 2022, up from 16 per cent in 2021; much of this was focused on metaverse spending, but GPUs are required here too, and the company is now refocusing on AI along with the rest of the industry. Li acknowledges GPU investment will cost “a lot” of money but Meta has no plans to slow down. “The majority of our server spend in 2023 is going towards GPUs and the new datacenter architecture is primarily in service of deploying those GPUs,” she said on a recent call with analysts.

In short, the big tech companies are dominant in the space because they are the only businesses with enough free cash flow to afford the GPUs needed. For others to compete will require billions of dollars of capital investment. On a valuation basis, Meta and Alphabet are currently the cheapest of the four players mentioned above (see table below) – in part because of their large exposure to the struggling advertising market. But their capital expenditure puts a significant moat around their businesses when it comes to AI. 

Company   | Market cap ($bn) | 2022 sales ($bn) | Sales 5-yr CAGR (%) | PE NTM | Price/sales | Capex 5-yr CAGR (%) | Operating margin (%) | Operating cash flow ($bn) | ROCE (%)
Amazon    | 1,064 | 514 | 24.1 | 57.2 | 1.7 | 39.7 | 2.6  | 46.7 | -1.9
Alphabet  | 1,352 | 281 | 21.0 | 19.4 | 4.4 | 19.0 | 25.9 | 91.5 | 23.6
Meta      | 552   | 117 | 21.8 | 19.7 | 2.8 | 36.1 | 28.8 | 50.5 | 18.5
Microsoft | 2,130 | 198 | 16.0 | 27.3 | 9.8 | 24.1 | 42.1 | 89.0 | 47.2

GPU bottleneck  

Language models improve as they increase in size. GPT-4, the latest system powering ChatGPT, was trained on tens of thousands of GPUs, and the forthcoming GPT-5 is reportedly being trained on 25,000 Nvidia GPUs. TrendForce estimates the GPU market will grow at a compound annual growth rate (CAGR) of 10.8 per cent between 2022 and 2026 as companies scale up capacity to meet demand.
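Compounded over four years, that growth rate implies a market roughly 50 per cent larger by 2026. A quick sketch, indexing the 2022 market to 100 since no absolute figure is given:

```python
# Compound TrendForce's estimated 10.8% annual growth, 2022-2026.
# The 2022 market is indexed to 100 purely for illustration.
cagr = 0.108
index_2022 = 100.0

for year in range(2022, 2027):
    level = index_2022 * (1 + cagr) ** (year - 2022)
    print(year, round(level, 1))  # 2026 comes out at roughly 150.7
```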

This hardware shortage could put a limit on the number of models that can be trained. “If GPU performance trends continue until 2030 and world GPU production doesn’t increase by a lot, the maximum amount of 'compute' in an AI model can’t grow more than three orders of magnitude by the end of the decade,” Sarah Constantin, director of corporate development at chip manufacturer Nanotronics, wrote in a blog post this month.

The shortage of supply enables Nvidia to charge a high price for its GPUs. Analysts are expecting its operating margin to rise to 41 per cent next year, compared with a five-year average of 29 per cent, and its market capitalisation of $689bn now exceeds that of Meta. It’s these high prices that have pushed Amazon, Microsoft and Alphabet to all design their own parallel computing chips.

Whether any of these chips can compete with Nvidia's remains to be seen. However, as more companies design their own hardware, the price of parallel computing chips, and therefore Nvidia's margins, are likely to drop back in future.

The main beneficiaries of this competition will be the chip fabricators and equipment manufacturers. TSMC (TW:2330) makes 90 per cent of the world’s advanced logic chips and counts Amazon, Microsoft, Alphabet and Nvidia among its customers.

Analysts are expecting TSMC’s earnings per share to drop this year because of the fall in demand for consumer electronics. However, FactSet broker consensus expects it to rise to $1.48 by 2025, from a current level of $1.29, giving a three-year forward PE ratio of just 11. “TSMC generates significant free cash flow and trades on a remarkably low valuation for a business of this quality,” says Baillie Gifford emerging market specialist Andrew Keiller.

The AI market is already coalescing into the hands of a few big tech companies. The huge capital costs involved make it difficult to compete, especially with financing costs now rising. Meta, Amazon, Alphabet and Microsoft have been able to recycle their billions of dollars of free cash flow into computing investment over recent years.

Arguably the biggest potential obstacle is regulation. Noises about clipping big tech's wings have grown louder in recent years, albeit with little impact on bottom lines so far. That noise will only increase as AI becomes more prominent, both in the US and in the UK. This month, Ofcom called for a probe into the UK's cloud computing market, saying it is “particularly concerned” by Amazon and Microsoft, which together hold between 60 and 70 per cent of the market. Even in this digital age, it is tangible assets, namely the computers and the GPUs that power the cloud, that have given these companies an almost unassailable position.

In the UK, the railway bubble burst in the 1840s because companies committed to building tracks that were never needed. In the US, it was a different story: a few dominant companies emerged from the process. By the 1860s, Cornelius Vanderbilt’s railroad companies owned the majority of US railways. They were eventually broken up by the government at the start of the next century, but not before Vanderbilt had amassed a fortune worth over $150bn in today’s money.