
Quantum computing: a new industrial revolution

Quantum computing is a massive leap forward for medicine, logistics and finance, but it poses serious questions for cybersecurity and cryptocurrencies.
September 5, 2022
  • Some developers are on the cusp of reaching 'quantum advantage', a crucial breakthrough
  • Early-adopters in the real economy include Goldman Sachs, JP Morgan, BMW, ExxonMobil and Roche
  • 'Y2Q' cybersecurity deadline looms for existing internet encryption standards

Investors who were active around the turn of the millennium will recall two particular tech lessons. One was to be cynical about pre-revenue businesses in the sector. The other relates to the excess panic over the mooted Y2K bug, when it was feared anomalies in calendar programming would crash systems and cause planes to fall from the sky. In the event, partly thanks to plenty of work behind the scenes, nothing of the sort happened.

Should you, therefore, dismiss the hype of a technological arms race in quantum computing? America and China are pumping billions of dollars into the space, with an eye on the prospect of using its problem-solving potential to crack one another’s security encryption – and European nations are following suit. But will this really translate into something tangible and worthwhile for investors?

Consider the fact that corporations like BMW, Goldman Sachs, JP Morgan, Roche, LG Electronics, ExxonMobil and Mitsubishi Chemicals are seriously investigating quantum computing solutions of their own, and doubts may waver. Potential applications in a variety of industries make adoption more likely.

Keen observers will also have noted Alphabet's (US:GOOGL) spin-out of its Sandbox AQ division (the A stands for Artificial Intelligence and the Q for Quantum Computing) in March 2022. Achieving a nine-figure capital raise suggests this project is moving to a new phase of growth that venture capitalists are prepared to fund, even if quarterly-obsessed public markets would baulk at the expense.

“Being an independent company gives us the ability to grow rapidly, innovate and capture a leadership position in the fast-moving quantum ecosystem," says Sandbox AQ chief executive Jack Hidary.

For private investors, the options are growing. While public stock exchanges offer few pure plays on quantum computing, there have been some reverse listings in the US via special purpose acquisition companies (Spacs). Since the final quarter of 2021, Rigetti (US:RGTI) and IonQ (US:IONQ) have gone public in this manner, and D-Wave completed its merger with the Spac DPCM Capital to list as D-Wave Quantum (US:QBTS) earlier this month.

But older companies, which have the advantage of other established revenue lines, stand to benefit, too. Commercially viable solutions would potentially be transformative for IBM (US:IBM), which is at the forefront of quantum computing.

Then there are adopter firms that foresee a competitive advantage in mastering this technology which promises a new, faster approach to overcome computing challenges. Market-sizing the opportunity is difficult and Chirag Dekate, consultant at Gartner Research, likens commercial progress to the first five minutes of a marathon. Although annual spending by potential end-user companies has risen steadily in the last five years, it goes without saying that the $3mn (£2.5mn) average Dekate cites must grow further to indicate quantum is truly a game-changer.

The flipside of these small figures, however, is that it is too early for quantum upside to be priced in. Effectively, quantum is on the table for free where innovative companies are already cheap on a sum-of-the-parts valuation.


Capturing decades of scientific ingenuity

What these businesses are actually doing is, inevitably, complex. Even legendary American physicist Richard Feynman once declared that "nobody understands quantum mechanics". But that hasn't discouraged another generation of brilliant minds from pushing boundaries to try to offer a better explanation of the universe.

One branch of this awesome endeavour is development of supercomputers able to crack problems that are intractable today. As with other applications of quantum mechanics, progress has been decades in the making and breakthroughs are tantalisingly close. Yet the science still strikes the lay person as surreal.

In that context it may be easiest to start by defining what quantum computing is not, says Tony Uttley, COO and president of Quantinuum, a quantum computing business majority owned by US industrial giant Honeywell (US:HON): “It’s not just a faster way to compute, it’s a different way to compute.”

Whereas classical computers perform linear computation with binary digits (bits) as the standard units of information, quantum bits (qubits) solve problems by harnessing the properties of quantum mechanics. Scientists have focused on three of these in particular: superposition, entanglement and interference.

The job of further definition is also best left to experts: “Quantum computing is an entirely new type of computation that instead of relying on the laws of classical physics, relies on the laws of quantum physics”, summarises Dr Olivia Lanes of IBM.

“What it boils down to”, says Lanes, “is the fact you can entangle the computational bits – what we call the qubits – of the processor of the computer in such a way that the informational space grows exponentially instead of linearly, and so there are certain algorithms and certain computations that can be done much more efficiently and much faster on a quantum computer versus a classical computer. In theory.”
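Lanes's exponential-versus-linear point can be made concrete with a few lines of arithmetic. The sketch below is purely illustrative (plain Python, no quantum library assumed): it compares the n numbers needed to describe n classical bits with the 2^n amplitudes needed to describe a general n-qubit state.

```python
# Describing n classical bits takes n numbers (one per bit), but describing
# the general state of n qubits takes 2**n complex amplitudes: the
# "informational space" Lanes describes grows exponentially, not linearly.
for n in range(1, 11):
    print(f"{n:2d} (qu)bits: {n:2d} classical values vs {2 ** n:5d} quantum amplitudes")
```

By 300 qubits, the amplitude count exceeds the number of atoms in the observable universe, which is why "in theory" certain computations become tractable only on quantum hardware.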


Superposition, entanglement and interference

“Superposition”, explains Uttley, is “where the qubit can be both a one and a zero at the same time.”

That is a revolutionary leap from classical bits. “When you think of a bit, you think of something that’s binary, like a light switch,” says Lanes. “It’s 1, it’s 0: you can’t really get anything more. But a qubit is like a 3D informational space.”

“So, you can have north pole, which is 1, the south pole which is 0, and then any point on the surface of the sphere is also a specific state.”

“When you couple spheres together, the informational space where you can perform computation grows exponentially. Whereas when you have one bit plus another bit, plus another bit, that only grows linearly.”

Bringing two qubits together in this way is known as entanglement. “They carry information about each other over both time and space as you do more of the computation”, says Uttley. Remarkably, entanglement persists even over large distances.

Yet quantum states are enigmatic and collapse as soon as they are observed or measured, which means the third property – interference – is crucial to harness problem-solving capabilities. “Interference”, expounds Uttley, “is an incredible property that allows us to take advantage of looking at a problem and looking at a whole bunch of answers, then using the incorrect answers to interfere with each other so that you are only left with correct answers.”
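All three properties can be glimpsed in a toy statevector calculation. The sketch below is purely illustrative (plain numpy, no quantum hardware or SDK assumed): a Hadamard gate puts a qubit into an equal superposition, and applying the gate a second time makes the two computational paths interfere, cancelling the |1> amplitude so the qubit returns to |0> with certainty.

```python
import numpy as np

# Single-qubit statevector: |0> = [1, 0], |1> = [0, 1]
ket0 = np.array([1.0, 0.0])

# Hadamard gate: creates an equal superposition of |0> and |1>
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

superposed = H @ ket0
print("after one H :", np.abs(superposed) ** 2)   # 50/50 chance of 0 or 1

# Applying H again makes the two computational paths interfere:
# the |1> contributions cancel (destructive interference),
# the |0> contributions reinforce (constructive interference).
back = H @ superposed
probabilities = np.abs(back) ** 2
print("after two H :", probabilities)             # certainty of measuring |0>
```

This cancellation of unwanted amplitudes is exactly the mechanism Uttley describes: wrong answers interfere away, leaving the right one.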


Possibilities abound

Together, these qualities place solutions to certain types of problem within the grasp of quantum computers. Quite simply, the approach is exponentially more efficient than flipping a coin multiple times and plugging the results into further calculations, as is the case with classical bits. For any industry where solving probabilistic problems is important, the application of quantum technology could therefore be a revelation. And there are many sectors where that is the case.

Unsurprisingly, one particular part of the financial services industry has high hopes of being part of the quantum vanguard.

Properly estimating parameters of risk has always challenged portfolio managers and investors, not to mention banks, whose inadequate models were rightly pilloried in the wake of the global financial crisis. Prior to 2008, many financial models relied on value at risk (VaR), a measure with the inherent flaw of assuming daily asset returns are normally distributed (ie, when presented in a histogram they plot a perfectly symmetrical bell curve, peaking at the mean value).

While VaR computes an average loss for the worst five, one or even 0.5 per cent of days, it woefully underplays the likelihood and magnitude of the very worst days. Progress on risk modelling has been made since the financial crisis, but far more is possible with quantum computing.

For instance, Monte Carlo simulations (mathematical models that rely on the power of flexing inputs of random numbers) can include exponentially more variables if the probabilities of outcomes are accurately computed with qubits. That means a fuller picture of potential causes and effects can be reflected when estimating risk. It could even help with modelling the impact of climate change on portfolios.
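The shortcoming of the normality assumption is easy to demonstrate with a small classical simulation. The sketch below is purely illustrative (hypothetical parameters, one asset, a toy Monte Carlo): it compares 99 per cent VaR and worst-day losses for normally distributed returns against fat-tailed returns carrying the same day-to-day volatility.

```python
import numpy as np

rng = np.random.default_rng(42)
n_days = 100_000

# Hypothetical daily returns: a normal model vs a fat-tailed Student-t
# model, both scaled to the same volatility (~1 per cent a day).
normal_returns = rng.normal(loc=0.0, scale=0.01, size=n_days)
t_raw = rng.standard_t(df=3, size=n_days)
t_returns = t_raw * 0.01 / t_raw.std()

def var_99(returns):
    """99% value at risk: the loss exceeded on the worst 1% of days."""
    return -np.percentile(returns, 1)

print(f"99% VaR, normal model    : {var_99(normal_returns):.4f}")
print(f"99% VaR, fat-tailed model: {var_99(t_returns):.4f}")
# The very worst simulated day is far more extreme under fat tails,
# which is what normal-distribution VaR "woefully underplays":
print(f"worst day, normal        : {normal_returns.min():.4f}")
print(f"worst day, fat tails     : {t_returns.min():.4f}")
```

The hope for quantum Monte Carlo is not a different answer but exponentially more variables and scenarios per run, allowing richer, less idealised distributions to be used.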

Other industries directly in the spotlight of the clean energy transition are finding uses for quantum computing, too. Oil & gas giant ExxonMobil (US:XOM) is one of the firms partnering with IBM, specifically using quantum algorithms to tackle the challenges of shipping liquefied natural gas (LNG), a pertinent problem in light of the energy crisis and Russia’s war on Ukraine.

Handling the logistics of delivery requires accounting for the position of every ship on each day of the year and factoring in millions of discrete decisions. Adding variables such as weather and demand fluctuations increases that number into the billions or even trillions. Quantum computing enables processing and understanding these minute, complex and interconnected variables to potentially make for a smoother and more efficient supply chain.

Car maker BMW (DE:BMW) is using the testing capabilities afforded by quantum computers to help with its complex supply chains and engineering challenges. Pharmaceutical companies such as Roche (CH:ROG) are making use of the technology to assess the impact of millions of tiny variables on experiments. Battery and home appliance manufacturer LG (KR:066570) is doing likewise to test and develop its products.

The overlap between quantum computing and artificial intelligence is particularly intriguing: “Perhaps most exciting of all is the potential for researchers to derive new inspiration from the fundamentally different ways in which quantum computation operates, and apply these insights to completely classical AI which can be utilised in the near term,” says Sandbox AQ’s Hidary.

For cybersecurity, which would be turned on its head by successful quantum technology, Hidary thinks machine learning – a process already employed by traditional computers – will be critical to enabling systems to “adapt to emerging threats and implement, in real time, the best, most appropriate cybersecurity algorithms.”

Quantum computers’ ability to test thousands of permutations, coupled with machine learning’s ability to identify patterns, can also be applied to pharmaceutical testing. Ultimately, the improved prediction of negative or dangerous outcomes can safely speed up drug development and reduce costs.


Supremacy or advantage?

Quantum ‘advantage’ and ‘supremacy’ have been used interchangeably, but there are differences. Quantum supremacy refers to developing computers powerful enough to complete, in feasible timeframes, tasks beyond the reach of classical computers – though so far that has meant solving niche mathematical problems for the sole purpose of demonstrating computational power and speed.

Various claims of quantum supremacy have been made down the years, typically emphasising the number of qubits effectively deployed. Back in 2019 Alphabet announced it had achieved quantum supremacy with a 54-qubit machine. In October 2021 the state-backed University of Science and Technology of China claimed to be top dog with 60 qubits. This year the crown sits with IBM, which has a 127-qubit capability.

IBM’s roadmap has an ornithological theme, with its 127-qubit staging post code-named “Eagle” and its next 433-qubit objective called “Osprey”. Then comes “Condor”, at 1,121 qubits and scheduled for 2023-24, which Lanes believes will be a true watershed.

“Condor is the most important threshold processor. That’s the point we achieve quantum advantage – the point at which quantum computers will be able to do something useful that classical computers cannot do,” she says.

Quantum advantage, therefore, can be interpreted as being when problem-solving power can be applied to real world issues, which makes it much more interesting for investors.


Not all qubits are created equal

The manner in which those goals are reached is also up for debate. With a plethora of uses, specific end-user requirements differ – and there are certain schools of thought on which approach to quantum computing is most appropriate depending on problems being tackled. Methods used can rely on a variety of technologies including superconductivity, semiconducting, photonics (the science of light waves), and 'trapped ion' technology – a method where qubits are stored in the stable electronic states of ions (electrically charged particles) that have been confined and suspended using electromagnetic fields.

“IBM focusses on making processors out of superconductors and what we call Transmon qubits”, says Lanes. Infrastructure in nanotechnology (the science of manipulating matter at atomic scale) to make these circuits already exists and, unlike semiconductors, there aren’t supply concerns: many leading producers of superconductors are based in the United States or Japan.

“Transmons have been shown to be really adept in terms of coupling with one another”, says Lanes, adding that the technology enables fast control.

For superconducting technologies, the speed in the computation step between two qubits is very fast (a matter of nanoseconds). But Uttley points out that this method is less adept at keeping information stable than trapped ion processes. The latter, on the other hand, are slower. “In all of these technologies, we’re making trade-offs.”

Deciding what approach to use might depend on how many qubits need to talk to one another. For instance, Uttley says certain types of chemistry experiment that require multiple qubits to arbitrarily communicate might be better suited to trapped ion solutions, where qubit connectivity can be maintained for around 12 minutes. For problems where qubits need to speak mainly with their nearest 'neighbour', superconducting may be better.

Achieving optimal speed and maintaining the stability of quantum states (what is known as their coherence – to ‘decohere’ is to fall back into a ground state) for the longest time are ongoing challenges. They are as important as the number of qubits. As the claims of quantum supremacy above indicate, qubit count has ostensibly been treated as the measure of sophistication, but this isn’t strictly correct.

“You can have a million bad qubits and that still won’t do anything useful. The other important things are the speed, the quality and then the scale as well," says Lanes.

Quality is measured by something called quantum volume, which captures the depth of circuit a processor can successfully run. IBM has also introduced a speed metric called CLOPS (circuit layer operations per second).

“All of these tied together are really important and we have to have all three of them [speed, quality and scale] really strong in order to achieve quantum advantage," says Lanes.

Error correction is worth emphasising as it is essential to preserving quantum states and generating the highest quality and most useful qubits. “All qubits have a T1 lifetime, the amount of time before they decohere back into the ground state," Lanes adds.

In most cases, and certainly those where superconducting is the underlying technology, that T1 time is of the order of microseconds. “[You] need to be able to perform your algorithm and your computational time faster than the T1 time will decohere. You can’t perform complicated algorithms in that amount of time, so you have to introduce error correction,” says Lanes. IBM intends to begin incorporating this ability with its Condor processor.

“Once error correction is in place and it’s working,” continues Lanes, “it doesn’t necessarily matter how long that information can live, because when an error occurs, we can find it and account for it and fix it.”
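The find-and-fix principle Lanes describes has a simple classical ancestor, the repetition code: store one logical bit as three physical copies and take a majority vote. Quantum error correction is far subtler – qubits cannot simply be copied – but the sketch below (purely illustrative, with a hypothetical 10 per cent error rate) shows how redundancy plus correction drives down errors.

```python
import random

def encode(bit):
    """Repetition code: one logical bit stored as three physical copies."""
    return [bit, bit, bit]

def noisy_channel(codeword, flip_prob=0.1):
    """Each physical bit is flipped independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in codeword]

def decode(codeword):
    """Majority vote: a single flipped copy is detected and corrected."""
    return int(sum(codeword) >= 2)

random.seed(0)
trials = 10_000
# Unprotected: the bit is lost whenever the channel flips it.
raw_errors = sum(random.random() < 0.1 for _ in range(trials))
# Protected: the logical bit is lost only if two or more copies flip.
coded_errors = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
print(f"unprotected error rate : {raw_errors / trials:.3f}")
print(f"protected error rate   : {coded_errors / trials:.3f}")
```

With a 10 per cent physical error rate, the coded error rate falls to roughly 2.8 per cent (the chance of two or more simultaneous flips) – the same leverage real quantum codes seek, at the cost of many physical qubits per logical qubit.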

Another issue is something known as leakage: when the qubit gets too energetically excited and “leaks” into higher energy states, making it harder to error correct. “We need to make sure that our leakage rates and correlated error rates are low as we scale up the processors," adds Lanes.


Quantum circuits

The usefulness of quantum applications can also depend on how elegantly the qubits at an operation’s disposal are mapped to a given problem.

“As you think about what you do with every qubit at what time position, it [manifests] as something we call a circuit," says Uttley.

“Think of it as a sheet of music: played on multiple notes over multiple areas of time, as you keep these qubits coherent with quantum information. Each algorithm is designed fit-for-purpose to solve a particular type of problem: it could be a chemistry problem, it could be a machine learning problem, it could be a search type of problem.”

One of Uttley’s colleagues, Dr Chad Edwards, expands on the intricacies of setting up quantum circuits, citing three key stages. The first is preparation of the quantum states and initialisation; the second is operations, comprising whatever quantum logic gate (the basic circuit model) is best suited to the problem; the third is the measurement phase – the gathering of useful outputs.


Over-hyped or unacceptable threat?

There is more to supremacy, then, than just the number of qubits. The word ‘supremacy’ itself is controversial in scientific circles given connotations of ethno-nationalism, and the US-China rivalry provides an ugly backdrop to what is otherwise an exciting phase of innovation.

Scientists themselves, it must be stressed, focus on collaboration and what can be achieved for humankind. Still, the possibility of the Chinese government controlling the pre-eminent technologies in the space is unconscionable to the west – with good reason, given quantum computing's potential implications for cybersecurity.

Obstacles exist to quantum computers cracking the RSA encryption (an acronym of Rivest-Shamir-Adleman, its developers’ surnames) on which the internet is built. But these aren’t insurmountable, and a hostile state actor having that capability is something no government can be complacent about.

The mathematics has existed for many years: back in 1994 Peter Shor (now a professor of mathematics at MIT) published his algorithm capable of cracking RSA-encryption. The world is still awaiting the computers that can employ Shor’s algorithm.
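Why a working Shor machine would matter can be seen in a toy RSA example. The sketch below uses deliberately tiny textbook primes, purely for illustration: it builds a key pair, encrypts a message, then recovers the private key by brute-force factoring of the public modulus – the very step that is infeasible classically at real key sizes but that Shor's algorithm makes tractable on a sufficiently large quantum computer.

```python
# Toy RSA with deliberately tiny primes - for illustration only.
p, q = 61, 53                     # secret primes (real RSA uses ~1024-bit primes)
n = p * q                         # public modulus: 3233
e = 17                            # public exponent
phi = (p - 1) * (q - 1)           # Euler's totient: computing it needs p and q
d = pow(e, -1, phi)               # private exponent (Python 3.8+ modular inverse)

message = 65
ciphertext = pow(message, e, n)   # anyone can encrypt with the public pair (n, e)
assert pow(ciphertext, d, n) == message  # only the private key decrypts

# The attack: factor the public modulus. Trivial here, infeasible classically
# at real key sizes - and precisely the step Shor's algorithm accelerates.
def factor(n):
    for candidate in range(2, int(n ** 0.5) + 1):
        if n % candidate == 0:
            return candidate, n // candidate

p_found, q_found = factor(n)
d_cracked = pow(e, -1, (p_found - 1) * (q_found - 1))
assert pow(ciphertext, d_cracked, n) == message   # private key recovered
print("private exponent recovered:", d_cracked == d)
```

At real key sizes the modulus runs to hundreds of digits, putting the brute-force loop beyond any classical machine – which is exactly the safety margin Shor's algorithm would erase.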

Harking back to the Y2K bug, Andersen Cheng, chief executive of quantum protection company Post-Quantum, discusses the notion of a 'Y2Q' for this watershed – the date when computers will be able to break the encrypted security of virtually any device or system. The crucial distinction, Cheng says, is that whereas the millennium bug had a definite deadline but a wholly unknown impact (which turned out to be minimal), the impact of Y2Q will be seismic – we just can’t pinpoint the exact date it will strike.

The world isn’t completely unprepared. At the start of last month, the US National Institute of Standards and Technology (NIST) unveiled four algorithms it believed were candidates to be expanded on as a basis for maintaining digital security when RSA is made obsolete.

To underline the challenges, one earlier NIST candidate had been cracked by the end of July. But it deployed a form of elliptic curve cryptography and was quite different from the algorithms to be scaled into post-quantum secure systems. Most of the July standards are based on lattice cryptography, which relies on the difficulty of lattice problems (finding short or close vectors in a high-dimensional grid of points, a task believed to be hard even for quantum computers).

Nonetheless, entrepreneurs are betting that quantum security will be big business, and Cheng cites Gartner research estimating the market will be worth $9.3bn by 2029. That figure, Cheng emphasises, relates solely to chips and hardware security and doesn’t account for the need to protect the public key encryption that safeguards the myriad platforms we use for shopping, streaming, meeting and messaging.

Asserting that enterprise software globally will need to be updated, Cheng is bullish on the prospects of quantum protection – an application relating to cybersecurity alone – eventually being more than a trillion-dollar industry.

In the meantime, governments and companies must anticipate threats. Cheng coined the phrase “harvest now, decrypt later”, arguing that although the processing power to implement Shor’s algorithm may be some years off, malicious actors need merely to gather the code they wish to crack now, then bide their time until they do have the quantum capability to unlock a trove of sensitive information.

NIST's efforts are a crucial bulwark. While it was disheartening to lose a candidate that diversified the mathematical options for post-quantum protection, the prominent lattice-based encryption isn’t the only game in town and there will be future candidates. Other encryption options include those that are code-based (introducing random errors that make it hard to recover the true code structure) and hash-based (relying on the one-way, ie essentially irreversible, properties of 'hash functions' that map arbitrary data to fixed-length digests).
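The 'one-way' property underpinning the hash-based options is easy to demonstrate with a standard library (SHA-256 here, as an illustrative stand-in for the hash functions such schemes build on): computing a digest is instant, but reversing one offers no better strategy than guessing inputs.

```python
import hashlib

# Forward direction: instant and deterministic.
digest = hashlib.sha256(b"transfer 100 BTC to Alice").hexdigest()
print(digest)

# A one-character change produces an unrelated digest (the avalanche effect),
# so the output reveals essentially nothing about the input.
digest2 = hashlib.sha256(b"transfer 900 BTC to Alice").hexdigest()
print(digest2)

# Reverse direction: with 2**256 possible digests, brute-force guessing is
# hopeless - and, unlike RSA, this asymmetry is not broken by Shor's algorithm.
```

It is this asymmetry, rather than factoring or discrete logarithms, that hash-based signature schemes lean on for their post-quantum security.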

In any case, organisations shouldn’t sit on their hands, and don’t necessarily have to replace all their IT infrastructure in one fell swoop. Cheng advocates a belt and braces approach, wrapping a layer of post-quantum protection, such as protection based on successful NIST candidates, around old RSA-encrypted systems in order to get quantum ready in the shortest possible time.


No coin Schadenfreude...

It is not just real-world systems at risk. Traditional investors irked by the crypto craze will have found it hard holding back an “I told you so” as digital assets sold off after November 2021, but market volatility is arguably a lesser threat to blockchain technologies than quantum computers are.

“Today’s blockchain technology is built around existing RSA and ECC encryption algorithms, which are extremely vulnerable to quantum computers once they scale up," explains Hidary. “Simply put, an error-corrected quantum computer can reverse-engineer a user’s private key – known only to them – based on information stored in the public key that becomes visible the first time it is used – as is the case when someone buys or sells bitcoin.

“Once the private key is cracked, an adversary can forge digital signatures and authorize transactions – such as cleaning out someone’s bitcoin wallet.  This is why we all must update blockchain security with quantum-safe technologies.”

Andersen Cheng points out that between 10 and 20 per cent of bitcoins were mined using the earliest protocols, which would likely be vulnerable. The modus operandi of the web3 and blockchain industries is to “correct as we go”, which does leave questions about legacy code on networks. Thus, even if future forks (ie technical divergences) in chains are more robust, some of the older code could be prone to the harvest now, decrypt later mindset.

In fairness, quantum computers are yet to reach the level of power and stability to be utilised in this way: “we’re 10 plus years away... probably more... from anybody having a quantum computer that is large enough, error corrected, and universal and fault tolerant in order to be able to do that sort of processing," says Lanes.

Like all technology reliant on RSA-encryption, blockchain will need to adapt, but it will not necessarily be made obsolete by quantum computing.

“The two technologies are not competitive at all”, says Hidary. “One excels at rapidly processing extraordinarily large quantum equations and the other is a distributed ledger. The only nexus of the two relates to cyber security.”

While that is an enormous caveat, the threat of quantum computers killing blockchain does seem to be remote for now. Both are exciting technologies and patient investors may be very well rewarded. Of course, there are no guarantees, but a crucial difference with quantum computing is that exposure can be had buying shares in established listed companies. To paraphrase the old slogan, you probably won’t get fired for buying IBM.