Why Every Bank Must Prepare for Quantum Computers Now
Quantum computers are now a fact. In their fully realised form, they threaten the informational advantages enjoyed by governments as well as industries, not least by breaking all existing data encryption techniques. If the engineering obstacles to high-performing quantum computers can be overcome, a new range of geopolitical and even existential threats will emerge, but so will solutions to problems previously thought insoluble. The commercial opportunities are legion. In financial services, quantum computing threatens the informational advantage of the arbitrageur but is already transforming portfolio optimisation. Derivative pricing and credit scoring are next.
Quantum computers are already being designed, built and operated to standards that match the power and performance of classical computers, and their power is doubling every 12 months.
The processing power of quantum computers grows exponentially, making it likely that their performance will surpass that of classical computers within the next two to three years.
The subsequent impact of quantum computing on the financial services industry will be profound, unveiling new patterns and outliers for asset managers and arbitrageurs as well as eroding current competitive advantages.
Financial services, with its substantial resources, heavy reliance on algorithms, large quantities of price data and ability to profit from speed, is well-suited to making use of quantum computing. A number of banks are already investing in the field.
Quantum computing threatens to make all existing systems of data encryption redundant. Methods at risk include those most commonly used today, which rely on the difficulty of factorising the products of large prime numbers or of computing discrete logarithms.
Hostile governments are reported to be archiving material they have hacked but not yet decrypted, in the expectation that quantum computing will help them to read it within two to six years. The case for adopting new methods of encryption immediately is strong.
A contest to develop new forms of encryption is currently being run by the National Institute of Standards and Technology (NIST). Quantum computing offers an encryption technique of its own, which generates matching keys from entangled particles. Because any interference physically alters the quantum state, it cannot be hacked without detection.
Quantum computers nevertheless face engineering problems, including the need to protect the qubits – which are fragile quantum states – from disturbances such as noise, vibration and the behaviour of adjacent qubits, in part by operating at extremely low temperatures.
A substantial engineering challenge is posed by the need to provide sufficient resources (additional qubits) to correct for errors. The present solution is to use hybrid models of quantum and classical computers to economise on qubit consumption.
Programming languages are now available which allow non-specialist developers to write code for quantum computers without needing to understand the physics by which the machine operates.
For investors, excitement about the potential of quantum computing has not yet translated into hype. There is sufficient uncertainty to deter ignorant enthusiasts, but enough progress to encourage specialist investors to back teams if not specific technical approaches.
Governments are investing in quantum computing, and not all of them have benign intentions. There is also a concern that quantum computing in tandem with artificial intelligence (AI) could give rise to artificial general intelligence (AGI) that imperils the survival of humanity.
Yet quantum computers also bring humanity face-to-face with the quantum universe. If there are not enough bits in the physical universe to accommodate the computations of a 300-qubit quantum computer, the computation must be taking place outside time and space.
Quantum computers have the potential to transform everything. In theory, their speed and performance makes it possible to run algorithms and simulations that are impossible with classical computers. In practice, the quantum computers that are being designed, built and operated today are already almost as powerful as the most powerful classical computers.
Quantum computers are on the cusp of overtaking classical computers
Quantum computers are already running financial algorithms at least as reliably as classical computers, and within the same fault-tolerance parameters. In fact, they are on the cusp of overtaking classical computers.
Once they clear that threshold, a major commercial impact on financial services is certain. This could happen within just two to three years, so it is now imperative for anyone working in financial services to understand what quantum computing is and what it could become.
A good starting point is to abandon straightforward comparisons with classical computers. Quantum computers embody the first conceptual advance in the processing and storage of information since Charles Babbage (1791-1871) designed his mechanical “difference engine” in 1822, the forerunner of today’s ones-and-zeroes machines.
Quantum computers rely not on the bits (which can be either a one or a zero only) used by classical computers but on qubits (which can be a one and a zero at the same time). Stranger still is their reliance on the entanglement of particles – a pure quantum effect for which there is no equivalent in classical physics. It means the value of one qubit can depend on the value of another qubit even if they are widely separated.
The fact that a qubit can be both a one and a zero at the same time is called “superposition,” the term physicists use to describe how a qubit can be in multiple states simultaneously. This gives quantum computers the power to complete calculations that a classical computer either cannot complete at all or cannot complete in a reasonable amount of time. Nor is this a theoretical possibility: in 2019 Google reported that its quantum computer had completed in minutes a task that would take the most powerful classical supercomputers thousands of years.
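Superposition and the rule that converts amplitudes into measurement outcomes can be illustrated with a toy statevector in plain Python – a deliberately simplified sketch, not a model of real hardware:

```python
import random

# A toy single-qubit statevector: amplitudes for |0> and |1>.
# An equal superposition has amplitude 1/sqrt(2) on each basis state.
amp0 = amp1 = 2 ** -0.5

# Born rule: the probability of each measurement outcome is the
# squared magnitude of its amplitude.
p0, p1 = abs(amp0) ** 2, abs(amp1) ** 2
assert abs(p0 + p1 - 1.0) < 1e-12  # probabilities must sum to one

def measure():
    """Collapse the superposition to a classical 0 or 1."""
    return 0 if random.random() < p0 else 1

# Repeated measurements yield roughly half zeros and half ones.
counts = [measure() for _ in range(10_000)]
print(sum(counts) / len(counts))  # close to 0.5
```

Until it is measured, the qubit is genuinely in both states; measurement forces it into one of them with the probabilities above.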
A 300-qubit quantum computer offers as many configurations as there are atoms in the universe
Quantum computing power grows exponentially, not linearly. Every time one qubit is added, the number of possible configurations doubles. One qubit permits two configurations; two qubits four configurations; three qubits eight configurations, and so on. At just 300 qubits, the number of configurations is equivalent to all the atoms in the universe, and a machine capable of handling several hundred qubits could be available as soon as this year (2021).
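The arithmetic behind that claim is easy to check in Python, taking the common order-of-magnitude estimate of 10^80 atoms in the observable universe:

```python
# Each added qubit doubles the number of basis-state configurations: 2**n.
for n in (1, 2, 3):
    print(n, 2 ** n)  # 2, 4, 8

# At 300 qubits the count exceeds the ~10**80 atoms commonly estimated
# to exist in the observable universe.
configurations = 2 ** 300
atoms_in_universe = 10 ** 80  # order-of-magnitude estimate
print(configurations > atoms_in_universe)  # True
```

2^300 is roughly 2 x 10^90, some ten billion times the estimated atom count.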
In principle, no classical computer assembled from all the matter in the universe could match that number of configurations. In other words, quantum computers are not incrementally faster or more powerful in the manner of an iPhone 12 by comparison with an iPhone 11. Instead, quantum computing enables humankind to tackle problems that were previously insoluble.
Quantum hardware is also developing faster than anticipated. There are several varieties of it. They include superconducting qubits (the basis of the quantum processing machines built by Google, IBM and Rigetti), ion-trap qubits (AQT, Honeywell and IonQ), photonic quantum computing (PsiQuantum and Xanadu), atom-spin quantum computing (UNSW, Sydney) and room-temperature qubits (Archer Materials).
Quantum computing power makes Moore’s Law look pedestrian
Quantum computing power is growing faster than Moore’s Law. Unlike classical computers, whose power doubles every 18 months, the power of quantum computers is doubling every 12 months or so. The current trajectories in quantum volume – which takes into account the quality and number of qubits as well as their coherence times – range from doubling every 12 months (IBM) to increasing by a factor of ten every year for the next five years (Honeywell).
If Honeywell is proved right, quantum computing power will increase 100,000 times by 2025. IonQ says it can deliver a quantum processor with a quantum volume of 4 million already. What is clear is that quantum computing is on a path to increase its processing power significantly over the next few years, so the prospect of a meaningful impact on industries and markets may be closer at hand than informed observers expected even a year ago.
In many industries well-known companies are already experimenting with quantum computing. The earliest sectors to be affected were agriculture and pharmaceuticals (Biogen, BASF, Bayer, DuPont, JSR and Merck), transport (Airbus, BMW, Bosch, Daimler, Nasa, VW) and information technology (Alibaba, Google, IBM, Microsoft and Samsung). So financial services are not necessarily going to lead the application of quantum computing.
Financial services are well-suited to transformation by quantum computing
However, financial services firms as various as Barclays, BBVA, Caixa, Citi, Goldman Sachs, ING, J.P. Morgan, Scotiabank and Wells Fargo are all reported to be investigating the competitive advantages of quantum computing. Banks are building in-house teams and launching collaborative ventures with specialist quantum computing houses.
The interest of banks in quantum computing is scarcely surprising. Unlike hardware engineering, where algorithms play a relatively small part in the value of the final product, in financial services algorithms are the primary determinant of value. In financial services, an improvement of even one tenth of one per cent can lead to massive rewards.
Financial markets also generate vast quantities of data, creating immense computational needs for participants in markets where speed confers massive advantages. Asset pricing tools which make use of Monte Carlo simulations, to take one example, can process calculations quadratically faster on a quantum computer. In other words, a quantum computer can, in just 1,000 simulations, achieve the same degree of accuracy as a classical computer manages in one million simulations.
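The quadratic speedup comes from quantum amplitude estimation, whose error shrinks as 1/N rather than the classical 1/√N. A Python sketch of the scaling (idealised, ignoring constant factors) shows why 1,000 quantum iterations can match one million classical samples:

```python
import math

# Classical Monte Carlo error shrinks as 1/sqrt(N) in the number of
# samples; quantum amplitude estimation shrinks as 1/N in the number
# of iterations. Constant factors are ignored in this idealised sketch.
def classical_error(n_samples):
    return 1 / math.sqrt(n_samples)

def quantum_error(n_iterations):
    return 1 / n_iterations

# 1,000 quantum iterations reach the accuracy of 1,000,000 classical
# samples under these scaling laws.
assert math.isclose(quantum_error(1_000), classical_error(1_000_000))

# Put another way: to halve the classical error you must quadruple the
# sample count, whereas the quantum error halves when iterations double.
print(classical_error(4_000_000) / classical_error(1_000_000))  # 0.5
print(quantum_error(2_000) / quantum_error(1_000))              # 0.5
```

Real devices add noise and overhead, so these are asymptotic comparisons rather than guarantees.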
Financial services firms also tend to have sufficient resources to invest as well as powerful incentives to do so. But it is unclear how many of the experiments conducted by banks are more than exploratory or moving beyond proofs-of-concept.
There is an obvious opportunity in arbitrage, where a great deal of money is being made by sophisticated asset managers. Quantum computing will erode the informational advantage they currently enjoy, empowering a new class of competitor. In fact, classic portfolio theory might well be displaced altogether by quantum computing.
Indeed, quantum computing is being used already in portfolio optimisation, where it is more adept at identifying outliers than classical computers. Quantum computers are also being used in credit scoring, and a considerable literature has developed on the application of quantum computing to the pricing of derivatives. Beyond trading and investing, fraud and financial crime are seen as areas where quantum computers can reduce current levels of loss.
Quantum computing threatens all existing systems of data encryption
Financial services firms are alive to the risks created by quantum computing too, particularly in its potential impact on present methods of cyber-security. At present, sensitive information is exchanged in encrypted formats which the recipient can unlock with a private key. Unhappily, quantum computers are in theory adept at unravelling the encryption formulas, and their power is increasing fast enough to make decryption a meaningful threat within the next few years.
The public key formulas vulnerable to decryption by quantum computers include RSA encryption (which relies on the difficulty of factorising the products of large prime numbers), the Diffie-Hellman key exchange and elliptic curve cryptography (both of which rely on the difficulty of computing discrete logarithms).
At present, these three encryption techniques account for 99 per cent of online encryption. There are reports of hostile governments archiving encrypted material already and waiting for their quantum computers to be powerful enough to decrypt them.
The longer the key, the more qubits are required to break an encryption code. As a result, it is tempting to believe that lengthening keys and refreshing them continuously can postpone a reckoning with quantum computers. However, because the processing power of quantum computers increases exponentially, even doubling the length of keys provides a marginal advantage only.
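Some illustrative arithmetic makes the point. Assume, purely hypothetically, that the qubit budget needed to break a key grows in proportion to its length while available qubit budgets double every 12 months; then doubling the key length buys only about one extra year:

```python
import math

# Illustrative arithmetic only. The linear qubits-per-key-bit factor and
# the starting qubit count below are hypothetical stand-ins, chosen to
# show the shape of the race, not a forecast.
def years_until_breakable(key_bits, qubits_today, qubits_per_key_bit=10):
    needed = key_bits * qubits_per_key_bit
    if qubits_today >= needed:
        return 0.0
    # With capacity doubling every 12 months, the catch-up time grows
    # only logarithmically in the shortfall.
    return math.log2(needed / qubits_today)

base = years_until_breakable(2048, qubits_today=1000)
doubled = years_until_breakable(4096, qubits_today=1000)
print(doubled - base)  # about 1.0: doubling the key buys roughly a year
```

Because the defender's key-lengthening is linear and the attacker's capacity growth is exponential, the gap closes on a logarithmic timescale.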
One estimate is that it will take 20 million qubits to break RSA encryption. The most optimistic projections of qubit volume (from Honeywell) indicate that point will be surpassed somewhere between 2023 and 2024. Even a less worrying projection holds that there is a one in six probability that quantum computers will break RSA encryption by 2027.
There is a strong case for adopting new encryption techniques immediately
Ultimately, what matters is the certainty that quantum computers can break existing encryption codes, and the near-certainty that more efficient methods than deploying 20 million qubits will be found. This is a strong argument for public and private organisations to adopt new methods of encryption immediately, especially if hostile regimes are archiving existing material already.
It helps that the United States cyber-security agency, the National Institute of Standards and Technology (NIST), is hosting a post-quantum cryptography contest to find alternative methods of encryption. The goal is to find techniques that can inter-operate with existing communications protocols but which are also invulnerable to hackers using quantum computers. On 22 July 2020 NIST announced that after two rounds it had whittled the submissions down to seven candidates.
The ability of any of these seven techniques to defeat the threat posed by quantum computing is as yet unknown. NIST does not expect to finish testing them before 2023. Worse, even if the new techniques do work, it could take as long as 20 years to upgrade the security systems of government agencies and large corporations to work with the new methods. That is why some organisations have embarked on upgrading their security systems already.
Quantum computing offers an encryption technique of its own
As it happens, quantum computing contains the promise of a cyber-security technique of its own, known as Quantum Key Distribution, which is capable of transmitting data securely through fibre optic cables or via satellites. The technique is being deployed in several countries already. In fact, China is reported to have launched a quantum communications satellite, as has a Singapore-based start-up.
Quantum Key Distribution makes use of entangled particles – where the value of one qubit depends on the value of another qubit even though they are widely separated – whose values are random, but random together.
If measured in the same direction by two independent devices capable of generating and receiving quantum states, they will always yield the same answer. The matching answers give the owners of the two devices a secret key known only to them which they can use to encrypt information.
If a hacker were able to intercept or replace one of the particles, the two independent devices would not yield the same answers, indicating to the holders that their system had been hacked. They can then either accept the risk and proceed or generate a new key. In other words, fundamental physics – the fact that interference changes the quantum state, making it detectable – renders it impossible for hackers to penetrate a system secured by quantum entanglement undetected.
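A toy simulation conveys the logic, if not the physics. The entangled_pair function below simply hands both parties the same random bit – a drastic simplification of Quantum Key Distribution, intended only to show how eavesdropping destroys the correlation:

```python
import random

def entangled_pair():
    """Both ends of an entangled pair yield the same random bit when
    measured the same way (a toy stand-in for the real physics)."""
    bit = random.randint(0, 1)
    return bit, bit

def exchange_key(length, eavesdropper=False):
    alice, bob = [], []
    for _ in range(length):
        a, b = entangled_pair()
        if eavesdropper:
            # An intercepted and replaced particle loses the correlation:
            # Bob's result becomes an independent random bit.
            b = random.randint(0, 1)
        alice.append(a)
        bob.append(b)
    return alice, bob

# Undisturbed, the two lists match, giving both parties a shared secret.
clean_a, clean_b = exchange_key(256)
print(clean_a == clean_b)  # True

# With an eavesdropper, comparing a sample of bits reveals mismatches,
# telling the parties to discard the key.
a, b = exchange_key(256, eavesdropper=True)
print(a == b)  # almost certainly False
```

In a real protocol the parties sacrifice a random subset of bits to test for mismatches and keep the rest as the key.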
Quantum computers pose engineering problems that are still to be resolved
Nevertheless, quantum computing still faces formidable engineering problems. A qubit is a quantum state, and a quantum state is a fragile, even delicate, state of matter. Any kind of noise can disturb it and break the superposition.
Since it is the superposition which is responsible for the exponential processing power of quantum computers – and therefore the acceleration of the completion of the algorithm being processed – the exclusion of disturbances, or the ability to correct errors caused by disturbances, is crucial to a reliable process of computation.
Classical computers also commit errors but correct them by simple means, such as storing several redundant copies of every bit and recovering the original value by majority vote.
In a quantum computer, by contrast, qubits are entangled, so each qubit requires not three qubits for error correction but nine. Because of the additional problem created by external disturbances the rule of thumb in quantum computing is that it takes 50 qubits to get one error-free qubit.
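The classical baseline is easy to sketch: a repetition code stores each bit several times and recovers it by majority vote, correcting any single flipped copy:

```python
from collections import Counter

# Classical repetition code: store each logical bit as three copies and
# recover it by majority vote, which corrects any single bit-flip.
def encode(bit):
    return [bit, bit, bit]

def decode(copies):
    return Counter(copies).most_common(1)[0][0]

codeword = encode(1)
codeword[0] ^= 1          # a noise event flips one copy: [0, 1, 1]
print(decode(codeword))   # majority vote still recovers 1
```

No comparably cheap trick exists for qubits, because quantum states cannot be copied outright; hence the far heavier overheads described above.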
The fact that errors occur more often than in classical computing creates an obvious engineering challenge: error correction requires more resources than the computation itself. The reason that quantum computing engineers rely on quantum volume as the measure of progress is that they have to trade qubits for computation off against qubits for error correction to arrive at an acceptable level of fault-tolerant calculation.
To optimise the machine requires minimisation of external disturbances through exclusion of noise and vibration and by reducing the temperature to extremely low levels. Indeed, this has prompted concern that quantum computers will violate the pervasive environmental, social and governance (ESG) mandates under which most banks and asset managers now labour.
There are other engineering obstacles. Keeping qubits separated on chips so they do not disturb each other when an electromagnetic pulse is sent to alter the quantum state of one qubit but not the other is one. Managing the errors which occur in the software code as it is written and developed by human programmers is another.
It is possible to programme a quantum computer without a PhD in quantum mechanics
Writing code for quantum computers is a challenge in itself. To complete a simple calculation such as adding two numbers, each number has to be translated into a qubit. At every step in the algorithm a matrix that describes “addition” has to act on the qubit. A qubit then has to be chosen for measurement that can give the correct answer at some acceptable level of probability.
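The idea of a gate as a matrix acting on qubit amplitudes can be shown in plain Python, using the standard Hadamard gate as an example (no quantum SDK required):

```python
import math

# A quantum gate is a matrix acting on the vector of qubit amplitudes.
# The Hadamard gate turns |0> into an equal superposition.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    """Multiply a 2x2 gate matrix into a 2-amplitude statevector."""
    return [sum(gate[r][c] * state[c] for c in range(2)) for r in range(2)]

state = [1.0, 0.0]          # the qubit starts in |0>
state = apply(H, state)     # now an equal superposition
print(state)                # [0.707..., 0.707...]
state = apply(H, state)     # H is its own inverse...
print(state)                # ...so the qubit returns to |0>
```

A real program composes many such matrices across many entangled qubits, which is exactly the bookkeeping that higher-level tools exist to hide.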
Current programming languages, being based on logic gates, are too primitive to express this complexity directly. Likewise, few programmers will ever be sufficiently schooled in quantum mechanics to work at the level of the physics itself. Instead, the software is being separated from the hardware, in the same way that the two are separated in classical computers.
Firms writing software for quantum computers do not distinguish between the different hardware models but use compilers to select the most efficient way to realise their objectives on each of the underlying physical platforms.
Qiskit, an open-source software development kit for quantum computers, is now available. It allows users of a general-purpose programming language such as Python to create algorithms and circuits through familiar interfaces, so developers can already write quantum code without needing to understand the physics of the processing machine itself.
The consumption of qubits for error correction is being worked around rather than solved
The real problem in quantum computing is the consumption of qubits for error correction. The short-term solution is to live with the limitation. American researchers have dubbed this acceptance of flawed quantum processing machines Noisy Intermediate-Scale Quantum (NISQ) computing.
NISQ economises on qubits by dispensing with the objective of full error correction. One NISQ technique is to write hybrid algorithms that use quantum computers where they can be most beneficial, run the rest of the algorithm on classical computers, and then synthesise the results.
If this approach proves the main axis of progress, quantum computers will (at least in the near-term) be operated not on their own but in combination with classical computers. It is likened to adding a QPU to the CPU and the GPU. In this sense, NISQ parallels the way that artificial intelligence (AI) has so far augmented rather than replaced human intelligence.
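A hybrid loop of this kind can be caricatured in a few lines of Python. The “quantum” subroutine below is a noisy stand-in, not a real QPU call; the point is the division of labour, with a classical optimiser steering and synthesising the results:

```python
import random

# Hypothetical sketch of a NISQ-style hybrid loop: a classical optimiser
# proposes parameters, a simulated "quantum" subroutine returns a noisy
# cost, and the classical side keeps the best result it has seen.
def quantum_cost(theta):
    exact = (theta - 1.5) ** 2        # stand-in objective, minimum at 1.5
    noise = random.gauss(0, 0.01)     # NISQ hardware evaluations are noisy
    return exact + noise

def classical_optimiser(steps=200):
    best_theta, best_cost = 0.0, quantum_cost(0.0)
    for _ in range(steps):
        candidate = best_theta + random.gauss(0, 0.2)  # classical proposal
        cost = quantum_cost(candidate)                 # "quantum" evaluation
        if cost < best_cost:
            best_theta, best_cost = candidate, cost
    return best_theta

print(classical_optimiser())  # converges near the minimum at 1.5
```

The expensive, noisy evaluations are the only part delegated to the quantum side, which is how hybrid algorithms economise on scarce qubits.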
Integrating quantum computers with classical computers may present a further engineering challenge, but lack of sufficient bandwidth to carry the information flows between machines is not likely to be a problem. Users of quantum computers will tend to access them as another Cloud-based resource, so network connectivity is unlikely to be a constraint.
But in the end NISQ is a way of skirting the engineering challenges, rather than confronting them. That is understandable, since none of the challenges is trivial. That is evident in the multiplicity of approaches being tried in terms of techniques and physical materials: superconducting circuits, ion traps, photonics and so on.
Investors in quantum computing back the people not the technicalities
For early users of quantum computers, this does not matter, provided the processing engine does useful work, since they are not invested in the technology in the same way as its developers – or investors. But it does pose a conundrum for early-stage investors in quantum computing: the technology could change the world, but it is impossible to know which of the various approaches will work.
The outcomes of any bets they do place are likely (no pun intended) to be binary. Yet they cover this risk not by spreading their bets through investing in every single approach but by investing in teams of people. Their expectation is that a competent team might start with the wrong model but can pivot to the most successful alternative relatively easily.
Competing models are oddly reassuring for investors. The uncertainty and complication of the technology is sufficient to deter the walls of money that drive up valuations in a classical Hype Cycle, such as that which drove the blockchain bubble in 2017-18.
No more than 250 investable companies around the world are working on quantum computing at the moment, and no more than 60 in Europe. There are no funds dedicated specifically to quantum computing. The industry has attracted perhaps US$100 million in investment so far, which is far below Hype Cycle levels.
Quantum computers have attracted the attention of hostile states
A more worrying question is which governments are investing in quantum computing, and to what ends. Clearly, any government which gained possession of a technology capable of breaking current methods of encryption would be in a powerful position.
Since quantum computing technology is widely available and does not require billions of dollars or a sophisticated infrastructure to support it, the risk of hostile governments using it for geopolitical purposes is a real one.
Quantum computing plus AI could lead to an AGI that threatens humanity
The links between quantum computing and AI portend something darker still. AI is about recognising patterns in large databases, and quantum computing is powerful enough to identify patterns which would otherwise remain invisible. Machine learning computers are already rewriting algorithms in ways human developers cannot understand, and the machines cannot communicate how they did their work.
The concern is that a combination of AI and quantum computing could give rise to Artificial General Intelligence (AGI), which passes out of human control altogether and chooses its own tasks and makes its own decisions. A weaponised robot armed with AGI, for example, would have the capability of deciding on the battlefield who lives and who dies. It is impossible to predict what an AGI will do.
Quantum computers perform their calculations outside time and space
On the other hand, quantum computing is a spur to the imagination. As Richard Feynman argued in his 1981 lecture Simulating Physics with Computers, a computer that simulates nature faithfully must simulate quantum mechanics. The fact that a 300-qubit quantum computer commands more configurations than there are atoms in the universe, and that such a machine is near at hand, means computing is now on the threshold of doing exactly that.
If there are not enough bits in the physical universe to accommodate the computations of a 300-qubit quantum computer, yet it still completes computations, it means that nobody knows where the computation in a machine of that capacity is actually happening. Quantum mechanics has an explanation for this. It is that the computation is taking place outside space and time (in what is known as the Hilbert Space).
When the quantum universe first disclosed itself just over 100 years ago, physicists were reluctant to believe the implication that nature was not deterministic but probabilistic. The assumption was that science had not yet discovered all the variables necessary to make sound predictions.
Since the Northern Irish physicist John Stewart Bell proved, on the basis of quantum entanglement, that hidden variables cannot explain all the predictions of quantum mechanics, science has had to accept that nature is indeed inherently probabilistic.
The reason quantum computers must trade speed for accuracy, use qubits for error correction and run algorithms more than once to arrive at the correct answer is that the quantum universe is probabilistic. That machines are now being built on the foundation of that extremely disconcerting discovery is proof, if any is needed, that even the most unpromising of scientific insights is capable of commercial application.
Questions to be addressed at Quantum Computing II
1. Is the performance of quantum computers still on track to overtake classical computers?
2. Has the threat to current encryption techniques accelerated or receded?
3. Is Quantum Key Distribution a practical possibility?
4. How are financial services firms looking to deploy quantum computing?
5. Are hybrid quantum and classical computer models still seen as the way forward?
6. What progress is being made in insulating qubits from disturbance?
7. Which of the various techniques – superconducting, ion trap, photonics, atom spin and room temperature qubits – is looking most promising?
8. Has venture capital investment in quantum computing increased?
If you would like to participate as a panellist please contact Wendy Gallagher at email@example.com
If you would like to participate in the audience please let us know below or contact Wendy Gallagher on the email above
If you would like to participate as a sponsor please contact Valerie Bassigny on firstname.lastname@example.org