Of the various areas of technological innovation being explored in 2016, quantum computing has had a particularly confusing journey, and it threatens to force significant changes across numerous fields. Yet for all the consternation over what the fallout of widespread adoption of quantum computers will be, the field's path from theory to prototype to market has been tortuously slow. Compare this to the development of the cloud: companies like Amazon began selling the use of their data centers in the mid-2000s, and by 2010 the cloud was a ubiquitous feature of the modern computing landscape. Even virtual reality has had a faster journey to market since the first Oculus prototype appeared in 2012. So what factors have constrained the development of fully functional quantum computers? Moreover, what will quantum computers be able to do, and how will we use them, when they finally arrive for consumer and business use?
First, it is necessary to understand what quantum computing is and what it can become, as the concept can be hard to grasp even for computer scientists. In its simplest form, quantum computing refers to the practice of using phenomena from quantum physics to perform computation. In particular, quantum computers rely on the concepts of superposition and entanglement to perform computations that are impossible or extremely difficult for classical computers. Unlike classical computers, which store information and compute on arrays of distinct bits that are either one or zero (on or off), quantum computers use arrays of quantum bits, or qubits, which can be both one and zero until observed, a phenomenon known as superposition. Additionally, when a system of qubits is set up in a particular manner, the qubits can become entangled, which allows for the storage of more information than would be possible with a group of classical bits of the same size (1). This combination of superposition and entanglement allows quantum computers to perform operations on a range of inputs in parallel on the same group of qubits (1). This unique variety of computing can be applied to a number of problems for which classical computers are poorly suited. For one, quantum computers may be much better at simulating complex systems like the folding of proteins, as they can operate on all inputs before giving the result for the correct one (1). More ominously, however, quantum computing has the potential to break internet security by compromising certain commonly used encryption schemes. For example, the widely used public-key encryption scheme RSA is threatened by quantum computers because they can factor numbers orders of magnitude faster than traditional computers. This is a problem because RSA relies on the premise that large numbers are hard to factor: if factoring were easy, attackers could use publicly shared keys to derive the private keys needed to decrypt encrypted messages.
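To see concretely why fast factoring breaks RSA, consider a toy sketch of the scheme in Python. The primes and exponents here are deliberately tiny, chosen purely for illustration; real RSA keys use primes hundreds of digits long, which is precisely what makes them safe from classical factoring.

```python
# Toy RSA with deliberately tiny, illustrative primes.
# Real keys use primes hundreds of digits long.
p, q = 61, 53                # secret primes
n = p * q                    # public modulus: 3233
e = 17                       # public exponent
phi = (p - 1) * (q - 1)      # totient, computable only if you know p and q
d = pow(e, -1, phi)          # private exponent (modular inverse, Python 3.8+)

message = 42
ciphertext = pow(message, e, n)          # anyone can encrypt with (n, e)
assert pow(ciphertext, d, n) == message  # only the holder of d can decrypt

# An attacker who can factor n recovers p and q, and with them the private key:
p_found = next(f for f in range(2, n) if n % f == 0)
q_found = n // p_found
d_cracked = pow(e, -1, (p_found - 1) * (q_found - 1))
assert pow(ciphertext, d_cracked, n) == message  # the scheme is broken
```

At real key sizes the brute-force search in the last step is hopeless for classical machines, but a sufficiently large quantum computer running Shor's algorithm could perform the factoring step efficiently.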
It should be noted that while quantum computers have the potential to break encryption schemes like RSA, the current generation of quantum computers is not large or powerful enough to pose a threat to them. The largest number known to have been factored by a quantum computer, 56153, has 64 to 128 times fewer bits than the numbers used in RSA encryption keys (2). While it may be a relief that quantum computers are not yet particularly large or powerful, it is also perplexing how weak they remain compared to their classical counterparts in terms of processing power. Part of this can be explained by the fact that quantum computers are still fairly new: the first working quantum computers were developed less than two decades ago. By comparison, it took more than 30 years for classical computers to go from prototypes like the ENIAC and MANIAC I to widely available consumer models like the TRS-80 and Apple II.
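To put that record in perspective, 56153 (the number is from the source above; the method here is just ordinary trial division) yields almost instantly to the crudest classical factoring approach:

```python
# The quantum factoring record cited above, 56153, falls to brute-force
# trial division almost instantly on any classical machine.
def trial_factor(n):
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n is prime

print(trial_factor(56153))  # → (233, 241)
```

A number that any laptop can factor in microseconds is still the state of the art for quantum hardware, which underlines how far the field has to go before RSA-sized moduli are at risk.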
While the slow growth in power and lack of widespread adoption of quantum computers may be explainable by the relative newness of the field, a few factors are likely to hamper growth in quantum processing power in the coming years. One hurdle in the campaign to create more powerful quantum computers is the difficulty of maintaining quantum effects in larger systems. It becomes harder to keep a quantum computer's qubits entangled and in a state of superposition as more qubits are added, since each addition increases the chance that outside particles will interact with the system, forcing it into a definite state and destroying the advantages that quantum effects provide. In addition to the problem of scalability, quantum computers are also much harder to build than classical ones. A qubit can theoretically be represented by any system that can occupy two states. In practice, however, most quantum computers represent qubits using photons, flaws in the carbon lattices of diamonds, or the valence electrons of phosphorus atoms, all of which pose problems for cheaply manufacturing qubits that can be placed on a computer chip. Quantum processors also suffer from the fact that it is harder to transfer information between them and memory. In the processor of a classical computer, incoming and outgoing information is essentially just electrical current (or the lack thereof); qubits built from magnetic fields or microscopic particles require mechanisms for detecting and setting state that are significantly more complex and expensive.
Do all of these hurdles mean that striving to build quantum computers is a fruitless pursuit? Not necessarily. While progress toward building bigger, more powerful quantum computers has been slow, it is still being made. Just last year, D-Wave Systems revealed the first quantum processor with more than 1,000 qubits (3). While a far cry from the billions of bits that can be stored by a typical computer, it represents an impressive leap from the 3-qubit quantum computers created in the late 90s and early 2000s. So what will the future of quantum computers look like? If the history of classical computers is any indication, one would expect governments, universities, and narrowly specialized business interests to make use of the current generation of quantum computers, which are still quite bulky and require considerable space and resources. As hardware manufacturers become better at building components and more programmers learn about and develop for quantum computers, these machines may eventually find use in the hands of average consumers and businesses. It is still uncertain how average users will access quantum computers. Since qubits require temperatures near absolute zero and high levels of stability and isolation to effectively exploit quantum effects, it may be untenable to perform quantum computations on a personal computer. Barring developments that allow quantum computers to function at room temperature in everyday conditions, it seems more likely that consumers will access quantum computers through the cloud. Rather than trying to build a quantum chip that can be integrated into personal computers, it may be easier to set up data centers with thousands of quantum computers whose use can be rented and allocated to users as needed. We are already seeing groups like IBM giving users access to quantum computers by offering remote access to their systems (4).
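One way to appreciate why the climb from 3 qubits to 1,000 matters is to note that classically simulating n fully entangled qubits requires storing 2^n complex amplitudes, so the state space doubles with every qubit added. The sketch below assumes 16 bytes per amplitude (a standard double-precision complex number); the specific qubit counts are illustrative, not drawn from the article's sources.

```python
# Memory needed to hold the full state vector of n entangled qubits,
# assuming 16 bytes per complex amplitude (double-precision real + imaginary).
def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

for n in (3, 30, 50):
    print(f"{n:>3} qubits -> {statevector_bytes(n):,} bytes")
```

At 3 qubits the state fits in 128 bytes; by around 50 qubits it exceeds the memory of the largest supercomputers, which is why even modest gains in qubit count can put quantum hardware beyond classical simulation.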
(1) Altepeter, Joseph B. "A Tale of Two Qubits: How Quantum Computers Work." Ars Technica. Conde Nast, 18 Jan. 2010. Web. 20 July 2016.
(2) "The Mathematical Trick That Helped Smash The Record For The Largest Number Ever Factorised By A..." Medium. A Medium Corporation, 02 Dec. 2014. Web. 22 July 2016.
(3) D-Wave Systems. "D-Wave Systems Breaks the 1000 Qubit Quantum Computing Barrier." Dwavesys. D-Wave Systems, 22 June 2015. Web. 21 July 2016.
(4) IBM. IBM Makes Quantum Computing Available on IBM Cloud to Accelerate Innovation. IBM.com. International Business Machines Corporation, 4 May 2016. Web. 21 July 2016.
Image: © Welcomia | Dreamstime.com - <a href="https://www.dreamstime.com/royalty-free-stock-photos-nano-technology-image29230388#res14972580">Nano Technology</a>