In August 2016, China’s first, and also the world’s first, quantum communication satellite ‘MoZi’ was launched. In June 2017, the world’s first space-to-ground quantum information experiment, the separation of a pair of entangled photons over a distance of 1,200 km, was realised through the satellite. The first quantum computer was sold by the Canadian company D-Wave at a price tag of USD 10 million in 2011. Google established its quantum AI lab in 2013. IBM built a 50-qubit quantum computer accessible through the cloud in 2017. Over the past decade, the topics of quantum information and quantum computation have appeared constantly in the media; but what do they really mean? Will we eventually stop using our current computers and switch to quantum computers?
Algorithms and Parallel Computing
To answer these questions, we have to begin our story in the last century. With the invention of the transistor in 1947 and the integrated circuit in the 1950s, the power of computers increased greatly. Computer scientists used this newly acquired power to solve problems beyond the calculation capacity of humans, such as factoring large integers or encrypting messages using special mathematical functions. To reduce the computational complexity of these tasks and thus raise the computing speed, scientists proposed what they called ‘probabilistic algorithms’ to compress the computing time from days to minutes by increasing the degree of parallelism in the algorithms.
To appreciate how probabilistic algorithms differ from regular algorithms, consider a specific example: finding the factors of 14. Besides the trivial factors 1 and 14, the most naive way to solve the problem is to try dividing 14 by each of the numbers from 2 up to 7, one by one. If the division leaves a remainder of 0, then the number in question is a factor. Computers are good at trying things one by one, so it is easy to figure out that 2 and 7 are the only two nontrivial factors of 14, and such brute-force methods are also easy to program. However, we soon realise that brute-force methods are only practical for integers up to around 100,000 or 1,000,000, even with a powerful computer. If we have to deal with an integer with, say, 14 digits, we hit a brick wall, because there are simply too many candidate divisors to try.
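As a rough illustration (the function name find_factors below is my own, not from the article), a minimal Python sketch of this trial-division idea might look like this:

    # Trial division: try every candidate divisor from 2 up to n // 2
    # and keep those that divide n with a remainder of 0.
    def find_factors(n):
        factors = []
        for d in range(2, n // 2 + 1):
            if n % d == 0:
                factors.append(d)
        return factors

    print(find_factors(14))  # prints [2, 7]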
Parallel computing is our saviour. Suppose we use two computers to factor 14, one trying the candidate divisors from 2 to 4 and the other from 5 to 7; the computing time is immediately halved. In general, if the processing unit of a computer integrates N computing cores, the computing time is reduced to 1/N of the original. Nonetheless, this kind of speed-up is still not efficient enough for factoring a very large integer.
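A hedged sketch of this two-computer idea, simulated here with two worker processes on one machine (the helper try_range and the split points are illustrative), could read:

    from multiprocessing import Pool

    # Each worker tries one slice of the candidate divisors of n.
    def try_range(args):
        n, start, stop = args
        return [d for d in range(start, stop + 1) if n % d == 0]

    if __name__ == "__main__":
        with Pool(2) as pool:  # two 'computers', each taking half the range
            parts = pool.map(try_range, [(14, 2, 4), (14, 5, 7)])
        print(sorted(parts[0] + parts[1]))  # prints [2, 7]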
Yet if we are allowed to try the candidate factors in a random order, the efficiency of finding a factor can, probabilistically speaking, be greatly improved. For instance, there are six numbers 2, 3, 4, 5, 6, 7 between 2 and 7. If we divide 14 not sequentially but by picking candidates from these six numbers at random, we may be lucky: the first one we pick could be 7 and the second one 2. In that case, we do not even need to try 3, 4, 5, and 6 before we have found all the nontrivial factors of 14. This is the motivation behind the so-called ‘probabilistic algorithms’.
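A small sketch of this random-order search (purely illustrative; for the demonstration we tell the routine the true factors {2, 7} so it can report how early a random order finds them) might be:

    import random

    # Try candidate divisors of n in a random order and count how many
    # divisions are needed before every nontrivial factor has been found.
    def trials_until_all_factors(n, true_factors):
        candidates = list(range(2, n // 2 + 1))
        random.shuffle(candidates)
        found = set()
        for trials, d in enumerate(candidates, start=1):
            if n % d == 0:
                found.add(d)
            if found == true_factors:
                return trials
        return len(candidates)

    # A lucky order such as [7, 2, ...] needs only 2 trials;
    # an unlucky order can still need all 6.
    print(trials_until_all_factors(14, {2, 7}))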
The Merits of Quantum Computing
Inspired by the concepts of parallel computing and probabilistic algorithms, Deutsch, C. C. Yao, and other computer scientists in the 1980s took the worldview of quantum mechanics, in which a system can remain in a superposition state and coexist simultaneously in two different physical configurations, and used it to invent the mathematical model of the quantum Turing machine. In other words, supposing every integer between 2 and 7 can be represented as a physical state, we then need only one quantum system to form their superposition and try out the candidate factors in parallel, without increasing the number of computing cores.
Meanwhile, since a probability is associated with each integer state within the collective superposition, these probabilities increase or decrease together as a quantum algorithm is executed, and the probabilities associated with the real factors of 14 gradually approach 1. The advantage of quantum computation is therefore that it reduces memory consumption while also lowering the time complexity of the computation. Shor of Bell Laboratories proposed just such a probabilistic integer-factoring algorithm designed specifically for quantum computers in 1995. At that point, however, quantum computation still existed only on paper.
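Shor’s actual algorithm relies on the quantum Fourier transform and is beyond the scope of this article, but the way probabilities concentrate on the right answers can be mimicked with a classical toy simulation of amplitude amplification (this follows a Grover-style amplification step, not Shor’s method, and is only meant to show probability flowing towards 2 and 7):

    # Classical toy: one amplitude per candidate factor of 14, starting in
    # a uniform 'superposition'. One amplification round flips the sign of
    # the true factors' amplitudes and reflects everything about the mean.
    candidates = [2, 3, 4, 5, 6, 7]
    amps = [1 / len(candidates) ** 0.5] * len(candidates)

    # 'Oracle': mark the true factors of 14 by a sign flip.
    amps = [-a if 14 % c == 0 else a for c, a in zip(candidates, amps)]
    # Inversion about the mean boosts the marked amplitudes.
    mean = sum(amps) / len(amps)
    amps = [2 * mean - a for a in amps]

    for c, a in zip(candidates, amps):
        print(c, round(a * a, 3))  # probability of measuring each candidate
    # After one round, 2 and 7 together carry about 93% of the probability.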
After the year 2000, quantum physicists including Girvin, Martinis, and Tsai successfully fabricated what they called superconducting quantum-bit (or qubit) systems by carefully manipulating the Josephson effect in superconducting materials. Thereafter, the superposition states necessary for quantum computation could be generated in a solid-state circuit in a controllable fashion. The Martinis group has implemented Shor’s algorithm to factor the integer 15 using a superconducting circuit comprising four qubits.
It may seem, then, that the making of the quantum computer has already succeeded and we can make a head start on commercialising it. In reality, there is a long road ahead before one can buy a personal quantum computer. The two most pressing problems are: (i) the effective data storage time in a qubit is not sufficiently long, being only on the scale of microseconds; and (ii) a scaling mechanism to share data across qubits does not yet exist. The latter is also the reason why we can only factor a small integer like 15 so far. One of the current research directions in our group at UM is therefore to make use of the properties of solitons in superconducting circuits to prolong the effective storage time, thus solving the first problem.
But even if the two aforementioned problems are completely solved, does that mean the mission to conquer quantum computation is accomplished? Not quite. Up to now we have only been using the principles of quantum mechanics to improve the algorithms for factoring, searching, and so on within the conceptual framework of conventional computers. Is it possible to employ the postulates of quantum mechanics to go beyond the given structure of Turing machines, making some tasks that were originally incomputable computable? We do not yet know the answer.
Source: umagazine Issue 19, UM WeChat
Author: Hou Ian
The author is an associate professor of the Institute of Applied Physics and Materials Engineering, University of Macau. He has established the cryogenic quantum computation laboratory and conducted research studies on quantum optics and quantum information processing using superconducting circuits. Through his research experience at the Institute of Theoretical Physics of the Chinese Academy of Sciences, he has developed an interest in astrophysics, and currently teaches a general education course on the mysteries of the universe.