Quantum Computing: The Next Computer Revolution
Dec 09, 2019
The history of computing shows how miraculous modern computers are compared to their earliest predecessors. Computers went from giant, room-sized machines to tiny microchips that fit in your pocket. Modern computers can do amazing things, especially in the domain of data analytics, and computation is continually becoming cheaper and more powerful. That said, there are types of problems modern computers will never be able to solve. Those involved in scientific research or data analysis often brush up against problems that are intractable for classical computers. To solve these kinds of problems, you need a different kind of computer. Namely, a quantum computer.
Take quantum physics research, for example. Classical computers cannot model all the variables present in quantum systems, so they settle for estimations. A computer built from a quantum system, however, can model all those variables. This is important for chemistry, as well as materials science. Outside of science, there are optimization and simulation problems that involve analyzing so many combinations of variables that it would take impractical amounts of time for even the most powerful classical computers to solve them. These kinds of problems arise in finance, supply chains, artificial intelligence and machine learning, as well as climate and weather science. Modern encryption is largely based on the idea that classical computers cannot easily find the factors of large numbers. Quantum computers can run algorithms that solve these types of problems exponentially faster than classical computers, making them solvable in a practical time frame. As an example, Google recently used a quantum computer to perform, in 200 seconds, a computation that would take the world's fastest supercomputer 10,000 years to perform.
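To make the factoring example concrete, here is a minimal sketch of why factoring is hard classically. The function and the use of trial division are purely illustrative, not a description of how any real encryption scheme or attack works:

```python
def trial_division(n):
    """Naive factoring: try every divisor up to the square root of n.

    The work grows roughly with sqrt(n), so each extra digit in n
    multiplies the running time. For the hundreds-of-digits numbers
    used in modern encryption, this becomes hopelessly slow.
    """
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d  # found a factor pair
        d += 1
    return n, 1  # n is prime

# Toy semiprimes factor instantly; real encryption keys do not.
print(trial_division(15))    # -> (3, 5)
print(trial_division(2021))  # -> (43, 47)
```

A quantum computer running a factoring algorithm would, in effect, sidestep this divisor-by-divisor search, which is what makes the exponential speedup possible.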
How does quantum computing work? Without getting too technical, it takes advantage of the way matter behaves at the smallest level. Particles such as photons can exist in multiple states at once (called superposition), and they can become entangled with other particles such that the properties of one particle will tell you the properties of another, regardless of distance (called entanglement). While a classical computer must process information coded in bits of either one or zero, a quantum bit (called a qubit) can process a zero and a one simultaneously. This allows for massive parallel processing: two qubits can process four combinations of zeros and ones simultaneously, three qubits can process eight, and so on, with the number of combinations doubling with every qubit added. So a sufficiently large quantum computer allows the user to process as many combinations of zeros and ones as desired, all simultaneously. The exponential increase in combinations adds up very fast; only 300 qubits are needed to represent more states than there are particles in the observable universe.
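The doubling described above can be sketched in a few lines. This is a toy classical simulation under simplifying assumptions (the function name is mine, and the state is just an equal superposition, as if every qubit had been put through a Hadamard gate); it shows why simulating many qubits classically blows up:

```python
import itertools

def uniform_superposition(n_qubits):
    """State of n qubits, each in an equal superposition of 0 and 1.

    Represented as a dict mapping every basis bit-string to its
    amplitude. The dict has 2**n_qubits entries: each added qubit
    doubles the number of amplitudes a classical computer must track.
    """
    amp = 2 ** (-n_qubits / 2)  # equal amplitude for every outcome
    return {bits: amp for bits in itertools.product('01', repeat=n_qubits)}

for n in (1, 2, 3, 10):
    print(n, 'qubits ->', len(uniform_superposition(n)), 'amplitudes')

# 300 qubits would require 2**300 amplitudes, more than the estimated
# number of particles in the observable universe (roughly 10**80).
print(2 ** 300 > 10 ** 80)
```

A quantum computer holds all of those amplitudes physically in its qubits at once, which is exactly what a classical simulation cannot do at scale.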
There is still more progress to make before quantum computers become practical, however. A main limiting factor is the sensitivity of the quantum phenomena that make them possible. Superposition is extremely fragile; even an interaction with a stray photon will cause it to collapse. Entanglement becomes extremely hard to maintain as the number of qubits you are trying to entangle grows. That said, it is time to start preparing for the coming revolution. Physics and chemistry research is already taking advantage of quantum computers to better simulate the interactions between small particles. Cryptographic methods robust to the capabilities of quantum computers are already being developed for security. Very basic quantum computers are being used to improve traffic management systems. What remains to be seen is just what technologies quantum computers will power in the future.