Quantum computation utilizes quantum mechanical effects, such as superposition, entanglement, and interference, to perform computation. Classical computation relies on storing and manipulating binary digits (bits), generally implemented with silicon transistors. Quantum computers instead use quantum bits (qubits), which can take the value 0, 1, or a superposition of these two states. Leveraging superposition and entanglement to create states that scale exponentially with the number of qubits offers the potential for dramatic improvements in computational power compared to classical computers.
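As a minimal sketch of that exponential scaling (using NumPy; the variable names are illustrative, not from the source), the description of an n-qubit state is a vector of 2^n complex amplitudes, doubling with every qubit added:

```python
import numpy as np

# A single qubit state is a normalized 2-component complex vector.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # equal superposition of |0> and |1>

# Joint states of n qubits live in the tensor (Kronecker) product space,
# so the number of amplitudes needed doubles with every added qubit.
state = plus
for n in range(1, 6):
    print(f"{n} qubit(s): {state.size} amplitudes")  # 2, 4, 8, 16, 32
    state = np.kron(state, plus)
```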
A quantum mechanical model of a computer was first conceived by physicist Paul Benioff in 1980, inspired by a 1936 paper by Alan Turing. Richard Feynman, then a professor at Caltech, described how quantum computers could be used to simulate physical systems in a 1982 paper titled "Simulating Physics with Computers." David Deutsch developed these ideas further in 1985, describing a universal quantum computer and exploring its mathematical potential. In 1994, Peter Shor developed "Shor's algorithm," which allows a quantum computer to factor large numbers significantly faster than the best known classical algorithms.
Development continued through the 1990s and 2000s, with significant contributions from scientists such as David Wineland, Christopher Monroe, and Lov Grover, and from companies and institutions such as the Technical University of Munich, Los Alamos National Laboratory, IBM, D-Wave, and Google. On October 23, 2019, researchers at Google published a paper claiming they had achieved quantum supremacy using the Sycamore quantum computer. Quantum supremacy refers to solving a problem that a classical computer could not solve in a reasonable amount of time. Although some disputed whether Google had achieved true quantum supremacy, the result is widely seen as a significant breakthrough in quantum computation.
Current quantum computers struggle with errors in the form of noise and loss of quantum coherence, as well as engineering challenges. However, they have found applications in fields such as cryptography and drug development.
The quantum states required to make quantum computing feasible are easily destroyed by noise from unintended interactions with the environment or by the imperfect implementation of quantum logic operations. Producing scalable quantum computers requires an effective error correction method that can prepare logical states, apply logic gates, and detect and correct errors without introducing more errors into the system. An error correction method is described as fault-tolerant when a single error does not spread to create more errors, thereby reducing the overall number of errors present in the system.
A fault-tolerant quantum computer could process encoded quantum information without serious propagation of errors, in principle reliably performing arbitrarily long quantum computations. This requires that the average probability of error per gate be below a specific threshold, so that error correction can be applied effectively.
This result is known as the quantum threshold theorem. Computer scientist Scott Aaronson described it as follows:
The entire content of the Threshold Theorem is that you're correcting errors faster than they're created. That's the whole point, and the whole non-trivial thing that the theorem shows. That's the problem it solves.
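For illustration (this formula is an assumption drawn from standard treatments of concatenated codes, not from the text above): if the physical error rate per gate p lies below the threshold p_th, then k levels of code concatenation suppress the logical error rate doubly exponentially, so modest extra redundancy drives errors down very rapidly once below threshold:

```latex
% Standard concatenated-code scaling (hedged illustration, not from the source):
p_k = p_{\mathrm{th}} \left( \frac{p}{p_{\mathrm{th}}} \right)^{2^{k}},
\qquad p < p_{\mathrm{th}}
```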
Error correction in classical computing can be performed using redundancy, whereby data is stored multiple times and the copies are compared for discrepancies. In quantum computing, this type of error correction is not possible due to the no-cloning theorem, which forbids copying an arbitrary unknown quantum state. To overcome this limitation, quantum error correction spreads the information of a single logical qubit across multiple highly entangled physical qubits.
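As a point of contrast, here is a minimal sketch of the classical redundancy approach, a three-copy repetition code with majority voting (the encode/decode helpers are hypothetical names for illustration). The no-cloning theorem rules out this copy-and-compare strategy for arbitrary quantum states:

```python
# Classical redundancy sketch: store each bit three times and recover it by
# majority vote. The no-cloning theorem forbids this strategy for qubits.

def encode(bit: int) -> list[int]:
    """Store the bit redundantly as three copies."""
    return [bit, bit, bit]

def decode(copies: list[int]) -> int:
    """Recover the bit by majority vote; tolerates any single flipped copy."""
    return 1 if sum(copies) >= 2 else 0

codeword = encode(1)
codeword[0] ^= 1              # a single bit-flip error corrupts one copy
assert decode(codeword) == 1  # majority vote still recovers the original bit
```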
A physical qubit is a physical device that behaves as a two-state quantum system, whereas a logical qubit may be physical or abstract and performs as specified by a quantum algorithm or quantum circuit. It is logical qubits that perform the calculations for quantum computation.
The power of quantum computing derives from qubits existing in a superposition of the states 0 and 1 at the same time. Measuring the quantum state of a logical qubit to check for errors would destroy this superposition, collapsing the system to a definite state (0 or 1). Quantum error correction must therefore use additional physical qubits to infer whether an error has occurred without directly measuring the state of any data qubit.
Consider a logical qubit encoded into three physical qubits. If the three physical qubits are not all in the same state, it indicates an error has occurred.
Measuring any of these three qubits directly would destroy the superposition (they exist in a combination of both 0 and 1). Checking for an error therefore requires two additional "ancilla qubits" to compare the states of the physical qubits. By measuring these two ancilla qubits, one can determine whether the three physical qubits are in the same state and, if not, which one differs, without learning which state they are in and destroying the superposition.
In this scheme, ancilla qubit X compares physical qubits A and B, and ancilla qubit Y compares physical qubits B and C. Measuring X and Y, one can determine whether both comparisons match (no error), X does not match while Y does (qubit A has an error), neither X nor Y matches (qubit B, which is common to both ancilla measurements, has an error), or X matches while Y does not (qubit C has an error).
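The parity logic behind those two ancilla measurements can be sketched classically (a hypothetical illustration; the syndrome function and lookup table below are not from the source). Each syndrome bit is the XOR of the two qubits an ancilla compares, and the syndrome pair identifies which qubit flipped without revealing the encoded value:

```python
# Syndrome logic of the three-qubit bit-flip code, simulated classically.
# X compares qubits A and B; Y compares qubits B and C.

SYNDROME_TABLE = {
    (0, 0): None,  # both comparisons match: no error
    (1, 0): "A",   # X mismatch only: qubit A flipped
    (1, 1): "B",   # both mismatch: qubit B (shared by both comparisons) flipped
    (0, 1): "C",   # Y mismatch only: qubit C flipped
}

def syndrome(a: int, b: int, c: int) -> tuple[int, int]:
    """Parities the X and Y ancilla measurements would report for A, B, C."""
    return (a ^ b, b ^ c)

for flipped, qubits in [("none", (1, 1, 1)), ("A", (0, 1, 1)),
                        ("B", (1, 0, 1)), ("C", (1, 1, 0))]:
    print(f"flip={flipped:>4}  syndrome={syndrome(*qubits)}  "
          f"diagnosis={SYNDROME_TABLE[syndrome(*qubits)]}")
```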
This is a simplified example of quantum error correction that protects only against bit flips (the only possible error in classical computing). However, quantum computation also relies on each qubit's phase, and phase flips are another potential source of errors.
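A small NumPy sketch (illustrative, not from the source) of why phase errors are invisible to bit-value checks: a Pauli-Z phase flip changes the relative sign between amplitudes but leaves computational-basis measurement probabilities untouched:

```python
import numpy as np

# A phase-flip (Pauli Z) error changes the sign of the |1> amplitude. It leaves
# computational-basis measurement probabilities unchanged, so a code that only
# checks bit values (like the three-qubit example above) cannot detect it.
Z = np.array([[1, 0], [0, -1]], dtype=complex)

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # (|0> + |1>)/sqrt(2)
minus = Z @ plus                                      # (|0> - |1>)/sqrt(2)

print("before:", plus, " probabilities:", np.abs(plus) ** 2)
print("after :", minus, "probabilities:", np.abs(minus) ** 2)
# Measurement probabilities are identical (0.5, 0.5) in both cases, even though
# the relative phase, and hence the state, has changed.
```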
In October 2021, a group of researchers led by Chris Monroe of the University of Maryland achieved fault-tolerant control of a single error-corrected logical qubit. The team used a 13-qubit encoding known as the Bacon–Shor code, which spreads the quantum information of one logical qubit across the quantum states of nine ions and uses four ancilla qubits for error correction. The team showed that they could fault-tolerantly perform all the single-qubit operations necessary for quantum computing.
While this result demonstrates the feasibility of fault-tolerant quantum computers, realizing the advantages of error correction requires quantum computers with significantly more logical qubits (>100), and therefore roughly thirteen times that number of physical qubits.
- Quantum computing
- Quantum mechanics
- Quantum superposition
- Quantum entanglement
- Quantum non-locality
A qubit (quantum bit) is the quantum computing analog of a classical bit. Unlike classical bits, which take the value 0 or 1, qubits are two-level quantum systems that can take the value 0, 1, or a linear combination of both states, known as a quantum superposition.
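A minimal NumPy sketch of this definition (the amplitudes chosen here are arbitrary illustrations, not from the source): a qubit state is a normalized two-component complex vector, and the squared magnitudes of its amplitudes give the measurement probabilities:

```python
import numpy as np

# A qubit state |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
alpha, beta = np.sqrt(0.25), np.sqrt(0.75)
psi = np.array([alpha, beta], dtype=complex)

assert np.isclose(np.linalg.norm(psi), 1.0)  # valid states are normalized

# Born rule: measurement yields 0 with probability |alpha|^2 and 1 with |beta|^2.
probs = np.abs(psi) ** 2
print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}")  # P(0) = 0.25, P(1) = 0.75

# Simulating repeated measurements of freshly prepared qubits:
rng = np.random.default_rng(0)
outcomes = rng.choice([0, 1], size=10_000, p=probs.real)
print("observed frequency of 1:", outcomes.mean())  # ~0.75
```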
Quantum computing hardware requires qubits that can scale to the numbers needed to demonstrate computing advantages over classical machines (>10,000 qubits). A number of qubit systems are being researched for use in quantum computers:
- Neutral atom qubits
- Superconducting qubits
- Quantum dot qubits (spin qubit quantum computing)
- Trapped ion qubits
- Photonic qubits (linear optical quantum computing)
Performing local measurements on entangled systems can produce special correlations. Correlations that cannot be explained by any local hidden variable model are called non-local correlations, and they can be detected through violations of Bell inequalities. In short, if given correlations cannot be produced by classical means, they are non-local.
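A short NumPy sketch of the standard CHSH form of a Bell inequality (the measurement angles below are the textbook choices, assumed here for illustration): local hidden variable models bound the CHSH value by |S| ≤ 2, while the entangled singlet state reaches 2√2:

```python
import numpy as np

# CHSH test on the two-qubit singlet state. Measurement settings are angles in
# the X-Z plane; classical (local hidden variable) correlations obey |S| <= 2.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def obs(theta: float) -> np.ndarray:
    """Spin observable at angle theta in the X-Z plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet state (|01> - |10>)/sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(theta_a: float, theta_b: float) -> float:
    """Correlation <psi| A(theta_a) x B(theta_b) |psi>."""
    return float(np.real(psi.conj() @ np.kron(obs(theta_a), obs(theta_b)) @ psi))

a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"S = {S:.3f}")  # ~ -2.828 = -2*sqrt(2), exceeding the classical bound of 2
```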
- Quantum threshold theorem
- Quantum machine learning
- Quantum memory
- Quantum information
- Quantum photonics
- Integrated quantum photonics
- Quantum non-locality
- Quantum entanglement
Advances in deep learning are expected to improve understanding of quantum mechanics, while quantum computers are in turn expected to accelerate AI. Quantum computers have the potential to surpass conventional ones in machine learning tasks such as data pattern recognition.
Other applications of quantum computing:
- Cybersecurity
- Drug development
- Financial modeling
- Better batteries
- Cleaner fertilization
- Traffic optimization
- Weather forecasting and climate change
- Solar capture
- Electronic materials discovery