Quantum Computing

Index

1. History
2. Quantum information
3. Unitary operations
4. Quantum parallelism
5. Quantum programming
6. Gate array
7. Communications
8. Algorithms
9. Search problems
10. Engineering
QUANTUM COMPUTER
A quantum computer is a computer that exploits quantum
mechanical phenomena. On small scales, physical matter exhibits properties
of both particles and waves, and quantum computing leverages this behavior
using specialized hardware. Classical physics cannot explain the operation of
these quantum devices, and a scalable quantum computer could perform some
calculations exponentially faster[a] than any modern "classical" computer. In
particular, a large-scale quantum computer could break widely used encryption
schemes and aid physicists in performing physical simulations; however, the
current state of the art is largely experimental and impractical, with several
obstacles to useful applications.
The basic unit of information in quantum computing, the qubit (or "quantum
bit"), serves the same function as the bit in classical computing. However,
unlike a classical bit, which can be in one of two states (a binary digit), a qubit can
exist in a superposition of its two "basis" states, which loosely means that it is
in both states simultaneously. When measuring a qubit, the result is
a probabilistic output of a classical bit. If a quantum computer manipulates the
qubit in a particular way, wave interference effects can amplify the desired
measurement results. The design of quantum algorithms involves creating
procedures that allow a quantum computer to perform calculations efficiently
and quickly.
Physically engineering high-quality qubits has proven challenging. If a physical
qubit is not sufficiently isolated from its environment, it suffers from quantum
decoherence, introducing noise into calculations. National governments have
invested heavily in experimental research that aims to develop scalable qubits
with longer coherence times and lower error rates. Example implementations
include superconductors (which isolate an electrical current by
eliminating electrical resistance) and ion traps (which confine a single atomic
particle using electromagnetic fields).
In principle, a classical computer can solve the same computational problems
as a quantum computer, given enough time. Quantum advantage comes in the
form of time complexity rather than computability, and quantum complexity
theory shows that some quantum algorithms are exponentially more efficient
than the best known classical algorithms. A large-scale quantum computer
could in theory solve computational problems unsolvable by a classical
computer in any reasonable amount of time. While claims of such quantum
supremacy have drawn significant attention to the discipline, near-term
practical use cases remain limited.

History
For a chronological guide, see Timeline of quantum computing and
communication.

[Figure: A Mach–Zehnder interferometer shows that photons can exhibit wave-like interference.]
For many years, the fields of quantum mechanics and computer
science formed distinct academic communities. Modern quantum
theory developed in the 1920s to explain the wave–particle duality observed at
atomic scales, and digital computers emerged in the following decades to
replace human computers for tedious calculations. Both disciplines had
practical applications during World War II; computers played a major role
in wartime cryptography, and quantum physics was essential for the nuclear
physics used in the Manhattan Project.
As physicists applied quantum mechanical models to computational problems
and swapped digital bits for qubits, the fields of quantum mechanics and
computer science began to converge. In 1980, Paul Benioff introduced
the quantum Turing machine, which uses quantum theory to describe a
simplified computer. When digital computers became faster, physicists faced
an exponential increase in overhead when simulating quantum
dynamics, prompting Yuri Manin and Richard Feynman to independently
suggest that hardware based on quantum phenomena might be more efficient
for computer simulation. In a 1984 paper, Charles Bennett and Gilles
Brassard applied quantum theory to cryptography protocols and demonstrated
that quantum key distribution could enhance information security.
Quantum algorithms then emerged for solving oracle problems, such
as Deutsch's algorithm in 1985, the Bernstein–Vazirani algorithm in
1993, and Simon's algorithm in 1994. These algorithms did not solve practical
problems, but demonstrated mathematically that one could gain more
information by querying a black box with a quantum state in superposition,
sometimes referred to as quantum parallelism.

[Figure: Peter Shor (pictured here in 2017) showed in 1994 that a scalable quantum computer would be able to break RSA encryption.]
Peter Shor built on these results with his 1994 algorithm for breaking the
widely used RSA and Diffie–Hellman encryption protocols, which drew
significant attention to the field of quantum computing. In 1996, Grover's
algorithm established a quantum speedup for the widely
applicable unstructured search problem. The same year, Seth Lloyd proved that
quantum computers could simulate quantum systems without the exponential
overhead present in classical simulations, validating Feynman's 1982
conjecture.
Over the years, experimentalists have constructed small-scale quantum
computers using trapped ions and superconductors. In 1998, a two-qubit
quantum computer demonstrated the feasibility of the technology, and
subsequent experiments have increased the number of qubits and reduced
error rates.
In 2019, Google AI and NASA announced that they had achieved quantum
supremacy with a 54-qubit machine, performing a computation that they
claimed would be infeasible for any classical computer. However, the validity of this claim is still
being actively researched.
The threshold theorem shows how increasing the number of qubits can
mitigate errors, yet fully fault-tolerant quantum computing remains "a rather
distant dream".[33] According to some researchers, noisy intermediate-scale
quantum (NISQ) machines may have specialized uses in the near future,
but noise in quantum gates limits their reliability.
Investment in quantum computing research has increased in the public and
private sectors. As one consulting firm summarized:

... investment dollars are pouring in, and quantum-computing start-ups are
proliferating. ... While quantum computing promises to help businesses solve
problems that are beyond the reach and speed of conventional high-performance
computers, use cases are largely experimental and hypothetical at this early
stage.

From a business-management point of view, the potential applications of
quantum computing fall into four major categories: cybersecurity, data
analytics and artificial intelligence, optimization and simulation, and data
management and searching.
In December 2023, physicists, for the first time, reported the entanglement of
individual molecules, which may have significant applications in quantum
computing. Also in December 2023, scientists at Harvard University
successfully created "quantum circuits" that correct errors more efficiently than
alternative methods, which may potentially remove a major obstacle to
practical quantum computers. The Harvard research team was supported
by MIT, QuEra Computing, Caltech, and Princeton University and funded
by DARPA's Optimization with Noisy Intermediate-Scale Quantum devices
(ONISQ) program. Research efforts are ongoing to jumpstart quantum
computing through topological and photonic approaches as well.

Quantum information processing

See also: Introduction to quantum mechanics


Computer engineers typically describe a modern computer's operation in terms
of classical electrodynamics. Within these "classical" computers, some
components (such as semiconductors and random number generators) may
rely on quantum behavior, but these components are not isolated from their
environment, so any quantum information quickly decoheres.
While programmers may depend on probability theory when designing
a randomized algorithm, quantum mechanical notions
like superposition and interference are largely irrelevant for program analysis.
Quantum programs, in contrast, rely on precise control of coherent quantum
systems. Physicists describe these systems mathematically using linear
algebra. Complex numbers model probability
amplitudes, vectors model quantum states, and matrices model the operations
that can be performed on these states. Programming a quantum computer is
then a matter of composing operations in such a way that the resulting
program computes a useful result in theory and is implementable in practice.
As physicist Charlie Bennett describes the relationship between quantum and
classical computers,[44]
A classical computer is a quantum computer ... so we shouldn't be asking about
"where do quantum speedups come from?" We should say, "well, all computers
are quantum. ... Where do classical slowdowns come from?"
Quantum information

[Figure: Bloch sphere representation of a qubit. The state |ψ⟩ = α|0⟩ + β|1⟩ is a point on the surface of the sphere, partway between the poles.]
Just as the bit is the basic concept of classical information theory, the qubit is
the fundamental unit of quantum information. The same term qubit is used to
refer to an abstract mathematical model and to any physical system that is
represented by that model. A classical bit, by definition, exists in either of two
physical states, which can be denoted 0 and 1. A qubit is also described by a
state, and two states often written |0⟩ and |1⟩ serve as the quantum counterparts of
the classical states 0 and 1. However, the quantum states |0⟩ and |1⟩ belong to
a vector space, meaning that they can be multiplied by constants and added
together, and the result is again a valid quantum state. Such a combination is
known as a superposition of |0⟩ and |1⟩. A two-dimensional vector mathematically
represents a qubit state. Physicists typically use Dirac notation for quantum
mechanical linear algebra, writing |ψ⟩ ("ket psi") for a vector labeled ψ. Because a
qubit is a two-state system, any qubit state takes the form α|0⟩ + β|1⟩, where |0⟩ and |1⟩ are the
standard basis states,[b] and α and β are the probability amplitudes, which are in
general complex numbers. If either α or β is zero, the qubit is effectively a
classical bit; when both are nonzero, the qubit is in superposition. Such
a quantum state vector acts similarly to a (classical) probability vector, with
one key difference: unlike probabilities, probability amplitudes are not
necessarily positive numbers.[48] Negative amplitudes allow for destructive
wave interference.
When a qubit is measured in the standard basis, the result is a classical bit.
The Born rule describes the norm-squared correspondence between amplitudes
and probabilities: when measuring a qubit α|0⟩ + β|1⟩, the state collapses to |0⟩ with
probability |α|², or to |1⟩ with probability |β|². Any valid qubit state has coefficients α and β
such that |α|² + |β|² = 1. As an example, measuring the qubit (1/√2)|0⟩ + (1/√2)|1⟩ would produce either |0⟩ or |1⟩ with
equal probability.

Each additional qubit doubles the dimension of the state
space.[47] As an example, the vector (1/√2)|00⟩ + (1/√2)|01⟩ represents a two-qubit
state, a tensor product of the qubit |0⟩ with the qubit (1/√2)|0⟩ + (1/√2)|1⟩. This
vector inhabits a four-dimensional vector space spanned by the basis vectors
|00⟩, |01⟩, |10⟩, and |11⟩. The Bell state (1/√2)|00⟩ + (1/√2)|11⟩ is impossible to
decompose into the tensor product of two individual qubits; the two qubits
are entangled because their probability amplitudes are correlated. In general,
the vector space for an n-qubit system is 2^n-dimensional, and this makes it
challenging for a classical computer to simulate a quantum one: representing a
100-qubit system requires storing 2^100 classical values.
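The amplitude and tensor-product rules above can be checked numerically. The following is a minimal sketch using NumPy, with quantum states represented as plain arrays of amplitudes:

```python
import numpy as np

# Basis states |0> and |1> as column vectors.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# The superposition (1/sqrt(2))|0> + (1/sqrt(2))|1>.
psi = (ket0 + ket1) / np.sqrt(2)

# Born rule: measurement probabilities are squared amplitude magnitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]: |0> and |1> each with probability 1/2

# The two-qubit state (1/sqrt(2))|00> + (1/sqrt(2))|01>
# is the tensor (Kronecker) product of |0> with psi.
product = np.kron(ket0, psi)

# The Bell state (1/sqrt(2))|00> + (1/sqrt(2))|11> cannot be written as a
# tensor product of two single-qubit states: the qubits are entangled.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

# Each added qubit doubles the dimension: n qubits need 2**n amplitudes.
print(len(np.kron(bell, bell)))  # 16 amplitudes for a 4-qubit state
```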

Unitary operators
See also: Unitarity (physics)
The state of this one-qubit quantum memory can be manipulated by
applying quantum logic gates, analogous to how classical memory can be
manipulated with classical logic gates. One important gate for both classical
and quantum computation is the NOT gate, which can be represented by the matrix

X = [0 1; 1 0].

Mathematically, the application of such a logic gate to a quantum
state vector is modelled with matrix multiplication. Thus X|0⟩ = |1⟩ and X|1⟩ = |0⟩.

The mathematics of single qubit gates can be extended to operate on multi-qubit quantum
memories in two important ways. One way is simply to select a qubit and apply
that gate to the target qubit while leaving the remainder of the memory
unaffected. Another way is to apply the gate to its target only if another part of
the memory is in a desired state. These two choices can be illustrated using
another example. The possible states of a two-qubit quantum memory
are |00⟩, |01⟩, |10⟩, and |11⟩. The controlled NOT (CNOT) gate can then be represented using the
following matrix:

CNOT = [1 0 0 0; 0 1 0 0; 0 0 0 1; 0 0 1 0].

As a mathematical consequence of this definition, CNOT|10⟩ = |11⟩ and CNOT|11⟩ = |10⟩. In
other words, the CNOT applies a NOT gate (X from before) to the second qubit
if and only if the first qubit is in the state |1⟩. If the first qubit is |0⟩, nothing is done
to either qubit.

In summary, quantum computation can be described as a
network of quantum logic gates and measurements. However,
any measurement can be deferred to the end of quantum computation, though
this deferment may come at a computational cost, so most quantum
circuits depict a network consisting only of quantum logic gates and no
measurements.
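These gate definitions can be checked with a short NumPy sketch (the matrix names X and CNOT follow the text; the basis ordering |00⟩, |01⟩, |10⟩, |11⟩ is assumed):

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# NOT (Pauli-X) gate: swaps the amplitudes of |0> and |1>.
X = np.array([[0, 1],
              [1, 0]], dtype=complex)
assert np.allclose(X @ ket0, ket1)   # X|0> = |1>
assert np.allclose(X @ ket1, ket0)   # X|1> = |0>

# Two-qubit basis states, in the order |00>, |01>, |10>, |11>.
basis = {"00": np.kron(ket0, ket0), "01": np.kron(ket0, ket1),
         "10": np.kron(ket1, ket0), "11": np.kron(ket1, ket1)}

# CNOT: flip the second (target) qubit iff the first (control) is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
assert np.allclose(CNOT @ basis["10"], basis["11"])   # CNOT|10> = |11>
assert np.allclose(CNOT @ basis["11"], basis["10"])   # CNOT|11> = |10>
assert np.allclose(CNOT @ basis["00"], basis["00"])   # control |0>: no-op

# Applying X to only the second qubit of a two-qubit memory is the
# Kronecker product of the identity (first qubit) with X (second qubit).
X_on_second = np.kron(np.eye(2), X)
assert np.allclose(X_on_second @ basis["00"], basis["01"])
```

The Kronecker-product construction in the last step is the general recipe for the first extension described above: acting on one qubit while leaving the rest of the memory unaffected.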

Quantum parallelism
Quantum parallelism is the heuristic that quantum computers can be thought
of as evaluating a function for multiple input values simultaneously. This can
be achieved by preparing a quantum system in a superposition of input states,
and applying a unitary transformation that encodes the function to be
evaluated. The resulting state encodes the function's output values for all input
values in the superposition, allowing for the computation of multiple outputs
simultaneously. This property is key to the speedup of many quantum
algorithms. However, "parallelism" in this sense is insufficient to speed up a
computation, because the measurement at the end of the computation gives
only one value. To be useful, a quantum algorithm must also incorporate some
other conceptual ingredient.
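Deutsch's algorithm, mentioned earlier, is the simplest illustration of this point: a superposition query alone is not enough, and it is the final interference step that extracts a global property of f from a single evaluation. A minimal NumPy sketch (the helper names oracle and deutsch are illustrative, not from any particular library):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

def oracle(f):
    """The unitary U_f: |x>|y> -> |x>|y XOR f(x)>, as a 4x4 permutation."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    """Decide with ONE query whether f: {0,1} -> {0,1} is constant (0)
    or balanced (1)."""
    state = np.kron([1, 0], [0, 1])            # start in |0>|1>
    state = np.kron(H, H) @ state              # superpose both inputs
    state = oracle(f) @ state                  # one evaluation of f
    state = np.kron(H, np.eye(2)) @ state      # interference step
    p1 = np.sum(np.abs(state[2:]) ** 2)        # P(first qubit reads 1)
    return int(round(p1))

print(deutsch(lambda x: 0))       # 0: constant
print(deutsch(lambda x: 1 - x))   # 1: balanced
```

A classical algorithm needs two evaluations of f to answer this question; the quantum circuit queries U_f once on a superposition and uses interference to make the answer deterministic.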
Quantum programming

Further information: Quantum programming


There are a number of models of computation for quantum computing,
distinguished by the basic elements in which the computation is decomposed.
Gate array

[Figure: Construction of a Toffoli gate from more primitive gates.]


A quantum gate array decomposes computation into a sequence of few-
qubit quantum gates.
Any quantum computation (which is, in the above formalism, any unitary
matrix of size 2^n × 2^n over n qubits) can be represented as a network of
quantum logic gates from a fairly small family of gates. A choice of gate family
that enables this construction is known as a universal gate set, since a
computer that can run such circuits is a universal quantum computer. One
common such set includes all single-qubit gates as well as the CNOT gate from
above. This means any quantum computation can be performed by executing a
sequence of single-qubit gates together with CNOT gates. Though this gate set
is infinite, it can be replaced with a finite gate set by appealing to the Solovay–
Kitaev theorem. Implementations of Boolean functions using few-qubit
quantum gates have also been demonstrated.[51]
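As a small illustration of composing larger operations from such a set, three alternating CNOT gates multiply out to a SWAP of two qubits, which can be verified directly with NumPy (matrix names are illustrative):

```python
import numpy as np

# CNOT with qubit 0 as control (basis order |00>, |01>, |10>, |11>).
CNOT_01 = np.array([[1, 0, 0, 0],
                    [0, 1, 0, 0],
                    [0, 0, 0, 1],
                    [0, 0, 1, 0]])

# CNOT with qubit 1 as control.
CNOT_10 = np.array([[1, 0, 0, 0],
                    [0, 0, 0, 1],
                    [0, 0, 1, 0],
                    [0, 1, 0, 0]])

# Three alternating CNOTs compose to a SWAP of the two qubits.
SWAP = CNOT_01 @ CNOT_10 @ CNOT_01
expected = np.array([[1, 0, 0, 0],
                     [0, 0, 1, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1]])
assert np.array_equal(SWAP, expected)
```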
Measurement-based quantum computing

A measurement-based quantum computer decomposes computation into a
sequence of Bell state measurements and single-qubit quantum gates applied
to a highly entangled initial state (a cluster state), using a technique
called quantum gate teleportation.
Adiabatic quantum computing
An adiabatic quantum computer, based on quantum annealing, decomposes
computation into a slow continuous transformation of an initial Hamiltonian into
a final Hamiltonian, whose ground states contain the solution.
Neuromorphic quantum computing
Neuromorphic quantum computing (abbreviated as 'n.quantum computing') is
an unconventional type of computing that uses neuromorphic
computing to perform quantum operations. It has been suggested that quantum
algorithms, which are algorithms that run on a realistic model of quantum
computation, can be computed equally efficiently with neuromorphic quantum
computing. Both traditional quantum computing and neuromorphic quantum
computing are physics-based unconventional computing approaches and
do not follow the von Neumann architecture. Both
construct a system (a circuit) that represents the physical problem at hand,
and then leverage the respective physical properties of the system to seek a
"minimum". Neuromorphic quantum computing and quantum computing share
similar physical properties during computation.
Topological quantum computing
A topological quantum computer decomposes computation into the braiding
of anyons in a 2D lattice.[53]
Quantum Turing machine
A quantum Turing machine is the quantum analog of a Turing machine.[7] All of
these models of computation—quantum circuits,[54] one-way quantum
computation,[55] adiabatic quantum computation,[56] and topological quantum
computation[57]—have been shown to be equivalent to the quantum Turing
machine; given a perfect implementation of one such quantum computer, it
can simulate all the others with no more than polynomial overhead. This
equivalence need not hold for practical quantum computers, since the
overhead of simulation may be too large to be practical.
Quantum cryptography and cybersecurity
Quantum computing has significant potential applications in the fields of
cryptography and cybersecurity. Quantum cryptography, which relies on the
principles of quantum mechanics, offers the possibility of secure
communication channels that are resistant to eavesdropping. Quantum key
distribution (QKD) protocols, such as BB84, enable the secure exchange of
cryptographic keys between parties, ensuring the confidentiality and integrity
of communication. Moreover, quantum random number generators (QRNGs)
can produce high-quality random numbers, which are essential for secure
encryption.
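The sifting step of BB84 can be sketched classically. The toy model below assumes an ideal channel with no eavesdropper and no noise, so a basis match always reproduces Alice's bit; the function name bb84_sift is illustrative:

```python
import secrets

def bb84_sift(n_bits=256):
    """Toy BB84 sifting: Alice sends random bits encoded in random bases
    (0 = rectilinear, 1 = diagonal); Bob measures in random bases; both
    publicly compare bases and keep only the positions that match."""
    alice_bits  = [secrets.randbelow(2) for _ in range(n_bits)]
    alice_bases = [secrets.randbelow(2) for _ in range(n_bits)]
    bob_bases   = [secrets.randbelow(2) for _ in range(n_bits)]

    # With matching bases (and no eavesdropper or noise), Bob's measured
    # bit equals Alice's; mismatched bases yield a random, discarded bit.
    key = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
           if a == b]
    return key

key = bb84_sift()
print(len(key))   # roughly n_bits / 2 positions survive sifting
```

In the real protocol an eavesdropper measuring in the wrong basis disturbs the transmitted states, so Alice and Bob can detect interception by publicly comparing a random sample of the sifted key before using the rest.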
However, quantum computing also poses challenges to traditional
cryptographic systems. Shor's algorithm, a quantum algorithm for integer
factorization, could potentially break widely used public-key cryptography
schemes like RSA, which rely on the difficulty of factoring large numbers. Post-
quantum cryptography, which involves the development of cryptographic
algorithms that are resistant to attacks by both classical and quantum
computers, is an active area of research aimed at addressing this concern.
Ongoing research in quantum cryptography and post-quantum cryptography is
crucial for ensuring the security of communication and data in the face of
evolving quantum computing capabilities. Advances in these fields, such as the
development of new QKD protocols, the improvement of QRNGs, and the
standardization of post-quantum cryptographic algorithms, will play a key role
in maintaining the integrity and confidentiality of information in the quantum
era.[58]

Communication
Further information: Quantum information science
Quantum cryptography enables new ways to transmit data securely; for
example, quantum key distribution uses entangled quantum states to establish
secure cryptographic keys.[59] When a sender and receiver exchange quantum
states, they can guarantee that an adversary does not intercept the message,
as any unauthorized eavesdropper would disturb the delicate quantum system
and introduce a detectable change. With appropriate cryptographic protocols,
the sender and receiver can thus establish shared private information resistant
to eavesdropping.
Modern fiber-optic cables can transmit quantum information over relatively
short distances. Ongoing experimental research aims to develop more reliable
hardware (such as quantum repeaters), hoping to scale this technology to long-
distance quantum networks with end-to-end entanglement. Theoretically, this
could enable novel technological applications, such as distributed quantum
computing and enhanced quantum sensing.

Algorithms
Progress in finding quantum algorithms typically focuses on this quantum
circuit model, though exceptions like the quantum adiabatic algorithm exist.
Quantum algorithms can be roughly categorized by the type of speedup
achieved over corresponding classical algorithms.
Quantum algorithms that offer more than a polynomial speedup over the best-
known classical algorithm include Shor's algorithm for factoring and the related
quantum algorithms for computing discrete logarithms, solving Pell's equation,
and more generally solving the hidden subgroup problem for abelian finite
groups.[64] These algorithms depend on the primitive of the quantum Fourier
transform. No mathematical proof has been found that shows that an equally
fast classical algorithm cannot be discovered, but evidence suggests that this
is unlikely.[65] Certain oracle problems like Simon's problem and the Bernstein–
Vazirani problem do give provable speedups, though this is in the quantum
query model, which is a restricted model where lower bounds are much easier
to prove and doesn't necessarily translate to speedups for practical problems.
Other problems, including the simulation of quantum physical processes from
chemistry and solid-state physics, the approximation of certain Jones
polynomials, and the quantum algorithm for linear systems of equations have
quantum algorithms appearing to give super-polynomial speedups and
are BQP-complete. Because these problems are BQP-complete, an equally fast
classical algorithm for them would imply that no quantum algorithm gives a
super-polynomial speedup, which is believed to be unlikely.
Some quantum algorithms, like Grover's algorithm and amplitude amplification,
give polynomial speedups over corresponding classical algorithms. Though
these algorithms give a comparably modest quadratic speedup, they are widely
applicable and thus give speedups for a wide range of problems.[21]
Simulation of quantum systems

Main article: Quantum simulation


Since chemistry and nanotechnology rely on understanding quantum systems,
and such systems are impossible to simulate in an efficient manner
classically, quantum simulation may be an important application of quantum
computing.[67] Quantum simulation could also be used to simulate the behavior
of atoms and particles at unusual conditions such as the reactions inside
a collider.[68] In June 2023, IBM computer scientists reported that a quantum
computer produced better results for a physics problem than a conventional
supercomputer.
About 2% of the annual global energy output is used for nitrogen fixation to
produce ammonia for the Haber process in the agricultural fertilizer industry
(even though naturally occurring organisms also produce ammonia). Quantum
simulations might be used to understand this process and increase the energy
efficiency of production.[71] It is expected that an early use of quantum
computing will be modeling that improves the efficiency of the Haber–Bosch
process[72] by the mid-2020s,[73] although some have predicted it will take
longer.
Post-quantum cryptography
A notable application of quantum computation is for attacks on cryptographic
systems that are currently in use. Integer factorization, which underpins the
security of public key cryptographic systems, is believed to be computationally
infeasible with an ordinary computer for large integers if they are the product
of few prime numbers (e.g., products of two 300-digit primes).[75] By
comparison, a quantum computer could solve this problem exponentially faster
using Shor's algorithm to find its factors.[76] This ability would allow a quantum
computer to break many of the cryptographic systems in use today, in the
sense that there would be a polynomial time (in the number of digits of the
integer) algorithm for solving the problem. In particular, most of the
popular public key ciphers are based on the difficulty of factoring integers or
the discrete logarithm problem, both of which can be solved by Shor's
algorithm. In particular, the RSA, Diffie–Hellman, and elliptic curve Diffie–
Hellman algorithms could be broken. These are used to protect secure Web
pages, encrypted email, and many other types of data. Breaking these would
have significant ramifications for electronic privacy and security.
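Shor's algorithm reduces factoring to finding the period of a modulo N; only the period-finding step requires a quantum computer. The classical reduction can be sketched as follows, with the period found by brute force for a toy modulus (function names are illustrative):

```python
from math import gcd

def find_period(a, N):
    """Order of a modulo N, found by brute force. This is the step a
    quantum computer performs efficiently via the quantum Fourier
    transform; everything else in Shor's algorithm is classical."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    """Classical reduction from factoring N to period finding."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g          # a shares a factor with N: done already
    r = find_period(a, N)
    if r % 2 == 1:
        return None               # odd period: retry with a different a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None               # trivial square root: retry
    p = gcd(x - 1, N)
    return p, N // p

print(shor_classical(15, 7))      # (3, 5), since 7 has period 4 mod 15
```

For cryptographically sized N the brute-force loop in find_period is hopeless; the exponential speedup of Shor's algorithm comes entirely from replacing that loop with quantum period finding.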
Identifying cryptographic systems that may be secure against quantum
algorithms is an actively researched topic under the field of post-quantum
cryptography. Some public-key algorithms are based on problems other than
the integer factorization and discrete logarithm problems to which Shor's
algorithm applies, like the McEliece cryptosystem based on a problem in coding
theory.
Lattice-based cryptosystems are also not known to be broken by quantum
computers, and finding a polynomial time algorithm for solving
the dihedral hidden subgroup problem, which would break many lattice based
cryptosystems, is a well-studied open problem.[80] It has been proven that
applying Grover's algorithm to break a symmetric (secret key) algorithm by
brute force requires time equal to roughly 2^(n/2) invocations of the underlying
cryptographic algorithm, compared with roughly 2^n in the classical case,
[81] meaning that symmetric key lengths are effectively halved: AES-256 would
have the same security against an attack using Grover's algorithm that AES-
128 has against classical brute-force search (see Key size).
Search problems
The most well-known example of a problem that allows for a polynomial
quantum speedup is unstructured search, which involves finding a marked item
out of a list of N items in a database. This can be solved by Grover's
algorithm using O(√N) queries to the database, quadratically fewer than
the Ω(N) queries required for classical algorithms. In this case, the advantage
is not only provable but also optimal: it has been shown that Grover's algorithm
gives the maximal possible probability of finding the desired element for any
number of oracle lookups. Many examples of provable quantum speedups for
query problems are based on Grover's algorithm, including Brassard, Høyer,
and Tapp's algorithm for finding collisions in two-to-one functions, [82] and Farhi,
Goldstone, and Gutmann's algorithm for evaluating NAND trees. [83]
Problems that can be efficiently addressed with Grover's algorithm have the
following properties:
1. There is no searchable structure in the collection of possible answers,
2. The number of possible answers to check is the same as the number of
inputs to the algorithm, and
3. There exists a Boolean function that evaluates each input and
determines whether it is the correct answer.
For problems with all these properties, the running time of Grover's algorithm
on a quantum computer scales as the square root of the number of inputs (or
elements in the database), as opposed to the linear scaling of classical
algorithms. A general class of problems to which Grover's algorithm can be
applied[86] is a Boolean satisfiability problem, where the database through
which the algorithm iterates is that of all possible answers. An example and
possible application of this is a password cracker that attempts to guess a
password. Breaking symmetric ciphers with this algorithm is of interest to
government agencies.[87]
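Grover's algorithm can be simulated directly on a state vector for small N. The illustrative NumPy sketch below shows the marked item's probability being amplified to about 0.945 after two iterations for N = 8:

```python
import numpy as np

def grover(n_qubits, marked):
    """Simulate Grover's algorithm on a database of N = 2**n_qubits items."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))           # uniform superposition

    oracle = np.eye(N)
    oracle[marked, marked] = -1                  # sign-flip the marked item

    s = np.full(N, 1 / np.sqrt(N))
    diffusion = 2 * np.outer(s, s) - np.eye(N)   # inversion about the mean

    # About (pi/4) * sqrt(N) iterations maximize the success probability.
    for _ in range(int(round(np.pi / 4 * np.sqrt(N)))):
        state = diffusion @ (oracle @ state)
    return np.abs(state) ** 2                    # measurement probabilities

probs = grover(n_qubits=3, marked=5)
print(int(np.argmax(probs)))          # 5: the marked item dominates
print(round(float(probs[5]), 3))      # 0.945 after 2 iterations
```

Note that iterating further would overshoot: the amplitude of the marked item rotates past its maximum, which is why the iteration count is fixed near (π/4)√N rather than run until convergence.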
Quantum annealing
Quantum annealing relies on the adiabatic theorem to undertake calculations.
A system is placed in the ground state for a simple Hamiltonian, which slowly
evolves to a more complicated Hamiltonian whose ground state represents the
solution to the problem in question. The adiabatic theorem states that if the
evolution is slow enough the system will stay in its ground state at all times
through the process. Adiabatic optimization may be helpful for
solving computational biology problems.[88]
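A toy numerical sketch of this process is below (illustrative only: a four-state diagonal "cost" Hamiltonian stands in for the problem, the uniform-coupling H0 has the equal superposition as its ground state, and the schedule s runs from 0 to 1 over a long total time T):

```python
import numpy as np
from scipy.linalg import expm

# Cost Hamiltonian H1: diagonal, ground state = lowest-cost configuration.
costs = np.array([3.0, 1.0, 4.0, 0.5])   # minimum cost at index 3
H1 = np.diag(costs)

# Initial Hamiltonian H0: its ground state is the easy-to-prepare
# equal superposition over all four configurations.
H0 = -np.ones((4, 4))

state = np.full(4, 0.5, dtype=complex)   # ground state of H0
T, steps = 50.0, 500                     # total time; schedule resolution
dt = T / steps
for k in range(steps):
    s = (k + 0.5) / steps                # schedule s: 0 -> 1
    H = (1 - s) * H0 + s * H1            # interpolated Hamiltonian
    state = expm(-1j * H * dt) @ state   # Schrodinger time evolution

# If the evolution is slow enough, the system stays in the instantaneous
# ground state, so measuring yields the minimum-cost index.
probs = np.abs(state) ** 2
print(int(np.argmax(probs)))             # 3
```

Shrinking T makes the sweep non-adiabatic: the system gets excited out of the ground state and the final measurement no longer reliably returns the minimum, which is the trade-off the adiabatic theorem quantifies.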
Machine learning
Since quantum computers can produce outputs that classical computers cannot
produce efficiently, and since quantum computation is fundamentally linear
algebraic, some express hope in developing quantum algorithms that can
speed up machine learning tasks.
For example, the HHL Algorithm, named after its discoverers Harrow, Hassidim,
and Lloyd, is believed to provide speedup over classical counterparts. Some
research groups have recently explored the use of quantum annealing
hardware for training Boltzmann machines and deep neural networks.
Deep generative chemistry models emerge as powerful tools to expedite drug
discovery. However, the immense size and complexity of the structural space
of all possible drug-like molecules pose significant obstacles, which could be
overcome in the future by quantum computers. Quantum computers are
naturally good for solving complex quantum many-body problems [22] and thus
may be instrumental in applications involving quantum chemistry. Therefore,
one can expect that quantum-enhanced generative models [94] including
quantum GANs[95] may eventually be developed into ultimate generative
chemistry algorithms.

Engineering

[Figure: A wafer of adiabatic quantum computer chips.]


As of 2023, classical computers outperform quantum computers for all real-
world applications. While current quantum computers may speed up solutions
to particular mathematical problems, they give no computational advantage for
practical tasks. Scientists and engineers are exploring multiple technologies for
quantum computing hardware and hope to develop scalable quantum
architectures, but serious obstacles remain.
Challenges
There are a number of technical challenges in building a large-scale quantum
computer.[98] Physicist David DiVincenzo has listed these requirements for a
practical quantum computer:[99]
● Physically scalable to increase the number of qubits
● Qubits that can be initialized to arbitrary values
● Quantum gates that are faster than decoherence time
● Universal gate set
● Qubits that can be read easily.
Sourcing parts for quantum computers is also very difficult. Superconducting
quantum computers, like those constructed by Google and IBM, need helium-3,
a nuclear research byproduct, and special superconducting cables made only
by the Japanese company Coax Co.[100]
The control of multi-qubit systems requires the generation and coordination of
a large number of electrical signals with tight and deterministic timing
resolution. This has led to the development of quantum controllers that enable
interfacing with the qubits. Scaling these systems to support a growing number
of qubits is an additional challenge.[101]
Decoherence
One of the greatest challenges involved with constructing quantum computers
is controlling or removing quantum decoherence. This usually means isolating
the system from its environment as interactions with the external world cause
the system to decohere. However, other sources of decoherence also exist.
Examples include the quantum gates, and the lattice vibrations and
background thermonuclear spin of the physical system used to implement the
qubits. Decoherence is irreversible, as it is effectively non-unitary, and is
usually something that should be highly controlled, if not avoided.
Decoherence times for candidate systems, in particular the transverse
relaxation time T2 (for NMR and MRI technology, also called the dephasing
time), typically range between nanoseconds and seconds at low temperature.
[102] Currently, some quantum computers require their qubits to be cooled to
20 millikelvin (usually using a dilution refrigerator[103]) in order to prevent
significant decoherence.[104] A 2020 study argues that ionizing radiation such
as cosmic rays can nevertheless cause certain systems to decohere within
milliseconds.
As a result, time-consuming tasks may render some quantum algorithms
inoperable, as attempting to maintain the state of qubits for a long enough
duration will eventually corrupt the superpositions.
These issues are more difficult for optical approaches, as the timescales are
orders of magnitude shorter; an often-cited approach to overcoming them is
optical pulse shaping. Error rates are typically proportional to the ratio of
operating time to decoherence time, so any operation must be completed
much more quickly than the decoherence time.
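The ratio rule of thumb above can be sketched with a quick calculation. The gate time and T2 value below are illustrative placeholders, not figures from the text:

```python
# Rule of thumb: per-gate error rate ~ gate operation time / decoherence time.
# Both numbers below are hypothetical, chosen only to illustrate the ratio.
t_gate = 20e-9   # gate operation time: 20 nanoseconds
t2 = 100e-6      # transverse relaxation (dephasing) time T2: 100 microseconds

error_rate = t_gate / t2
print(f"Estimated per-gate error rate: {error_rate:.1e}")  # 2.0e-04
```

With these numbers each gate is five thousand times faster than the decoherence time, giving an error rate of roughly 2 × 10⁻⁴ per operation.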
As described by the threshold theorem, if the error rate is small enough, it is
thought to be possible to use quantum error correction to suppress errors and
decoherence. This allows the total calculation time to be longer than the
decoherence time if the error correction scheme can correct errors faster than
decoherence introduces them. An often-cited figure for the required error rate
in each gate for fault-tolerant computation is 10⁻³, assuming the noise is
depolarizing.
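A common heuristic for how error correction suppresses logical errors below threshold (not stated in the text; the scaling form is a standard surface-code approximation, and the constants A and p_th below are illustrative) can be sketched as:

```python
# Heuristic surface-code scaling: once the physical error rate p is below
# the threshold p_th, the logical error rate falls exponentially with the
# code distance d. The prefactor A and threshold p_th are illustrative.
def logical_error_rate(p, d, p_th=1e-2, A=0.1):
    return A * (p / p_th) ** ((d + 1) / 2)

p = 1e-3  # physical error rate at the often-cited 10^-3 target
for d in (3, 7, 11):
    print(f"distance {d}: logical error rate ~ {logical_error_rate(p, d):.0e}")
```

At the 10⁻³ target, each increase of the distance by two suppresses the logical error rate by another factor of p/p_th, which is why error correction can extend computations beyond the bare decoherence time at the cost of many more physical qubits.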
Meeting this scalability condition is possible for a wide range of systems.
However, the use of error correction brings with it the cost of a greatly
increased number of required qubits. The number required to factor integers
using Shor's algorithm is still polynomial, and thought to be between L and L²,
where L is the number of digits in the number to be factored; error correction
algorithms would inflate this figure by an additional factor of L. For a 1000-bit
number, this implies a need for about 10⁴ bits without error correction.
[107] With error correction, the figure would rise to about 10⁷ bits. Computation
time is about L² or about 10⁷ steps and at 1 MHz, about 10 seconds. However,
the encoding and error-correction overheads increase the size of a real fault-
tolerant quantum computer by several orders of magnitude. Careful
estimates show that at least 3 million physical qubits would be required to factor
a 2,048-bit integer in 5 months on a fully error-corrected trapped-ion quantum
computer. In terms of the number of physical qubits, to date, this remains the
lowest estimate[110] for a practically useful integer factorization problem of size
1,024 bits or larger.
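The orders of magnitude quoted above can be reproduced with simple arithmetic. This is only a sketch of the figures in the text (the factor-of-10 prefactors are chosen to match the quoted estimates, not derived independently):

```python
# Back-of-envelope resource estimate for factoring an L-bit number with
# Shor's algorithm, matching the orders of magnitude quoted in the text.
L = 1000                            # bits in the integer to be factored

qubits_no_ec = 10 * L               # ~10^4 qubits without error correction
qubits_with_ec = qubits_no_ec * L   # error correction adds a factor of ~L: ~10^7

steps = 10 * L**2                   # ~10^7 gate operations
clock_hz = 1e6                      # assumed 1 MHz gate rate
runtime_s = steps / clock_hz        # ~10 seconds

print(qubits_no_ec, qubits_with_ec, runtime_s)  # 10000 10000000 10.0
```

The point of the exercise is the gap between the two qubit counts: the computation itself is fast, but the factor-of-L error-correction overhead dominates the machine size.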
Another approach to the stability-decoherence problem is to create
a topological quantum computer with anyons, quasi-particles used as threads,
and relying on braid theory to form stable logic gates.
Quantum supremacy
Physicist John Preskill coined the term quantum supremacy to describe the
engineering feat of demonstrating that a programmable quantum device can
solve a problem beyond the capabilities of state-of-the-art classical
computers. The problem need not be useful, so some view the quantum
supremacy test only as a potential future benchmark.
In October 2019, Google AI Quantum, with the help of NASA, became the first
to claim to have achieved quantum supremacy by performing calculations on
the Sycamore quantum computer more than 3,000,000 times faster than they
could be done on Summit, generally considered the world's fastest
computer. This claim has been subsequently challenged: IBM has stated that
Summit can perform samples much faster than claimed, and researchers have
since developed better algorithms for the sampling problem used to claim
quantum supremacy, giving substantial reductions to the gap between
Sycamore and classical supercomputers and even beating it.
In December 2020, a group at USTC implemented a type of Boson sampling on
76 photons with a photonic quantum computer, Jiuzhang, to demonstrate
quantum supremacy. The authors claim that a classical contemporary
supercomputer would require a computational time of 600 million years to
generate the number of samples their quantum processor can generate in 20
seconds.
Claims of quantum supremacy have generated hype around quantum
computing,[131] but they are based on contrived benchmark tasks that do not
directly imply useful real-world applications.
In January 2024, a study published in Physical Review Letters provided direct
verification of quantum supremacy experiments by computing exact
amplitudes for experimentally generated bitstrings using a new-generation
Sunway supercomputer, demonstrating a significant leap in simulation
capability built on a multiple-amplitude tensor network contraction algorithm.
This development underscores the evolving landscape of quantum computing,
highlighting both the progress and the complexities involved in validating
quantum supremacy claims.
Skepticism
Despite high hopes for quantum computing, significant progress in hardware,
and optimism about future applications, a 2023 Nature spotlight article
summarised current quantum computers as being "For now, [good for]
absolutely nothing".[96] The article elaborated that quantum computers are yet
to be more useful or efficient than conventional computers in any case, though
it also argued that in the long term such computers are likely to be useful. A
2023 Communications of the ACM article[97] found that current quantum
computing algorithms are "insufficient for practical quantum advantage without
significant improvements across the software/hardware stack". It argues that
the most promising candidates for achieving speedup with quantum computers
are "small-data problems", for example in chemistry and materials science.
However, the article also concludes that a large range of the potential
applications it considered, such as machine learning, "will not achieve quantum
advantage with current quantum algorithms in the foreseeable future", and it
identified I/O constraints that make speedup unlikely for "big data problems,
unstructured linear systems, and database search based on Grover's
algorithm".
This state of affairs can be traced to several current and long-term
considerations.
● Conventional computer hardware and algorithms are not only optimized
for practical tasks, but are also still improving rapidly, particularly with GPU accelerators.
● Current quantum computing hardware generates only a limited amount
of entanglement before getting overwhelmed by noise.
● Quantum algorithms provide speedup over conventional algorithms only
for some tasks, and matching these tasks with practical applications has proved
challenging. Some promising tasks and applications require resources far
beyond those available today. In particular, processing large amounts of non-
quantum data is a challenge for quantum computers.
● Some promising algorithms have been "dequantized", i.e., their non-
quantum analogues with similar complexity have been found.
● If quantum error correction is used to scale quantum computers to
practical applications, its overhead may undermine speedup offered by many
quantum algorithms.
● Complexity analysis of algorithms sometimes makes abstract
assumptions that do not hold in applications. For example, input data may not
already be available encoded in quantum states, and "oracle functions" used
in Grover's algorithm often have internal structure that can be exploited for
faster algorithms.
In particular, building computers with large numbers of qubits may be futile if
those qubits are not connected well enough and cannot maintain a sufficiently
high degree of entanglement for a long time. When trying to outperform
conventional computers, quantum computing researchers often look for new
tasks that can be solved on quantum computers, but this leaves the possibility
that efficient non-quantum techniques will be developed in response, as seen
with quantum supremacy demonstrations. Therefore, it is desirable to prove
lower bounds on the complexity of the best possible non-quantum algorithms
(which may be unknown) and show that some quantum algorithms
asymptotically improve upon those bounds.
Conclusion
Some researchers have expressed skepticism that scalable quantum
computers could ever be built, typically because of the issue of maintaining
coherence at large scales, but also for other reasons.
Bill Unruh doubted the practicality of quantum computers in a paper published
in 1994.[136] Paul Davies argued that a 400-qubit computer would even come
into conflict with the cosmological information bound implied by
the holographic principle.[137] Skeptics like Gil Kalai doubt that quantum
supremacy will ever be achieved. Physicist Mikhail Dyakonov has expressed
skepticism of quantum computing as follows: "So the number of continuous
parameters describing the state of such a useful quantum computer at any
given moment must be... about 10³⁰⁰... Could we ever learn to control the
more than 10³⁰⁰ continuously variable parameters defining the quantum state
of such a system? My answer is simple. No, never."
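Dyakonov's figure comes from counting amplitudes: an n-qubit pure state is described by 2ⁿ complex amplitudes, so a machine of roughly a thousand useful qubits already implies on the order of 10³⁰⁰ continuous parameters. A quick check of the arithmetic (the qubit count is illustrative):

```python
import math

n_qubits = 1000
# An n-qubit pure state is described by 2**n complex amplitudes, so the
# number of continuous parameters grows as 2**n. Taking log10 gives the
# order of magnitude without forming the huge number itself.
exponent = n_qubits * math.log10(2)   # log10 of 2**1000
print(f"~10^{exponent:.0f} continuous parameters")  # ~10^301
```

Whether such a state can be controlled is the point of dispute; the count itself is uncontroversial.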