
9/14/24, 12:32 AM

What is Quantum Computing? How Does it Work and Examples

By Alexander S. Gillis, Technical Writer and Editor

What is quantum computing?


Quantum computing is an area of computer science focused on the development of computers based on the principles of quantum theory.
Quantum computing uses the unique behaviors of quantum physics to solve problems that are too complex for classical computing.

Quantum computers work by taking advantage of quantum mechanical properties like superposition and quantum interference. They use
special hardware and algorithms that can take advantage of these quantum effects.

The development of quantum computers marks a leap forward in computing capability, with the potential for massive performance gains in
specific use cases. For example, quantum computing offers advantages for tasks such as integer factorization and simulations and has the
potential to impact industries such as pharmaceuticals, healthcare, manufacturing, cybersecurity and finance.

Quantum computing is a rapidly emerging technology that could prove disruptive once it reaches maturity. Quantum computing companies are appearing all over the world, but experts estimate that it could take several years before quantum computing delivers practical benefits.

Numerous companies, national labs and government agencies worldwide are developing quantum computing technology. These include U.S.-based tech giants, such as Amazon, Google, Hewlett Packard Enterprise, Hitachi, IBM, Intel and Microsoft, as well as the Massachusetts Institute of Technology, Oxford University and Los Alamos National Laboratory. Other countries -- including the U.K., Australia, Canada, China, Germany, Israel, Japan and Russia -- have made significant investments in quantum computing technologies. In 2020, the Indian government introduced its National Mission on Quantum Technologies and Applications. That same year, Germany announced a €2 billion investment in quantum technologies. Many countries, including the U.S., continue to push for innovation in the quantum computing space.

In 2011, D-Wave Systems released the first commercially available quantum computer. In 2019, IBM released the Quantum System One, and in December 2023 it introduced the Quantum System Two. In October 2023, Atom Computing was the first to exceed 1,000 qubits, with IBM following closely behind with its Condor quantum processor. Qubit count is a rough indicator of a quantum computer's capacity, as more qubits generally mean greater computational potential.



How does quantum computing work?


Quantum theory explains the nature and behavior of energy and matter at the quantum -- that is, atomic and subatomic -- level. Quantum computing takes advantage of how quantum matter works: Where classical computing uses binary bits that are either 1 or 0, quantum computing uses bits that can be 1, 0 or both simultaneously. A quantum computer gains much of its processing power because its bits can be in multiple states at the same time.

Quantum computers are composed of an area that houses the qubits, a mechanism for transferring signals to those qubits, and a classical computer that runs a program and sends instructions.

A qubit, or quantum bit, is equivalent to a bit in classical computing. Just as a bit is the basic unit of information in a classical computer, a
qubit is the basic unit of information in a quantum computer. Quantum computers use particles such as electrons or photons that are given
either a charge or polarization to act as a 0, 1 or both a 0 and 1. The two most relevant aspects of quantum physics are the principles of
superposition and entanglement.

Superposition refers to placing the quantum information a qubit holds into a state of all possible configurations, while entanglement refers to one qubit's state directly influencing another's.

Quantum computers tend to be resource-intensive, requiring a significant amount of energy and cooling to run properly. Much of a quantum computer's hardware consists of cooling systems that keep a superconducting processor at a specific super-cooled temperature. A dilution refrigerator, for example, uses a mixture of helium isotopes as a coolant to keep the temperature in a millikelvin (mK) range. IBM has used this approach to keep its quantum systems at about 25 mK, which is comparable to -459 degrees Fahrenheit. At this super-low temperature, electrons can flow through superconductors without resistance, forming electron pairs.
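As a quick sanity check on the temperature figure above, a short conversion (a simple arithmetic sketch, not from the article) shows that 25 mK is indeed about -459 degrees Fahrenheit:

```python
# Convert a temperature in kelvin to degrees Fahrenheit.
# Absolute zero (0 K) is -459.67 degrees F, so 25 mK sits just above it.
def kelvin_to_fahrenheit(kelvin):
    return kelvin * 9 / 5 - 459.67

print(kelvin_to_fahrenheit(0.025))  # about -459.6
```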

Features of quantum computing


Quantum computers are designed to perform complex calculations with huge amounts of data using the following features:

Superposition. Superposition refers to qubits that are in all configurations at once. Think of a qubit as an electron in a magnetic field. The
electron's spin might be either in alignment with the field, known as a spin-up state, or opposite to the field, known as a spin-down state.
Changing the electron's spin from one state to another is achieved by using a pulse of energy, such as from a laser. If only half a unit of
laser energy is used, and the particle is isolated from all external influences, it enters a superposition of states. The particle behaves as if it
were in both states simultaneously.

Because qubits can be in a superposition of 0 and 1, the number of states a quantum computer can represent is 2^n, where n is the number of qubits used. A quantum computer composed of 500 qubits could, in principle, explore 2^500 states in a single step.
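The superposition idea above can be sketched with a few lines of NumPy. This is a minimal statevector illustration, not how real quantum hardware is programmed: a qubit is modeled as a length-2 complex vector, a Hadamard gate places it in an equal superposition, and the state space for n qubits grows as 2^n.

```python
import numpy as np

# A qubit's state is a length-2 complex vector; |0> is [1, 0].
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
superposed = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(superposed) ** 2
print(probs)  # [0.5 0.5] -- equal chance of measuring 0 or 1

# The state space grows exponentially: n qubits need 2**n amplitudes.
n = 10
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0  # all qubits start in |0>
print(len(state))  # 1024
```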

Entanglement. Entangled particles are pairs of qubits that exist in a state where changing one qubit directly changes the other. Knowing the spin state of one entangled particle -- up or down -- reveals that the spin of the other points in the opposite direction. In addition, because of superposition, a particle has no definite spin direction before being measured. The spin state of the measured particle is determined at the moment of measurement, and the entangled partner simultaneously assumes the opposite spin direction.

Quantum entanglement means qubits separated by large distances remain correlated with each other. No matter how great the distance between the entangled particles, they stay entangled as long as they're isolated from their surroundings.

Together, quantum superposition and entanglement create enormously enhanced computing power. Each added qubit doubles the size of the state space, so capacity expands exponentially.
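The correlated-measurement behavior described above can be reproduced in a small statevector sketch. This hypothetical NumPy example builds a Bell state -- the standard maximally entangled two-qubit state -- by applying a Hadamard gate and then a CNOT; afterward, only the outcomes 00 and 11 have nonzero probability, so the two qubits' measurement results always agree.

```python
import numpy as np

# Single-qubit gates and the two-qubit CNOT (control = first qubit).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I2 = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, apply H to the first qubit, then CNOT.
state = np.zeros(4, dtype=complex)
state[0] = 1.0
state = CNOT @ np.kron(H, I2) @ state

# Probabilities over the basis states |00>, |01>, |10>, |11>:
probs = np.abs(state) ** 2
print(probs)  # only the |00> and |11> entries are 0.5; |01> and |10> never occur
```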

What is quantum theory?


Development of quantum theory began in 1900 with a presentation by German physicist Max Planck to the German Physical Society. Planck introduced the idea that energy and matter exist in individual units. Further developments by several scientists over the following 30 years led to the modern understanding of quantum theory.

The elements of quantum theory include the following:

Energy, like matter, consists of discrete units -- as opposed to a continuous wave.

Elementary particles of energy and matter, depending on the conditions, might behave like particles or waves.

The movement of elementary particles is inherently random and, thus, unpredictable.

The simultaneous measurement of two complementary values -- such as the position and momentum of a particle -- is inherently limited. The more precisely one value is measured, the less precisely the other can be known.
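The last point is Heisenberg's uncertainty principle, which can be written as:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

where \(\Delta x\) and \(\Delta p\) are the uncertainties in a particle's position and momentum, and \(\hbar\) is the reduced Planck constant. Shrinking one uncertainty necessarily grows the other.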

Types of quantum technology


Quantum computing is not the only quantum-enabled technology. Although all are still in their infancy, several others could also become viable. These technologies include the following:

Quantum cryptography. This method of encryption uses the naturally occurring properties of quantum mechanics to secure and transmit data. Quantum cryptography differs from traditional cryptographic systems in that it relies on physics, rather than mathematics, as the key aspect of its security model.

Quantum processing. Quantum processing is currently achievable through several different technologies. This includes gate-based ion
trap processors, gate-based superconducting processors, photonic processors, neutral atom processors, Rydberg atom processors and
quantum annealers. These technologies all use properties of quantum physics but differ in makeup and implementation.

Quantum sensing. This is a process of collecting data at the atomic level using sensor technology that can detect changes in motion and in electric and magnetic fields. Quantum sensing has been used in magnetic resonance imaging, or MRI, where it provides faster results and improved resolution.

Uses and benefits of quantum computing


Quantum computing has the potential to offer the following benefits:

Speed. Quantum computers can be dramatically faster than classical computers for certain problems. For example, quantum computing has the potential to speed up financial portfolio management models, such as the Monte Carlo model for gauging the probability of outcomes and their associated risks.

Ability to solve complex processes. Quantum computers are designed to perform multiple complex calculations simultaneously. This
can be particularly useful for factorizations, which could help develop decryption technologies.

Simulations. Quantum computers can run complex simulations and can model more intricate systems than classical computers can. For example, this could be helpful for molecular simulations, which are important in prescription drug development.

Optimization. With quantum computing's ability to process huge amounts of complex data, it has the potential to transform artificial
intelligence (AI) and machine learning (ML).

How companies use quantum computers


Quantum computers have the potential to disrupt multiple areas where classical computers are in use today. For example, organizations
can use quantum computers for the following:

ML and AI. Quantum computers can aid the AI and ML process by improving the handling of complex, high-dimensional data, processing large data sets, and improving optimization, feature extraction and data representation.

Simulations. Quantum computers offer computational efficiency for simulating complex systems. For example, the technology can be used in chemistry and biomedical fields to simulate the behavior of molecules and the interactions between drugs.

Optimization of business processes. Quantum computers can be used to improve research and development, supply-chain
optimization and production processes.

Cryptography. Quantum cryptography uses the principles of quantum mechanics to secure communication systems. A popular
application of this process is quantum key distribution.

Prime factorization to break traditional encryption. Organizations today use traditional computers to encrypt their data. They use large, complex prime numbers to encrypt it, and factoring the resulting products is infeasible for classical computers. Quantum computers could factor such extremely large numbers efficiently, meaning they could effectively break current forms of encryption.
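The threat above rests on Shor's algorithm, which reduces factoring a number N to finding the "order" of a random base a modulo N -- the smallest r with a^r mod N = 1. A quantum computer finds r efficiently; the sketch below (an illustrative classical stand-in, with the function name and brute-force order search being this sketch's own choices) finds it by exhaustive search, which is only feasible for tiny N but shows how a factor falls out once r is known.

```python
import math

def factor_via_order_finding(N, a):
    """Classical toy version of Shor's reduction: factor N using base a."""
    if math.gcd(a, N) != 1:
        return math.gcd(a, N)      # lucky case: a already shares a factor with N
    r = 1
    while pow(a, r, N) != 1:       # brute-force order finding; a quantum
        r += 1                     # computer does this step efficiently
    if r % 2 == 0:
        x = pow(a, r // 2, N)
        # gcd(x - 1, N) or gcd(x + 1, N) is often a nontrivial factor of N.
        for candidate in (math.gcd(x - 1, N), math.gcd(x + 1, N)):
            if 1 < candidate < N:
                return candidate
    return None                    # this base failed; retry with another a

print(factor_via_order_finding(15, 7))  # 3 (the order of 7 mod 15 is 4)
```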

Limitations of quantum computing


Although the benefits of quantum computing are promising, there are still huge obstacles to overcome, including the following:

Interference. The slightest disturbance in a quantum system can cause a quantum computation to collapse -- a process known as
decoherence. A quantum computer must be totally isolated from all external interference during the computation phase. Some success
has been achieved with the use of qubits in intense magnetic fields.

Error correction. Qubits aren't digital bits of data and can't use conventional error correction. Error correction is critical in quantum
computing, where even a single error in a calculation can cause the validity of the entire computation to collapse. There has been
considerable progress in this area, however, with an error correction algorithm developed that uses nine qubits -- one computational and
eight correctional. A system from IBM can make do with a total of five qubits -- one computational and four correctional.

Output observance. Retrieving output data after a quantum calculation is complete risks corrupting the data. Developments such as
database search algorithms that rely on the special wave shape of the probability curve in quantum computers can avoid this issue. This
ensures that once all calculations are performed, the act of measurement sees the quantum state decohere into the correct answer.

There are other problems to overcome as well, such as how to handle security and quantum cryptography. Long-term quantum information storage has also been a problem in the past. However, recent breakthroughs have made some forms of quantum computing practical.

Can you emulate quantum computers?


Quantum computers can be emulated, but emulation is slow and impractical for typical quantum computing use cases. The vector needed to simulate a set of qubits grows exponentially with the number of qubits, and each computational step requires multiplying that vector by an equally large matrix. Because of this, classical computers can only effectively simulate quantum computers up to a point. A quantum program with only a few qubits can be simulated, but anything above around 50 qubits would require a real quantum computer.
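The roughly 50-qubit ceiling mentioned above follows directly from the memory cost of holding a full statevector: n qubits need 2^n complex amplitudes. A back-of-the-envelope calculation (assuming 16 bytes per amplitude, the size of a double-precision complex number) makes the wall obvious:

```python
# Memory needed to hold a full n-qubit statevector:
# 2**n complex amplitudes at 16 bytes each (double-precision complex).
def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

print(statevector_bytes(30) / 2**30)  # 16.0 -> 16 GiB for just 30 qubits
print(statevector_bytes(50) / 2**50)  # 16.0 -> 16 PiB for 50 qubits, far
                                      # beyond any classical machine's RAM
```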

Quantum simulators do exist and serve a purpose, however. Quantum simulators are software programs that enable classical computers to
emulate and run quantum circuits as if they were on a quantum computer. These simulators serve a supporting role, as they can provide
faster feedback while developing quantum algorithms.

Because observing the process changes the outcome, typical quantum systems are hard to debug. Quantum simulators, however, let
developers observe a computation while it's being run, making the debugging process faster. Likewise, these systems are less expensive to
run.

Some examples of quantum simulators include qsims, or Quantum Simulation Software; the Quantum Computer Emulator, or QCE; and
jQuantum.

A comparison of classical and quantum computing


Classical computing relies on principles expressed by Boolean algebra, usually operating on a logic gate principle. Data must be processed
in an exclusive binary state at any point in time -- either 0 for off or 1 for on. These values are bits. The millions of transistors and capacitors
at the heart of computers can only be in one state at any point. There's also still a limit as to how quickly these devices can be made to
switch states.

By comparison, quantum computers operate with a two-mode logic gate -- XOR and a mode called QO1 -- which lets them change a 0 into a superposition of 0 and 1. In a quantum computer, particles such as electrons or photons can be used. Each particle is given a charge or polarization to act as a representation of 0, 1 or both. Each particle is referred to as a quantum bit, or qubit. The nature and behavior of these particles form the basis of quantum computing and quantum supremacy.

[Chart: There are numerous differences between classical and quantum computing.]

Cloud-based quantum computing


Quantum computers can cost millions of dollars, depending on the machine and the number of qubits it can handle. In 2024, Chinese
company SpinQ Technology offered a compact version of its SpinQ Gemini for $50,000. However, this system can only operate using 2
qubits.

Because of the largely unrealistic cost of quantum computers today, organizations can instead opt to access a cloud-based service that
provides quantum computer technology through a third party. Referred to as quantum as a service (QaaS), this model lets users access and
use quantum resources over the internet, eliminating the need for them to own a quantum computer.

QaaS services offer numerous benefits to the organizations that use them, such as enabling remote access to quantum hardware, cost
efficiency, scalability and integration with classical computing systems.

Examples of quantum computers


Although the idea of using a quantum computer can be exciting, it's unlikely that most organizations will build or buy one. Instead, they
might opt to use cloud-based services that enable remote access.

The following are examples of customer-oriented cloud-based quantum computing platforms. These QaaS vendors are listed in alphabetical
order; note that this list isn't exhaustive:

Amazon Braket. Amazon Braket is a fully managed AWS cloud service designed to provide quantum computer users with remote
access to a single development environment. It provides users access to multiple types of quantum computers through the Amazon
Braket Python software development kit (SDK) or other developer frameworks.

D-Wave Leap. Customers can use the Leap quantum cloud service to build applications for drug discovery, financial services, logistics,
supply chain management and manufacturing. D-Wave Leap gives users access to the company's Advantage quantum computer and
services.

Google Quantum AI. Google Quantum AI is the quantum computing research branch of Google AI. The initiative focuses on designing
quantum algorithms, hardware and software and providing open source tools.

IBM Quantum. IBM Quantum is IBM's quantum computing service offering that includes access to quantum processors, software tools
and educational resources. It also includes access to Qiskit, an open source quantum computing SDK.

Microsoft Azure Quantum. The Microsoft Azure Quantum cloud-based platform gives users access to quantum computing resources
such as quantum development kits and enables them to develop, test and run quantum algorithms.

Quantinuum Origin. Quantinuum Origin is a quantum computing platform developed by Honeywell Quantum Solutions and Cambridge
Quantum. This service is used for quantum cryptography.

Xanadu Cloud. Xanadu Cloud is a quantum computing platform that includes the hardware, software and applications to enable users
to run quantum algorithms and perform quantum simulations.

Like any emerging technology, quantum computing offers opportunities and risks. Learn how quantum computing compares to classical
computing.
This was last updated in June 2024
