Addis Ababa University Department of Law Freshman Course
DEPARTMENT OF LAW
GROUP ASSIGNMENT
CONTENT
INTRODUCTION
SUMMARY
REFERENCES
INTRODUCTION
A quantum computer is a processor whose function is based on the laws of quantum mechanics.
This rapidly evolving technology promises faster computing solutions to problems that only
supercomputers can currently solve, or that are considered unsolvable at the current state of the art.
Quantum computing uses qubits just as classical, or normal, computing uses bits. Qubit is
short for quantum bit. As the basis of quantum computing, qubits can, in contrast to
bits, which can only be a zero or a one, be a zero, a one, or both at the same time,
forming a so-called superposition.
Today, standard computers process information in bits, whereas quantum computers use qubits
and thus exploit superposition. This can lead to extremely fast processing, so that
calculations that are not possible with conventional computers become feasible. We will discuss
its advantages in different areas.
Quantum computing does not provide efficient solutions to all problems. Strong limitations
prevent it from doing everything classical computers can. We will also examine its limitations
and disadvantages in detail.
HISTORY OF QUANTUM COMPUTING
Quantum computing began in 1980, when physicist Paul Benioff proposed a quantum mechanical
model of the Turing machine. Richard Feynman and Yuri Manin later suggested that a quantum
computer had the potential to simulate things a classical computer could not feasibly do. In 1986
Feynman introduced an early version of quantum circuit notation, and in 1994 Peter Shor devised
a quantum algorithm for finding the prime factors of an integer. In 1998 the first 2-qubit quantum
computer was developed by Isaac Chuang, Neil Gershenfeld and Mark Kubinec. In recent years,
investment in quantum research has increased in both the public and private sectors. IBM became
the first company to offer cloud-based quantum computing access, in 2016. On October 23, 2019,
Google, together with NASA, announced that it had performed a computation that would be
infeasible on any classical computer.
The basic unit of classical computing is the bit. A bit can be represented by anything that can be
in one of two possible states; the standard example is an electrical switch that can be either on or
off. The corresponding unit of quantum computing is the quantum bit, or qubit.
The basic properties of quantum computing are superposition, entanglement and interference.
Superposition is like spinning a coin: you cannot tell whether it is heads or tails while it is
spinning. It is the ability of a quantum system to be in more than one state at the same
time until it is measured, and it gives a quantum computer a large effective memory,
because the more combinations a system can hold, the more information it can represent.
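The "more combinations, more memory" point can be made concrete with a minimal sketch in plain NumPy (not any real quantum library): an n-qubit register is described by 2**n complex amplitudes, one per basis state, so the state space grows exponentially with the number of qubits.

```python
import numpy as np

def zero_state(n_qubits):
    """Return the state vector |00...0> for an n-qubit register."""
    state = np.zeros(2 ** n_qubits, dtype=complex)
    state[0] = 1.0  # all amplitude on the |00...0> basis state
    return state

# The vector doubles in length with every extra qubit.
for n in (1, 2, 10):
    print(n, "qubits ->", len(zero_state(n)), "amplitudes")
```

Ten qubits already require 1,024 amplitudes to describe; fifty would require over a quadrillion, which is why even small quantum registers are hard to simulate classically.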
Entanglement is also known as an emergent property. When two particles, such as a
pair of photons or electrons, become entangled, they remain connected even when
separated by vast distances.
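As an illustrative sketch (again plain NumPy, not a quantum library), the simplest entangled state is the Bell state (|00> + |11>)/sqrt(2): measuring the two qubits always yields perfectly correlated outcomes, 00 or 11, never 01 or 10.

```python
import numpy as np

# Amplitudes over the four 2-qubit basis states |00>, |01>, |10>, |11>.
bell = np.zeros(4, dtype=complex)
bell[0b00] = 1 / np.sqrt(2)  # amplitude of |00>
bell[0b11] = 1 / np.sqrt(2)  # amplitude of |11>

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(bell) ** 2
print(probs)  # [0.5, 0, 0, 0.5]: the qubits always agree
```

Note that the correlation exists even though neither qubit individually has a definite value before measurement.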
There are two types of interference. Constructive interference occurs when amplitudes
add and the signal gets larger; destructive interference occurs when amplitudes cancel.
By exploiting interference, we can control quantum states, amplifying the kinds of
signals that point toward the right answer and cancelling the signals that lead toward
wrong answers.
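Both kinds of interference can be seen in a two-amplitude sketch with plain NumPy. The Hadamard gate puts |0> into an equal superposition; applying it a second time makes the |1> contributions cancel (destructive interference) while the |0> contributions add (constructive interference), returning the state to |0>.

```python
import numpy as np

# The Hadamard gate as a 2x2 matrix acting on (amp_of_|0>, amp_of_|1>).
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])  # the |0> state
superposed = H @ ket0        # equal superposition: amplitudes (1/sqrt2, 1/sqrt2)
back = H @ superposed        # interference: |1> terms cancel, |0> terms add

print(superposed)  # approximately [0.7071, 0.7071]
print(back)        # approximately [1, 0]
```

Quantum algorithms are arranged so that, like the second Hadamard here, the final gates steer amplitude onto the correct answer and away from the wrong ones.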
Qubits are quantum bits, and they have the special property that they can be zero and one at the
same time. A classical bit can only be like a light switch, either on or off, while a qubit can
effectively be both on and off until we measure it.
Superposition essentially allows a quantum computer to do things a classical computer cannot.
Because it is in many states at the same time, in simplified terms, it allows you to probe many
possibilities at once.
Example: if you are working in finance and you want to find which portfolio has the largest
profit, you need to take many different cases into account and then pick the best one, and
this is something that a quantum computer, because it essentially allows you to evaluate many
possibilities at the same time, is notably more suitable for.
The other way quantum computing differs from classical computing is that it can be much faster
and more powerful by exploiting entanglement. This extra power and speed can be a big
advantage, for example in medicine, where it could help treat patients more quickly, or in search,
where many items can effectively be examined at the same time.
ADVANTAGES OF QUANTUM COMPUTING
Quantum computing could play a great role in tackling climate change by reducing emissions in
some of the most challenging, emissions-intensive areas, improving climate-related decision
making, and providing long-term solutions for a better environment. Another remarkable claim
about quantum computing is that it could reduce power consumption by a factor of 10 to 1,000,
because quantum computers use quantum tunneling.
Quantum computing could also play a great role in the fields of security, finance, military
affairs and intelligence, drug design and discovery, aerospace design, utilities, polymer
design, machine learning, artificial intelligence and big data search.
One of the biggest hurdles for artificial intelligence today is training the machine to do
something useful. For example, we might have a model that can correctly identify a dog in a
photo. But the model will need to be trained with tens of thousands of images for it to
recognize the subtle differences between a beagle, a poodle, and a Great Dane. This process is
what AI researchers call “training”. They use it to teach AI algorithms to make predictions in
new situations. Quantum computing can make this training process faster and more accurate. It
will allow AI researchers to use more data than they have ever used before, and to process large
amounts of data as 1s and 0s and combinations thereof, which will enable quantum
computers to come to more accurate conclusions than traditional computers. In other words,
AI researchers can use larger datasets to train AI models to be more accurate and better at
decision-making.
The application of quantum computing to healthcare and life sciences is expected to transform
computational medical science. The US federal government has signaled that it is committed to a
quantum-enabled future, and is significantly supporting this emerging field. Exciting applications
of quantum technology include the diagnosis of diseases, design of pharmaceuticals, strategies
for personalized medical interventions, and analysis of medical images. Below, we’ll explore
some of these promising applications, establishing how today’s research in quantum algorithms
could translate into tomorrow’s expanded treatment options for doctors and better outcomes for
patients.
Financial institutions may be able to use quantum computing to design more effective
and efficient investment portfolios for retail and institutional clients. They could focus
on creating better trading simulators and improve fraud detection.
The healthcare industry could use quantum computing to develop new drugs and
genetically-targeted medical care. It could also power more advanced DNA research.
For stronger online security, quantum computing can help design better data encryption
and ways to use light signals to detect intruders in the system.
Quantum computing can be used to design more efficient, safer aircraft and traffic
planning systems.
Generally, the potential of quantum computing is vast, and its integration into artificial
intelligence will produce a technology far more powerful than anything we have today.
The new technology will enable machines to learn and self-evolve. It will make them
exponentially better at solving complex problems and developing self-learning algorithms that
will drive efficiency in sectors such as finance or healthcare.
Even though quantum mechanics was developed between 1900 and 1925, and remains the
cornerstone on which chemistry, condensed-matter physics, and technologies ranging from
computer chips to LED lighting rest, quantum computing has some limitations:
Because qubits can be any combination of one and zero, they cannot reject small noise
the way bits can.
Quantum computers must be protected from all external interference, because the
slightest disturbance in a qubit's environment can cause decoherence, or decay of the
quantum state.
A lack of qubits prevents quantum computers from living up to their potential for
impactful use.
Large data inputs cannot be loaded into quantum computers efficiently. Creating the
input in a quantum state would typically dominate the computation time and
greatly reduce the quantum advantage.
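The data-loading limitation can be sketched with plain NumPy. In the scheme usually called amplitude encoding, a classical vector is stored in the amplitudes of a quantum state; but just normalizing the data already touches every one of its entries, so state preparation alone scales with the input size and can swallow any quantum speedup.

```python
import numpy as np

def amplitude_encode(data):
    """Normalize a classical vector so it is a valid quantum state."""
    amps = np.asarray(data, dtype=float)
    # This pass over the data is O(len(data)) classical work before
    # the quantum computation even begins.
    return amps / np.linalg.norm(amps)

state = amplitude_encode([3.0, 4.0])  # fits in a single qubit's amplitudes
print(state)                          # [0.6, 0.8]
print(np.sum(state ** 2))             # 1.0: probabilities sum to one
```

The function name and scheme here are illustrative; the point is only that preparing 2**n amplitudes from classical data costs time proportional to the data itself.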
Quantum computers are also extremely expensive: the D-Wave One, a machine with 50 qubits,
cost 10,000,000 dollars, but China-based Shenzhen SpinQ Technology plans to sell a
5,000-dollar desktop quantum computer.
Even though they are so expensive and hard to build, the world's leading companies, such as
Google, IBM and Microsoft, are building them with large budgets. For example, Google is
spending billions of dollars to build its quantum computer by 2029, and it is currently the
leading company with a 72-qubit machine.
GOOGLE SYCAMORE.
IBM is also trying to build a 1,000-qubit quantum computer in 2023, but it has already built its
own machines, which are available to research organizations, universities and laboratories that
are part of its quantum network.
IBM Q QUANTUM SYSTEM.
SUMMARY
Quantum computing is an area of study focused on the development of computing technologies
centered on the principles of quantum theory, using qubits in place of bits. It is expected to be
the next generation of computing. Even though it is not widely applicable today because of its
limitations, it will change the world around us radically by revolutionizing industries such as
finance, pharmaceuticals, artificial intelligence and automotive over the next several years.
REFERENCES
o Course module
o Wikipedia and Investopedia