
Moore’s Law

The Change of Technology Over Two Years

BY HAMZA OMAR 19/09/2024


THE HISTORY OF COMPUTERS
The history of computers goes back over 200 years. First theorized by
mathematicians and entrepreneurs, mechanical calculating machines were designed
and built during the 19th century to solve increasingly complex number-crunching
challenges. By the early 20th century, advances in technology enabled ever more
complex computers, which grew larger and more powerful.

Today, computers are almost unrecognizable from designs of the 19th century, such
as Charles Babbage's Analytical Engine, or even from the huge computers of the
20th century that occupied whole rooms, such as the Electronic Numerical Integrator
and Computer (ENIAC). Gordon Moore accurately predicted the development of the
microprocessors at the heart of our digital devices many decades into the future.

The data from the graph is as follows:

Processor                 Year    Number of transistors
4004                      1971            2,250
8008                      1972            2,500
8080                      1974            5,000
8086                      1978           29,000
286                       1982          120,000
386™ processor            1985          275,000
486™ processor            1989        1,180,000
Pentium® processor        1993        3,100,000
Pentium® II processor     1997        7,500,000
Pentium® III processor    1999       24,000,000
Pentium® 4 processor      2000       42,000,000
Xeon® processor           2008    1,900,000,000

THE LONG, INTERESTING HISTORY OF COMPUTERS


The first computers, such as ENIAC, took up so much space that they filled large rooms. Yet their processing power
was tiny when compared with a modern computer.
The first big reduction in the size of electronic computers came when the large valves (vacuum tubes) that
processed data were replaced by much smaller transistors. The next significant reduction in size came in the late
1950s, when Jack Kilby and Robert Noyce invented the microchip.

Invented in the late 1940s, the transistor reduced the size of computer circuits and brought enormous
improvements in reliability. The inventors of the transistor – William Bradford Shockley, John Bardeen, and Walter
Houser Brattain – were awarded the Nobel Prize in Physics in 1956 for their discovery of the transistor effect.

Gordon Moore (1929–2023) was an American businessman, engineer, and the co-founder and chairman
emeritus of Intel Corporation. Moore's Law is named after him.

Moore's Law refers to Moore's realization, in 1975, that the number of transistors on a microchip doubles
approximately every two years; from this he predicted that the growth in computer processing power would be
exponential.
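To make the doubling concrete, here is a minimal Python sketch (an illustration added for this report, not part of the original observation) that turns the law into a formula, starting from the Intel 4004's 2,250 transistors in 1971, and compares the prediction with a few rows of the table above:

# Moore's Law as a formula: start from the Intel 4004 (2,250 transistors
# in 1971) and double the count every two years.

BASE_YEAR = 1971
BASE_COUNT = 2250

def predicted_transistors(year):
    """Transistor count implied by doubling every two years."""
    return BASE_COUNT * 2 ** ((year - BASE_YEAR) / 2)

# Compare the prediction with a few actual counts from the table above.
actual = {1978: 29_000, 1993: 3_100_000, 2008: 1_900_000_000}
for year, count in actual.items():
    print(f"{year}: predicted {predicted_transistors(year):,.0f}, actual {count:,}")

The prediction lands within the right order of magnitude for every processor in the table, which is what made the law useful as a planning tool.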

Transistors are small electronic components that act like switches. They are used to build the gates that process
data inside a computer: the more transistors there are, the faster data can be processed. Computers in the
early 1970s had thousands of transistors. Today's computers have billions of transistors but, thanks to
miniaturization, their central processing units (CPUs) are about the same size as the chips in computers of the
early 1970s.
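As a toy software illustration of the idea (a sketch in Python, not real hardware), each transistor can be modelled as an on/off switch, and a handful of switches form a gate. The sketch below builds AND and OR gates out of a single NAND primitive, one of the gate types built directly from transistors in real chips:

# A toy software model: each transistor behaves like an on/off switch,
# and a few switches together form a logic gate.

def nand(a, b):
    """NAND gate: output is off only when both inputs are on."""
    return not (a and b)

# Every other gate can be composed from NAND gates alone.
def and_gate(a, b):
    return nand(nand(a, b), nand(a, b))

def or_gate(a, b):
    return nand(nand(a, a), nand(b, b))

print(and_gate(True, True))    # True
print(or_gate(False, False))   # False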

The invention of the microchip by Jack Kilby and Robert Noyce in the late 1950s paved the way for dramatic
reductions in the size of the CPUs inside computers. The microchip miniaturized the transistor, making it possible
to fit large numbers of transistors into an exceedingly small space. It was one of the most important inventions
ever made, and it launched the information age.

Moore also pointed out that the cost of a personal computer was not increasing over time and that, therefore, the
cost of processing power was halving every two years.
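A rough worked example of that cost observation, under the purely illustrative assumption (not a figure from this report) of a fixed $1,000 computer price:

# If the price of a computer stays flat while the transistor count doubles
# every two years, the cost per transistor halves every two years.
# The $1,000 price is an illustrative assumption, not data from the report.

price = 1000.0       # assumed constant computer price, in dollars
transistors = 2250   # Intel 4004, 1971 (from the table above)

for year in range(1971, 1982, 2):
    print(f"{year}: ${price / transistors:.4f} per transistor")
    transistors *= 2    # Moore's Law doubling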

Moore did not set out to predict the future, yet his observation of the trend behind the law named after him
enabled people to predict the processing power and probable cost of computers years in advance. Time has shown
that his predictions were remarkably accurate.

We might expect Moore's Law to continue forever, but that is not physically possible. Fitting more transistors on a
microchip means making the transistors smaller and smaller. At the time of writing, the world's smallest-ever
working transistor is 1 nanometer long. Is it possible to make a smaller transistor, or has the minimum possible
size now been reached?

As microchips became more powerful, with more transistors built into them, it also became possible to make them
smaller. This meant that the computers that depended on them could also become smaller. Computers did indeed
shrink!

The end of Moore's Law? The chip industry has acknowledged that it is no longer economical to reduce the size of
silicon transistors further. Instead, microchips look set to change in different ways. The way microchips are
designed and integrated into computer circuits has not changed much over the years, so it should be possible to
improve processing speed by redesigning chips and changing the way they are integrated, without further
increasing the number of transistors on the processor.

The next generation of computers will not process data in the same way as our current digital devices; they will
use quantum computation. These next-generation machines are called quantum computers, and they promise to
dramatically increase the speed at which the computers of the future process data.
About Gordon Moore

Mr Moore started working on semiconductors in the 1950s and co-founded
the Intel Corporation. He famously predicted that computer processing
power would double every year, a figure he later revised to every two
years, an insight known as Moore's Law. That "law" became the bedrock of
the computer processor industry and influenced the PC revolution.
