Moore's Law Report
Today, computers are almost unrecognizable from designs of the 19th century, such
as Charles Babbage's Analytical Engine, or even from the huge computers of the
20th century that occupied whole rooms, such as the Electronic Numerical Integrator
and Computer (ENIAC). Yet Gordon Moore accurately predicted the development of the
microprocessors at the heart of our digital devices many decades into the future.
Invented in the late 1940s, the transistor reduced the size of computer circuits and brought enormous
improvements in reliability. The inventors of the transistor – William Bradford Shockley, John Bardeen, and Walter
Houser Brattain – were awarded the Nobel Prize in Physics in 1956 for their discovery of the transistor effect.
About Gordon Moore
Gordon Moore was born in 1929. He is an American businessperson, engineer, and the co-founder and chair
emeritus of Intel Corporation. Moore's Law is named after him.
Moore's Law refers to Moore's realization, formalized in 1975, that the number of transistors on a microchip doubles
about every two years. From this he predicted that the growth in the processing power of computers would be exponential.
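As a rough illustration, the doubling rule means that a chip's transistor count grows by a factor of 2^(t/2) after t years. The short Python sketch below projects counts under this idealized assumption; the 1971 baseline of 2,300 transistors (the Intel 4004) is used purely as an illustrative starting point, and real chips only loosely track the curve.

    # A minimal sketch of Moore's Law as exponential doubling.
    # Assumes an illustrative 1971 baseline of 2,300 transistors
    # (the Intel 4004); real chips only loosely track this curve.
    def transistors(year, base_year=1971, base_count=2300):
        """Projected transistor count, doubling every two years."""
        return base_count * 2 ** ((year - base_year) / 2)

    for year in (1971, 1981, 1991, 2001, 2011, 2021):
        print(f"{year}: ~{transistors(year):,.0f} transistors")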
Transistors are small electronic components that act like switches. They are used to build the gates that process
data inside a computer: the more transistors there are, the faster data can be processed. Computers in the
early 1970s had thousands of transistors. Today's computers have billions of transistors but, thanks to miniaturization,
their central processing units (CPUs) are about the same size as the chips in computers of the early 1970s.
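To make the switch analogy concrete, the toy Python model below treats each transistor as a switch that passes current only when its control input is on, and wires two of them in series to form an AND gate. This is a logical simplification for illustration, not a description of real circuit design.

    # Toy model: a transistor passes current only when its
    # control input is on; two in series behave as an AND gate.
    def transistor(control: bool, current: bool) -> bool:
        return current and control

    def and_gate(a: bool, b: bool) -> bool:
        # Current must flow through both switches in series.
        return transistor(b, transistor(a, True))

    for a in (False, True):
        for b in (False, True):
            print(f"AND({a}, {b}) = {and_gate(a, b)}")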
The invention of the microchip by Jack Kilby and Robert Noyce in the late 1950s paved the way for dramatic
reductions in the size of a computer's CPU. The microchip miniaturized the transistor, making it possible to fit
large numbers of transistors into an exceedingly small space. The microchip was one of the most
important inventions ever made, and it launched a new information age.
Moore also pointed out that the cost of a personal computer was not increasing over time; therefore, the cost of
processing power halved every two years.
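Put together with the doubling rule, a flat price implies that the cost per unit of processing power halves every two years. The sketch below runs that arithmetic for an assumed, purely illustrative $1,000 machine.

    # If a computer's price stays flat while its processing power
    # doubles every two years, the cost per unit of processing
    # power halves every two years. The $1,000 price is assumed.
    price = 1000.0
    relative_power = 1.0  # baseline processing power = 1

    for years in range(0, 11, 2):
        print(f"after {years:2d} years: power x{relative_power:4.0f}, "
              f"cost per unit ${price / relative_power:7.2f}")
        relative_power *= 2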
Moore did not set out to predict the future, yet the trend he observed, which lies behind the law named after him,
enabled people to forecast both the processing power and the probable cost of future computers.
Time has shown that Moore's predictions were remarkably accurate.
We might expect Moore's Law to continue forever, but that is not physically possible. Fitting more transistors on a
microchip means making the transistors smaller and smaller. At the time of writing, the world's smallest-ever
working transistor is 1 nanometer long; since a silicon atom is only about 0.2 nanometers across, such a transistor
is just a few atoms wide. Is it possible to make a smaller transistor, or has the minimum possible size now been reached?
As microchips became more powerful, with more transistors built into them, it also became possible to make them
smaller. This meant that the computers that depended on them could become smaller too.
Computers did indeed shrink!
The end of Moore's Law?
The chip industry has acknowledged that it is no longer economical to keep shrinking
silicon transistors. Instead, microchips look set to change in other ways. The way microchips are
designed and integrated into computer circuits has not changed very much over the years, so it should be
possible to improve processing speed by redesigning them and changing the way they are
integrated, without further increasing the number of transistors on the processor.
The next generation of computers will not process data in the same way as our current digital devices; they will use
quantum computation. These machines, called quantum computers, are expected to dramatically
increase the speed at which data can be processed.