Simple Definition of a Computer: What a Computer Is and Its History
What is a Computer - People of this century may call it one of the great inventions: it has realized the human imagination of how machines can simplify life, and it has carried us into the modern era. In general, a computer can be defined as "an electronic device, or set of electronic devices, that works automatically in an integrated and coordinated way to perform certain tasks (e.g. receiving, storing, processing, and presenting data), controlled by instructions or programs stored within it (the machine)".
Definition of Computer
Before moving on to the definition, we should know the origin of the word computer. It comes from the Latin word "computare", which means to calculate or reckon together: "com" means together, while "putare" means to think or to reckon. According to Wikipedia, a computer is a device that can be instructed to carry out arbitrary sequences of arithmetic or logical operations automatically. The ability of computers to follow generalized sets of operations, called programs, enables them to perform an extremely wide range of tasks. This applies not only to dedicated machines but also to general-purpose devices such as personal computers and mobile devices like smartphones. The Internet itself runs on computers and connects millions of them to one another.
History of the Computer
Today, computers are becoming ever more modern. But early computers were nowhere near as small, advanced, sleek, and light as they are now. Computer history is commonly divided into five generations.
With the onset of the Second World War, the countries involved sought to develop computers. This increased funding for computer development and sped up the advancement of computing techniques. In 1941, Konrad Zuse, a German engineer, built a computer, the Z3. In 1943, the British completed a secret code-breaking computer called Colossus to break the coded messages used by Germany.
1. The First Generation
First-generation computers were characterized by operating instructions made specifically for a particular task. Each computer had a different binary program called "machine language", which made these computers difficult to program.
2. The Second Generation
In 1947, the invention of the transistor greatly influenced the development of computers. The transistor replaced the vacuum tube in televisions, radios, and computers, and as a result the size of electronic machines shrank drastically. Transistors began to be used in computers in 1956. The first machines to take advantage of this new technology were supercomputers: IBM created a supercomputer named Stretch, and Sperry-Rand created a computer called LARC.
Several programming languages began to appear at the time. Common Business-Oriented Language (COBOL) and Formula Translator (FORTRAN) came into common use. These programming languages replaced complicated machine code with words, sentences, and mathematical formulas that are far easier for humans to understand, allowing ordinary people to program a computer. A wide range of new careers emerged (programmers, systems analysts, and computer system experts), and the software industry also began to grow.
3. The Third Generation
Jack Kilby, an engineer at Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components on a small silicon disc made from quartz sand. Scientists then managed to pack more and more components onto a single chip, called a semiconductor. As a result, computers became smaller because their components could be compacted onto the chip. Another advancement of third-generation computers was the use of the operating system.
4. The Fourth Generation
This is our generation: the computers you see around you are fourth-generation computers, and the microprocessor is the main concept behind this generation. A single chip contains thousands of transistors and other circuit elements connected together.
The development of fourth-generation computers cannot be separated from Intel, one of the chipmakers, which created the Intel 4004 chip, the first step in the development of modern computer technology.
In 1981 IBM introduced its first computer designed specifically for home use, and in 1984 the Macintosh was first introduced by Apple. The rapid growth of computers in this generation gave birth to the idea of creating computer networks, which ultimately led to the development and birth of the Internet. Other great advances in this generation were the creation of the graphical user interface (GUI), the mouse, and the portable computers that can be taken anywhere, known as laptops.
5. The Fifth Generation
Fifth-generation computers are based on artificial intelligence, and they are still under development. The goal of this generation is to produce a computer that can respond to spoken natural-language input and that has the ability to learn from its environment and adjust itself.
After reading the history of the computer above and comparing each generation, you may well be amazed at what has been achieved. Just imagine if today's computers were still as big as a garage: can you imagine how large a building an office would need just to house a few computers? That's why we should always be grateful to God, who has given us such ease in our lives. "God Bless You".
Reference
1. https://en.wikipedia.org/wiki/Computer
2. https://id.wikipedia.org/wiki/Sejarah_komputer