Etymology: Programs

A computer is a machine that can be instructed via programs to carry out logical and arithmetic operations automatically. Computers are used as control systems for industrial and consumer devices, as general-purpose machines such as personal computers and smartphones, and to run the Internet. Early computers were conceived only as calculating devices; modern computers follow stored programs, are built from semiconductor components, and their power and capabilities have increased dramatically, as predicted by Moore's law. A computer typically includes a CPU, memory, and peripheral input/output devices, and the term can refer to a single system or to a connected network of computers.

A computer is a machine that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming. Modern computers have the ability to follow generalized sets of operations, called programs. These programs enable computers to perform an extremely wide range of tasks. A "complete" computer, including the hardware, the operating system (main software), and the peripheral equipment required and used for "full" operation, can be referred to as a computer system. The term may also be used for a group of computers that are connected and work together, in particular a computer network or computer cluster.
Computers are used as control systems for a wide variety of industrial and consumer devices. This
includes simple special purpose devices like microwave ovens and remote controls, factory devices
such as industrial robots and computer-aided design, and also general purpose devices
like personal computers and mobile devices such as smartphones. The Internet is run on computers
and it connects hundreds of millions of other computers and their users.
Early computers were only conceived as calculating devices. Since ancient times, simple manual
devices like the abacus aided people in doing calculations. Early in the Industrial Revolution, some
mechanical devices were built to automate long tedious tasks, such as guiding patterns for looms.
More sophisticated electrical machines did specialized analog calculations in the early 20th century.
The first digital electronic calculating machines were developed during World War II. The
first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS
transistor) and monolithic integrated circuit (IC) chip technologies in the late 1950s, leading to
the microprocessor and the microcomputer revolution in the 1970s. The speed, power and versatility
of computers have been increasing dramatically ever since then, with MOS transistor
counts increasing at a rapid pace (as predicted by Moore's law), leading to the Digital
Revolution during the late 20th to early 21st centuries.
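
As a rough back-of-the-envelope illustration of that growth (the starting count and doubling period below are assumptions for the sketch, not figures from the article): if transistor counts double about every two years, the count grows exponentially.

    # Illustrative only: steady doubling every ~2 years implies exponential growth.
    def projected_transistors(start_count, years, doubling_period_years=2.0):
        """Project a transistor count forward under an assumed doubling period."""
        return start_count * 2 ** (years / doubling_period_years)

    # A hypothetical chip with 2,300 transistors (roughly the scale of an
    # early-1970s microprocessor) projected 30 years forward: about 75 million.
    print(round(projected_transistors(2_300, 30)))

Thirty years of doubling every two years multiplies the count by 2^15, roughly 32,000-fold, which is why the growth described above is often called dramatic.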
Conventionally, a modern computer consists of at least one processing element, typically a central
processing unit (CPU) in the form of a metal-oxide-semiconductor (MOS) microprocessor, along with
some type of computer memory, typically MOS semiconductor memory chips. The processing
element carries out arithmetic and logical operations, and a sequencing and control unit can change
the order of operations in response to stored information. Peripheral devices include input devices
(keyboards, mice, joystick, etc.), output devices (monitor screens, printers, etc.), and input/output
devices that perform both functions (e.g., the 2000s-era touchscreen). Peripheral devices allow
information to be retrieved from an external source, and they enable the result of operations to be saved and retrieved.
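
A minimal sketch of that idea, assuming nothing about any real instruction set: the loop below plays the role of the processing element, a list of tuples stands in for instructions held in memory, and a conditional jump shows how a sequencing and control unit can change the order of operations in response to stored information. The opcode names and memory layout are invented for illustration.

    # Toy stored-program machine: fetch an instruction, act on memory, then
    # advance (or jump) the program counter. Purely illustrative, not a real ISA.
    def run(program, memory):
        pc = 0  # program counter: the sequencing/control state
        while pc < len(program):
            op, *args = program[pc]          # fetch and decode
            if op == "ADD":                  # arithmetic on memory cells
                a, b, dest = args
                memory[dest] = memory[a] + memory[b]
            elif op == "JUMP_IF_ZERO":       # control flow driven by stored data
                cell, target = args
                if memory[cell] == 0:
                    pc = target
                    continue
            elif op == "HALT":
                break
            pc += 1
        return memory

    # Example: add cells 0 and 1 into cell 2, then halt.
    print(run([("ADD", 0, 1, 2), ("HALT",)], [3, 4, 0]))  # [3, 4, 7]

Real processors perform the same fetch-decode-execute cycle in hardware, with the CPU's control unit deciding, instruction by instruction, what the stored values in memory make it do next.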

Etymology

[Image caption: A human computer, with microscope and calculator, 1952]


According to the Oxford English Dictionary, the first known use of the word "computer" was in 1613
in a book called The Yong Mans Gleanings by English writer Richard Braithwait: "I haue [sic] read
the truest computer of Times, and the best Arithmetician that euer [sic] breathed, and he reduceth
thy dayes into a short number." This usage of the term referred to a human computer, a person who
carried out calculations or computations. The word continued with the same meaning until the middle
of the 20th century. During the latter part of this period women were often hired as computers
because they could be paid less than their male counterparts.[1] By 1943, most human computers
were women.[2]
The Online Etymology Dictionary gives the first attested use of "computer" in the 1640s, meaning
"one who calculates"; this is an "agent noun from compute (v.)". The same dictionary states that the
use of the term to mean "calculating machine" (of any type) dates from 1897, and that the "modern
use" of the term, meaning "programmable digital electronic computer", dates from "1945 under this
name; [in a] theoretical [sense] from 1937, as Turing machine".[3]

Source:

https://en.wikipedia.org/wiki/Computer
