
Komputer (Aplikasi Pemrograman)

Name       : Gilang Romadhon Nugroho
NIM        : 1901411012
Class      : 2 PJJ
Assignment : 1
Date       : 18/09/2020
Lecturer   : Arliandy P. Arbad, S.T., M.Eng.

1. Describe and explain how we can face the challenges in the era of Industry 4.0.

Industry 4.0 is characterized by the level of automation that we have achieved, where
machines can often largely govern themselves, in many ways by using internet technologies,
or “the Internet of things.” Other features of Industry 4.0 include the use of cloud technology
and the importance of big data.

To face these challenges, we must have basic skills such as complex problem solving, people management, creativity, critical thinking, coordinating with others, cognitive flexibility, emotional intelligence, service orientation, judgment and decision making, and negotiation.

2. What do you think about the era of disruptive technology? Give an example.

Disruptive technology is an innovation that significantly alters the way that consumers,
industries, or businesses operate. A disruptive technology sweeps away the systems or
habits it replaces because it has attributes that are recognizably superior.

In civil engineering, engineers previously used conventional methods for construction, but because of this disruption we now use Building Information Modeling (BIM) tools such as SketchUp, Civil 3D, Lumion, and many more. This is good because it saves time and cost in construction.

3. What is a computer? Explain the history of the computer.

A computer is a machine that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming. Modern computers have the ability to follow generalized sets of operations, called programs. These programs enable computers to perform an extremely wide range of tasks. A "complete" computer, including the hardware, the operating system (main software), and the peripheral equipment required and used for "full" operation, can be referred to as a computer system.

Pre-20th century


Devices have been used to aid computation for thousands of years, mostly using one-to-one
correspondence with fingers. The earliest counting device was probably a form of tally stick.
Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres,
cones, etc.) which represented counts of items, probably livestock or grains, sealed in hollow
unbaked clay containers.[4][5] The use of counting rods is one example.

The abacus was initially used for arithmetic tasks. The Roman abacus was developed from
devices used in Babylonia as early as 2400 BC. Since then, many other forms of reckoning
boards or tables have been invented. In a medieval European counting house, a checkered
cloth would be placed on a table, and markers moved around on it according to certain rules,
as an aid to calculating sums of money.[6]

The Antikythera mechanism is believed to be the earliest mechanical analog "computer", according to Derek J. de Solla Price.[7] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to c. 100 BC. Devices of a level of complexity comparable to that of the Antikythera mechanism would not reappear until a thousand years later.

Many mechanical aids to calculation and measurement were constructed for astronomical
and navigation use. The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in
the early 11th century.[8] The astrolabe was invented in the Hellenistic world in either the 1st
or 2nd centuries BC and is often attributed to Hipparchus. A combination of the planisphere
and dioptra, the astrolabe was effectively an analog computer capable of working out several
different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical
calendar computer[9][10] and gear-wheels was invented by Abi Bakr of Isfahan, Persia in
1235.[11] Abū Rayhān al-Bīrūnī invented the first mechanical geared lunisolar calendar
astrolabe,[12] an early fixed-wired knowledge processing machine[13] with a gear train and
gear-wheels,[14] c. 1000 AD.

The sector, a calculating instrument used for solving problems in proportion, trigonometry,
multiplication and division, and for various functions, such as squares and cube roots, was
developed in the late 16th century and found application in gunnery, surveying and
navigation.

The planimeter was a manual instrument to calculate the area of a closed figure by tracing
over it with a mechanical linkage.

The slide rule was invented around 1620–1630, shortly after the publication of the concept of
the logarithm. It is a hand-operated analog computer for doing multiplication and division. As
slide rule development progressed, added scales provided reciprocals, squares and square
roots, cubes and cube roots, as well as transcendental functions such as logarithms and
exponentials, circular and hyperbolic trigonometry and other functions. Slide rules with
special scales are still used for quick performance of routine calculations, such as the E6B
circular slide rule used for time and distance calculations on light aircraft.
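
The principle the slide rule exploits can be sketched in a few lines of Python (an illustration added here, not from the original report): adding lengths on logarithmic scales corresponds to adding logarithms, which turns multiplication into addition.

    import math

    # Slide-rule principle: log(a*b) = log(a) + log(b),
    # so sliding two log scales past each other multiplies numbers.
    a, b = 6.0, 7.0
    product = 10 ** (math.log10(a) + math.log10(b))
    print(round(product, 6))   # prints: 42.0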

In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton)
that could write holding a quill pen. By switching the number and order of its internal wheels
different letters, and hence different messages, could be produced. In effect, it could be mechanically "programmed" to read instructions. Along with two other complex machines,
the doll is at the Musée d'Art et d'Histoire of Neuchâtel, Switzerland, and still operates.[15]

In 1831–1835, mathematician and engineer Giovanni Plana devised a Perpetual Calendar machine, which, through a system of pulleys and cylinders, could predict the perpetual calendar for every year from AD 0 (that is, 1 BC) to AD 4000, keeping track of leap years and varying day length. The tide-predicting machine invented by Sir William Thomson in 1872 was of great utility to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location.

The differential analyser, a mechanical analog computer designed to solve differential equations by integration, used wheel-and-disc mechanisms to perform the integration. In 1876, Lord Kelvin had already discussed the possible construction of such calculators, but he had been stymied by the limited output torque of the ball-and-disk integrators.[16] In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that allowed these machines to work. Starting in the 1920s, Vannevar Bush and others developed mechanical differential analyzers.
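
A software analogy may help (a hedged sketch added here, not part of the original text): chaining two numerical integrators, where the output of one feeds the input of the next, solves a differential equation the same way the wheel-and-disc integrators did. The equation y'' = -y below is an assumed example.

    # Two chained integrators solving y'' = -y (simple harmonic motion).
    dt = 0.001
    y, dy = 1.0, 0.0                   # initial conditions: y(0) = 1, y'(0) = 0
    for _ in range(int(3.14159 / dt)):
        ddy = -y                       # the differential equation being solved
        dy += ddy * dt                 # first integrator: y'' -> y'
        y += dy * dt                   # second integrator: y' -> y
    print(round(y, 2))                 # ~ -1.0, i.e. cos(pi)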

First computing device

Charles Babbage, an English mechanical engineer and polymath, originated the concept of
a programmable computer. Considered the "father of the computer",[17] he conceptualized
and invented the first mechanical computer in the early 19th century. After working on his
revolutionary difference engine, designed to aid in navigational calculations, in 1833 he
realized that a much more general design, an Analytical Engine, was possible. The input of
programs and data was to be provided to the machine via punched cards, a method being
used at the time to direct mechanical looms such as the Jacquard loom. For output, the
machine would have a printer, a curve plotter and a bell. The machine would also be able to
punch numbers onto cards to be read in later. The Engine incorporated an arithmetic logic
unit, control flow in the form of conditional branching and loops, and integrated memory,
making it the first design for a general-purpose computer that could be described in modern
terms as Turing-complete.[18][19]

The machine was about a century ahead of its time. All the parts for his machine had to be
made by hand – this was a major problem for a device with thousands of parts. Eventually,
the project was dissolved with the decision of the British Government to cease funding.
Babbage's failure to complete the analytical engine can be chiefly attributed to political and
financial difficulties as well as his desire to develop an increasingly sophisticated computer
and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry
Babbage, completed a simplified version of the analytical engine's computing unit (the mill)
in 1888. He gave a successful demonstration of its use in computing tables in 1906.

Modern computers

Concept of modern computer

The principle of the modern computer was proposed by Alan Turing in his seminal 1936
paper,[41] On Computable Numbers. Turing proposed a simple device that he called
"Universal Computing machine" and that is now known as a universal Turing machine. He

“Honesty is a very expensive gift. Don’t expect it from cheap people.” Warren Buffett
Komputer (Aplikasi Pemrograman)

proved that such a machine is capable of computing anything that is computable by


executing instructions (program) stored on tape, allowing the machine to be programmable.
The fundamental concept of Turing's design is the stored program, where all the instructions
for computing are stored in memory. Von Neumann acknowledged that the central concept
of the modern computer was due to this paper.[42] Turing machines are to this day a central
object of study in theory of computation. Except for the limitations imposed by their finite
memory stores, modern computers are said to be Turing-complete, which is to say, they
have algorithm execution capability equivalent to a universal Turing machine.
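
To make the idea concrete, here is a minimal Turing machine simulator in Python (a hedged sketch added to this report, not from the original text); the transition table flip_bits is an assumed toy example, not Turing's own machine.

    # A machine is a transition table: (state, symbol) -> (new symbol, move, new state).
    def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=1000):
        cells = dict(enumerate(tape))       # sparse tape: position -> symbol
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, blank)
            new_symbol, move, state = transitions[(state, symbol)]
            cells[head] = new_symbol
            head += 1 if move == "R" else -1
        return "".join(sym for _, sym in sorted(cells.items()))

    # Toy machine (assumed example): flip every bit, then halt on the first blank.
    flip_bits = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }
    print(run_turing_machine(flip_bits, "1011"))   # prints: 0100_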

Stored programs

Main article: Stored-program computer

Early computing machines had fixed programs. Changing their function required the re-wiring and re-structuring of the machine.[30] With the proposal of the stored-program computer this
changed. A stored-program computer includes by design an instruction set and can store in
memory a set of instructions (a program) that details the computation. The theoretical basis
for the stored-program computer was laid by Alan Turing in his 1936 paper. In 1945, Turing
joined the National Physical Laboratory and began work on developing an electronic stored-
program digital computer. His 1945 report "Proposed Electronic Calculator" was the first
specification for such a device. John von Neumann at the University of Pennsylvania also
circulated his First Draft of a Report on the EDVAC in 1945.[20]
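
The stored-program idea can be sketched in a few lines (an illustrative fetch-decode-execute loop added here, not a description of any real machine): instructions and data share the same memory, so a program is just data that the machine interprets.

    # Toy stored-program machine: one accumulator, instructions and data in one memory.
    def run(memory):
        acc, pc = 0, 0                       # accumulator and program counter
        while True:
            op, arg = memory[pc]             # fetch and decode
            pc += 1
            if op == "LOAD":
                acc = memory[arg]
            elif op == "ADD":
                acc += memory[arg]
            elif op == "STORE":
                memory[arg] = acc
            elif op == "HALT":
                return memory

    # Cells 0-3 hold the program, cells 4-6 hold the data it operates on.
    memory = {
        0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", None),
        4: 2, 5: 3, 6: 0,
    }
    print(run(memory)[6])                    # prints: 5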

The Manchester Baby was the world's first stored-program computer. It was built at the
Victoria University of Manchester by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and
ran its first program on 21 June 1948.[43] It was designed as a testbed for the Williams tube,
the first random-access digital storage device.[44] Although the computer was considered
"small and primitive" by the standards of its time, it was the first working machine to contain
all of the elements essential to a modern electronic computer.[45] As soon as the Baby had
demonstrated the feasibility of its design, a project was initiated at the university to develop it
into a more usable computer, the Manchester Mark 1. Grace Hopper was the first person to develop a compiler for a programming language.[2]

The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first
commercially available general-purpose computer.[46] Built by Ferranti, it was delivered to
the University of Manchester in February 1951. At least seven of these later machines were
delivered between 1953 and 1957, one of them to Shell labs in Amsterdam.[47] In October
1947, the directors of British catering company J. Lyons & Company decided to take an
active role in promoting the commercial development of computers. The LEO I computer
became operational in April 1951[48] and ran the world's first regular routine office computer
job.

Transistors

Main articles: Transistor and History of the transistor

Further information: Transistor computer and MOSFET

The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John
Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947, which was followed by
Shockley's bipolar junction transistor in 1948.[49][50] From 1955 onwards, transistors
replaced vacuum tubes in computer designs, giving rise to the "second generation" of
computers. Compared to vacuum tubes, transistors have many advantages: they are
smaller, and require less power than vacuum tubes, so give off less heat. Junction
transistors were much more reliable than vacuum tubes and had longer, indefinite, service
life. Transistorized computers could contain tens of thousands of binary logic circuits in a
relatively compact space. However, early junction transistors were relatively bulky devices
that were difficult to manufacture on a mass-production basis, which limited them to a
number of specialised applications.[51]

At the University of Manchester, a team under the leadership of Tom Kilburn designed and
built a machine using the newly developed transistors instead of valves.[52] Their first transistorised computer, and the first in the world, was operational by 1953, and a second
version was completed there in April 1955. However, the machine did make use of valves to
generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic
drum memory, so it was not the first completely transistorized computer. That distinction
goes to the Harwell CADET of 1955,[53] built by the electronics division of the Atomic
Energy Research Establishment at Harwell.[53][54]

The metal–oxide–silicon field-effect transistor (MOSFET), also known as the MOS transistor,
was invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1959.[55] It was the
first truly compact transistor that could be miniaturised and mass-produced for a wide range
of uses.[51] With its high scalability,[56] and much lower power consumption and higher
density than bipolar junction transistors,[57] the MOSFET made it possible to build high-
density integrated circuits.[58][59] In addition to data processing, it also enabled the practical
use of MOS transistors as memory cell storage elements, leading to the development of
MOS semiconductor memory, which replaced earlier magnetic-core memory in computers.
The MOSFET led to the microcomputer revolution,[60] and became the driving force behind
the computer revolution.[61][62] The MOSFET is the most widely used transistor in
computers,[63][64] and is the fundamental building block of digital electronics.[65]

Integrated circuits

Main articles: Integrated circuit and Invention of the integrated circuit

Further information: Planar process and Microprocessor

The next great advance in computing power came with the advent of the integrated circuit
(IC). The idea of the integrated circuit was first conceived by a radar scientist working for the
Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer
presented the first public description of an integrated circuit at the Symposium on Progress
in Quality Electronic Components in Washington, D.C. on 7 May 1952.[66]

The first working ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at
Fairchild Semiconductor.[67] Kilby recorded his initial ideas concerning the integrated circuit
in July 1958, successfully demonstrating the first working integrated example on 12
September 1958.[68] In his patent application of 6 February 1959, Kilby described his new
device as "a body of semiconductor material ... wherein all the components of the electronic
circuit are completely integrated".[69][70] However, Kilby's invention was a hybrid integrated circuit (hybrid IC), rather than a monolithic integrated circuit (IC) chip.[71] Kilby's IC had
external wire connections, which made it difficult to mass-produce.[72]

Noyce also came up with his own idea of an integrated circuit half a year later than Kilby.[73]
Noyce's invention was the first true monolithic IC chip.[74][72] His chip solved many practical
problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon,
whereas Kilby's chip was made of germanium. Noyce's monolithic IC was fabricated using
the planar process, developed by his colleague Jean Hoerni in early 1959. In turn, the planar
process was based on the silicon surface passivation and thermal oxidation processes
developed by Mohamed Atalla at Bell Labs in the late 1950s.[75][76][77]

Modern monolithic ICs are predominantly MOS (metal-oxide-semiconductor) integrated circuits, built from MOSFETs (MOS transistors).[78] After the first MOSFET was invented by
Mohamed Atalla and Dawon Kahng at Bell Labs in 1959,[79] Atalla first proposed the
concept of the MOS integrated circuit in 1960, followed by Kahng in 1961, both noting that
the MOS transistor's ease of fabrication made it useful for integrated circuits.[51][80] The
earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman
and Steven Hofstein at RCA in 1962.[81] General Microelectronics later introduced the first
commercial MOS IC in 1964,[82] developed by Robert Norman.[81] Following the
development of the self-aligned gate (silicon-gate) MOS transistor by Robert Kerwin, Donald
Klein and John Sarace at Bell Labs in 1967, the first silicon-gate MOS IC with self-aligned
gates was developed by Federico Faggin at Fairchild Semiconductor in 1968.[83] The
MOSFET has since become the most critical device component in modern ICs.[84]

The development of the MOS integrated circuit led to the invention of the
microprocessor,[85][86] and heralded an explosion in the commercial and personal use of
computers. While the subject of exactly which device was the first microprocessor is
contentious, partly due to lack of agreement on the exact definition of the term
"microprocessor", it is largely undisputed that the first single-chip microprocessor was the
Intel 4004,[87] designed and realized by Federico Faggin with his silicon-gate MOS IC
technology,[85] along with Ted Hoff, Masatoshi Shima and Stanley Mazor at Intel.[88][89] In
the early 1970s, MOS IC technology enabled the integration of more than 10,000 transistors
on a single chip.[59]

Systems on a Chip (SoCs) are complete computers on a microchip (or chip) the size of a coin.[90] They may or may not have integrated RAM and flash memory. If not integrated, the RAM is usually placed directly above (known as package on package) or below (on the opposite side of the circuit board) the SoC, and the flash memory is usually placed right next to the SoC. This is all done to improve data transfer speeds, as the data signals don't have to travel long distances. Since ENIAC in 1945, computers have advanced enormously, with modern SoCs (such as the Snapdragon 865) being the size of a coin while also being hundreds of thousands of times more powerful than ENIAC, integrating billions of transistors, and consuming only a few watts of power.

Mobile computers

The first mobile computers were heavy and ran from mains power. The 50 lb IBM 5100 was an early example. Later portables such as the Osborne 1 and Compaq Portable were considerably lighter but still needed to be plugged in. The first laptops, such as the Grid Compass, removed this requirement by incorporating batteries – and with the continued
miniaturization of computing resources and advancements in portable battery life, portable
computers grew in popularity in the 2000s.[91] The same developments allowed
manufacturers to integrate computing resources into cellular mobile phones by the early
2000s.

These smartphones and tablets run on a variety of operating systems and recently became
the dominant computing device on the market.[92] These are powered by System on a Chip
(SoCs), which are complete computers on a microchip the size of a coin.[90] (Encyclopedia,
2020)

4. If you were an application programmer, what application would you make?

I would create a program to analyze bending forces in beams; a rough sketch of what it might compute is shown below.
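
This sketch assumes a simply supported rectangular beam under a uniform load; the function name and parameters are hypothetical, added here only for illustration.

    def max_bending_stress(w, L, b, h):
        """Maximum bending stress (Pa) of a simply supported rectangular beam.

        w: uniform load (N/m), L: span (m), b: section width (m), h: section height (m).
        """
        M = w * L**2 / 8        # maximum bending moment at midspan (N*m)
        I = b * h**3 / 12       # second moment of area of the rectangle (m^4)
        c = h / 2               # distance from neutral axis to extreme fibre (m)
        return M * c / I        # flexure formula: sigma = M*c / I

    # Example: 5 kN/m over a 4 m span, 200 mm x 300 mm cross-section.
    print(f"{max_bending_stress(5e3, 4.0, 0.2, 0.3) / 1e6:.2f} MPa")   # ~3.33 MPa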
