Computer Scientists

Alan Mathison Turing OBE FRS (/ˈtjʊərɪŋ/; 23 June 1912 – 7 June 1954) was an English[6] mathematician, computer scientist, logician, cryptanalyst, philosopher and theoretical
biologist.[7] Turing was highly influential in the development
of theoretical computer science, providing a formalisation of the
concepts of algorithm and computation with the Turing machine,
which can be considered a model of a general-purpose
computer.[8][9][10] Turing is widely considered to be the father of
theoretical computer science and artificial intelligence.[11] Despite
these accomplishments, he was not fully recognised in his home
country during his lifetime, due to his homosexuality, and because
much of his work was covered by the Official Secrets Act.
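
The Turing machine formalism is simple enough to sketch directly. Below is a minimal simulator in Python; the tape encoding and the example transition table (a bit-flipping machine) are illustrative choices of ours, not anything from Turing's paper.

```python
# Minimal Turing machine simulator: a finite control reads and writes
# symbols on an unbounded tape, moving left or right one cell per step.
def run_turing_machine(transitions, tape, state="q0", halt="halt", max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = cells.get(head, "_")          # "_" is the blank symbol
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Hypothetical transition table: flip every bit, halt on the first blank.
flip = {
    ("q0", "0"): ("q0", "1", "R"),
    ("q0", "1"): ("q0", "0", "R"),
    ("q0", "_"): ("halt", "_", "R"),
}
print(run_turing_machine(flip, "10110"))  # -> "01001_"
```

Despite its simplicity, a table like this one, given enough states and symbols, can simulate any algorithm, which is what makes the device a model of a general-purpose computer.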
During the Second World War, Turing worked for the Government
Code and Cypher School (GC&CS) at Bletchley Park,
Britain's codebreaking centre that produced Ultra intelligence. For a
time he led Hut 8, the section that was responsible for German
naval cryptanalysis. Here, he devised a number of techniques for
speeding the breaking of German ciphers, including improvements
to the pre-war Polish bomba method, an electromechanical machine
that could find settings for the Enigma machine.
Turing played a pivotal role in cracking intercepted coded messages
that enabled the Allies to defeat the Nazis in many crucial
engagements, including the Battle of the Atlantic, and in so doing
helped win the war.[12][13] Due to the problems of counterfactual
history, it is difficult to estimate what effect Ultra intelligence had on the
war,[14] but at the upper end it has been estimated that this work
shortened the war in Europe by more than two years and saved
over 14 million lives.[12]
After the war, Turing worked at the National Physical Laboratory,
where he designed the Automatic Computing Engine, which was
one of the first designs for a stored-program computer. In 1948,
Turing joined Max Newman's Computing Machine Laboratory at
the Victoria University of Manchester, where he helped develop
the Manchester computers[15] and became interested
in mathematical biology. He wrote a paper on the chemical basis
of morphogenesis[1] and predicted oscillating chemical
reactions such as the Belousov–Zhabotinsky reaction, first observed
in the 1960s.
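
That morphogenesis paper models pattern formation with a reaction-diffusion system; in modern notation, the two-species form he analysed can be sketched (with generic reaction terms \(f\) and \(g\)) as

\[ \frac{\partial u}{\partial t} = f(u, v) + D_u \nabla^2 u, \qquad \frac{\partial v}{\partial t} = g(u, v) + D_v \nabla^2 v, \]

where Turing's key observation was that sufficiently unequal diffusion rates \(D_u \neq D_v\) can destabilise a spatially uniform steady state, so that regular patterns emerge spontaneously.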
Turing was prosecuted in 1952 for homosexual acts;
the Labouchere Amendment of 1885 had mandated that "gross
indecency" was a criminal offence in the UK. He accepted chemical
castration treatment, with DES, as an alternative to prison. Turing
died in 1954, 16 days before his 42nd birthday, from cyanide
poisoning. An inquest determined his death as a suicide, but it has
been noted that the known evidence is also consistent with
accidental poisoning.
In 2009, following an Internet campaign, British Prime
Minister Gordon Brown made an official public apology on behalf of
the British government for "the appalling way he was
treated". Queen Elizabeth II granted Turing a posthumous pardon in
2013. The Alan Turing law is now an informal term for a 2017 law in
the United Kingdom that retroactively pardoned men cautioned or
convicted under historical legislation that outlawed homosexual
acts.[16]
On 15 July 2019 the Bank of England announced that Turing would
be depicted on the United Kingdom's new £50 note.
Donald Ervin Knuth (/kəˈnuːθ/[3] kə-NOOTH; born January 10, 1938)
is an American computer scientist, mathematician, and professor
emeritus at Stanford University. He is the 1974 recipient of the ACM
Turing Award, informally considered the Nobel Prize of computer
science.[4][5]
He is the author of the multi-volume work The Art of Computer
Programming. He contributed to the development of the rigorous
analysis of the computational complexity of algorithms and
systematized formal mathematical techniques for it. In the process
he also popularized the asymptotic notation. In addition to
fundamental contributions in several branches of theoretical
computer science, Knuth is the creator of the TeX computer
typesetting system, the related METAFONT font definition language
and rendering system, and the Computer Modern family of
typefaces.
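
For reference, the asymptotic ("Big-O") notation mentioned above has the usual definition

\[ f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 \ \text{such that}\ |f(n)| \le c\,|g(n)| \ \text{for all}\ n \ge n_0, \]

so that, for example, \(3n^2 + 10n = O(n^2)\), since \(3n^2 + 10n \le 4n^2\) for all \(n \ge 10\).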
As a writer and scholar, Knuth created
the WEB and CWEB computer programming systems designed to
encourage and facilitate literate programming, and designed
the MIX/MMIX instruction set architectures. Knuth strongly opposes
granting software patents, having expressed his opinion to
the United States Patent and Trademark Office and European
Patent Organisation.
John von Neumann (/vɒn ˈnɔɪmən/; Hungarian: Neumann János
Lajos, pronounced [ˈnɒjmɒn ˈjaːnoʃ ˈlɒjoʃ]; December 28, 1903 –
February 8, 1957) was a Hungarian-
American mathematician, physicist, computer scientist,
and polymath. Von Neumann was generally regarded as the
foremost mathematician of his time[2] and said to be "the last
representative of the great mathematicians";[3] a genius who was
comfortable integrating both pure and applied sciences.
He made major contributions to a number of fields,
including mathematics (foundations of mathematics, functional
analysis, ergodic theory, representation theory, operator
algebras, geometry, topology, and numerical
analysis), physics (quantum mechanics, hydrodynamics,
and quantum statistical mechanics), economics (game
theory), computing (Von Neumann architecture, linear
programming, self-replicating machines, stochastic computing),
and statistics.
He was a pioneer of the application of operator theory to quantum
mechanics in the development of functional analysis, and a key
figure in the development of game theory and the concepts
of cellular automata, the universal constructor and the digital
computer.
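
Von Neumann's universal constructor lived in a 29-state, two-dimensional cellular automaton, far too large to reproduce here, but the core idea of cells evolving under a purely local rule can be sketched with a one-dimensional automaton in Python; the rule number and grid size below are arbitrary illustrative choices.

```python
# Elementary 1-D cellular automaton: each cell's next state depends only
# on its own state and its two neighbours' states (von Neumann's own
# construction used a far richer 29-state, two-dimensional automaton).
def step(cells, rule=110):
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 31 + [1] + [0] * 31          # single live cell in the middle
for _ in range(16):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

Even this toy version shows the phenomenon von Neumann exploited: complex global behaviour arising from simple local update rules.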
He published over 150 papers in his life: about 60 in pure
mathematics, 60 in applied mathematics, 20 in physics, and the
remainder on special mathematical subjects or non-mathematical
ones.[4] His last work, an unfinished manuscript written while he was
in hospital, was later published in book form as The Computer and
the Brain.
His analysis of the structure of self-replication preceded the
discovery of the structure of DNA. In a short list of facts about his
life he submitted to the National Academy of Sciences, he stated,
"The part of my work I consider most essential is that on quantum
mechanics, which developed in Göttingen in 1926, and
subsequently in Berlin in 1927–1929. Also, my work on various
forms of operator theory, Berlin 1930 and Princeton 1935–1939; on
the ergodic theorem, Princeton, 1931–1932."
During World War II, von Neumann worked on the Manhattan
Project with theoretical physicist Edward Teller,
mathematician Stanisław Ulam and others, solving key
steps in the nuclear physics involved in thermonuclear reactions
and the hydrogen bomb. He developed the mathematical models
behind the explosive lenses used in the implosion-type nuclear
weapon, and coined the term "kiloton" (of TNT), as a measure of the
explosive force generated.
After the war, he served on the General Advisory Committee of
the United States Atomic Energy Commission, and consulted for a
number of organizations, including the United States Air Force, the
Army's Ballistic Research Laboratory, the Armed Forces Special
Weapons Project, and the Lawrence Livermore National Laboratory.
As a Hungarian émigré, concerned that the Soviets would achieve
nuclear superiority, he designed and promoted the policy
of mutually assured destruction to limit the arms race.
Augusta Ada King, Countess of Lovelace (née Byron;
10 December 1815 – 27 November 1852) was an
English mathematician and writer, chiefly known for her work
on Charles Babbage's proposed mechanical general-purpose
computer, the Analytical Engine. She was the first to recognise that
the machine had applications beyond pure calculation, and
published the first algorithm intended to be carried out by such a
machine. As a result, she is sometimes regarded as the first to
recognise the full potential of a "computing machine" and one of the
first computer programmers.[2][3][4]
Lovelace was the only legitimate child of poet Lord Byron and his
wife Lady Byron.[5] All of Byron's other children were born out of
wedlock to other women.[6] Byron separated from his wife a month
after Ada was born and left England forever four months later. He
commemorated the parting in a poem that begins, "Is thy face like
thy mother's my fair child! ADA! sole daughter of my house and
heart?".[7] He died of disease in the Greek War of
Independence when Ada was eight years old. Her mother remained
bitter and promoted Ada's interest in mathematics and logic in an
effort to prevent her from developing her father's perceived insanity.
Despite this, Ada remained interested in Byron. On her death,
she was buried next to him at her request. Although often ill
in her childhood, Ada pursued her studies assiduously. She
married William King in 1835. King was made Earl of Lovelace in
1838, Ada thereby becoming Countess of Lovelace.
Her educational and social exploits brought her into contact with
scientists such as Andrew Crosse, Charles Babbage, Sir David
Brewster, Charles Wheatstone, Michael Faraday and the
author Charles Dickens, contacts which she used to further her
education. Ada described her approach as "poetical science"[8] and
herself as an "Analyst (& Metaphysician)".[9]
When she was a teenager, her mathematical talents led her to a
long working relationship and friendship with fellow British
mathematician Charles Babbage, who is known as "the father of
computers". She was in particular interested in Babbage's work on
the Analytical Engine. Lovelace first met him in June 1833, through
their mutual friend, and her private tutor, Mary Somerville.
Between 1842 and 1843, Ada translated an article by Italian military
engineer Luigi Menabrea on the calculating engine, supplementing
it with an elaborate set of notes, simply called Notes. These notes
contain what many consider to be the first computer program—that
is, an algorithm designed to be carried out by a machine. Other
historians reject this perspective and point out that Babbage's
personal notes from the years 1836/1837 contain the first programs
for the engine.[10] Lovelace's notes are important in the early history
of computers. She also developed a vision of the capability of
computers to go beyond mere calculating or number-crunching,
while many others, including Babbage himself, focused only on
those capabilities.[11] Her mindset of "poetical science" led her to ask
questions about the Analytical Engine (as shown in her notes)
examining how individuals and society relate to technology as a
collaborative tool.[6]
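
The program in her Notes (Note G) computed Bernoulli numbers on the Analytical Engine. A rough modern transliteration in Python, using the standard recurrence and indexing convention rather than Lovelace's actual sequence of operation cards, might look like:

```python
from fractions import Fraction
from math import comb

# Bernoulli numbers via the standard recurrence:
#   B_0 = 1,  B_m = -1/(m+1) * sum_{k=0}^{m-1} C(m+1, k) * B_k
def bernoulli(n):
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))
    return B

print(bernoulli(8))  # B_1 = -1/2, B_2 = 1/6, B_4 = -1/30, odd B_{>1} vanish
```

Her version, of course, had to express this as a fixed plan of arithmetic operations and variable columns for a machine that was never built.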
Edsger Wybe Dijkstra (/ˈdaɪkstrə/; Dutch: [ˈɛtsxər ˈʋibə ˈdɛikstra];
11 May 1930 – 6 August 2002) was a Dutch systems
scientist, programmer, software engineer,
science essayist,[8] and pioneer in computing science.[9] A theoretical
physicist by training, he worked as a programmer at
the Mathematisch Centrum (Amsterdam) from 1952 to 1962. A
university professor for much of his life, Dijkstra held the
Schlumberger Centennial Chair in Computer Sciences at
the University of Texas at Austin from 1984 until his retirement in
1999. He was a professor of mathematics at the Eindhoven
University of Technology (1962–1984) and a research fellow at
the Burroughs Corporation (1973–1984).
One of the most influential figures of computing science's founding
generation, Dijkstra helped shape the new discipline from both an
engineering and a theoretical perspective.[10][11] His fundamental
contributions cover diverse areas of computing science,
including compiler construction, operating systems, distributed
systems, sequential and concurrent programming, programming
paradigm and methodology, programming language research,
program design, program development, program verification,
software engineering principles, graph algorithms, and philosophical
foundations of computer programming and computer science. Many
of his papers are the source of new research areas. Several
concepts and problems that are now standard in computer science
were first identified by Dijkstra or bear names coined by him.[12][13] As
a foremost opponent of the mechanizing view of computing science,
he rejected the use of the concepts of 'computer science' and
'software engineering' as umbrella terms for academic disciplines.
Until the mid-1960s computer programming was considered more
an art (or a craft) than a scientific discipline. In Harlan Mills's words
(1986), "programming [before the 1970s] was regarded as a private,
puzzle-solving activity of writing computer instructions to work as a
program". In the late 1960s, computer programming was in a state
of crisis. Dijkstra was one of a small group of academics and
industrial programmers who advocated a new programming style to
improve the quality of programs. Dijkstra, who had a background in
mathematics and physics, was one of the driving forces behind the
acceptance of computer programming as a scientific
discipline.[14][15] He coined the phrase "structured programming" and
during the 1970s this became the new programming
orthodoxy.[16][17][18] His ideas about structured programming helped lay
the foundations for the birth and development of the professional
discipline of software engineering, enabling programmers to
organize and manage increasingly complex software
projects.[19][20] As Bertrand Meyer (2009) noted, "The revolution in
views of programming started by Dijkstra's iconoclasm led to a
movement known as structured programming, which advocated a
systematic, rational approach to program construction. Structured
programming is the basis for all that has been done since
in programming methodology, including object-oriented
programming."[21]
The academic study of concurrent computing started in the 1960s,
with Dijkstra's 1965 paper credited as the first in this field,
identifying and solving the mutual exclusion problem.[22][23] He was
also one of the early pioneers of the research on principles
of distributed computing. His foundational work
on concurrency, semaphores, mutual exclusion, deadlock (deadly
embrace), finding shortest paths in graphs, fault tolerance, and
self-stabilization, among many other contributions, comprises many of
the pillars upon which the field of distributed computing is built.
Shortly before his death in 2002, he received the ACM PODC
Influential-Paper Award in distributed computing for his work on self-
stabilization of program computation. This annual award was
renamed the Dijkstra Prize (Edsger W. Dijkstra Prize in Distributed
Computing) the following year, in his honor.[24][25][26] The prize,
sponsored jointly by the ACM Symposium on Principles of
Distributed Computing (PODC) and the EATCS International
Symposium on Distributed Computing (DISC), recognizes that "no
other individual has had a larger influence on research in principles
of distributed computing".
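
Of the contributions listed above, the shortest-path algorithm is the most compact to show. A minimal sketch in Python, where the graph encoding and names are our own illustrative choices:

```python
import heapq

# Dijkstra's shortest-path algorithm: repeatedly settle the unvisited
# node with the smallest tentative distance from the source.
def dijkstra(graph, source):
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry, skip it
        for v, w in graph.get(u, []):     # w = non-negative edge weight
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

graph = {"a": [("b", 2), ("c", 5)], "b": [("c", 1), ("d", 4)], "c": [("d", 1)]}
print(dijkstra(graph, "a"))  # {'a': 0, 'b': 2, 'c': 3, 'd': 4}
```

Dijkstra's original 1959 formulation predates the priority-queue data structure used here; the greedy settling order is the part that is his.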
Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was
an American mathematician, electrical engineer,
and cryptographer known as "the father of information
theory".[1][2] Shannon is noted for having founded information theory
with a landmark paper, "A Mathematical Theory of Communication",
that he published in 1948.
He is also well known for founding digital circuit design theory in
1937, when—as a 21-year-old master's degree student at
the Massachusetts Institute of Technology (MIT)—he wrote his
thesis demonstrating that electrical applications of Boolean
algebra could construct any logical numerical
relationship.[3] Shannon contributed to the field of cryptanalysis for
national defense during World War II, including his fundamental
work on codebreaking and secure telecommunications.
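
The thesis result is that switching circuits and Boolean algebra are interchangeable. As a small illustration (our example, not Shannon's), a one-bit full adder built entirely from Boolean operations in Python:

```python
# Shannon's thesis insight, in miniature: switching circuits obey Boolean
# algebra, so numerical relations can be built from AND, OR and NOT.
# Example: a one-bit full adder expressed as Boolean operations.
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in                        # sum bit (XOR of the inputs)
    carry_out = (a & b) | (carry_in & (a ^ b))  # carry bit
    return s, carry_out

for bits in [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]:
    print(bits, "->", full_adder(*bits))        # (1,1,1) -> (1, 1), i.e. 0b11
```

Chaining such adders bit by bit yields arbitrary-width binary arithmetic from nothing but relays or gates, which is exactly the kind of construction the thesis legitimised.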
Robert Elliot Kahn (born December 23, 1938) is an
American electrical engineer, who, along with Vint Cerf, invented
the Transmission Control Protocol (TCP) and the Internet
Protocol (IP), the fundamental communication protocols at the heart
of the Internet.
In 2004, Kahn won the Turing Award with Vint Cerf for their work on
TCP/IP.[1]
Grace Brewster Murray Hopper (née Murray; December 9, 1906 –
January 1, 1992) was an American computer scientist and United
States Navy rear admiral.[1] One of the first programmers of
the Harvard Mark I computer, she was a pioneer of computer
programming who invented one of the first linkers. She popularized
the idea of machine-independent programming languages, which
led to the development of COBOL, an early high-level programming
language still in use today.
Prior to joining the Navy, Hopper earned a Ph.D. in mathematics
from Yale University and was a professor of mathematics at Vassar
College. Hopper attempted to enlist in the Navy during World War
II but was rejected because she was 34 years old. She instead
joined the Navy Reserves. Hopper began her computing career in
1944 when she worked on the Harvard Mark I team led by Howard
H. Aiken. In 1949, she joined the Eckert–Mauchly Computer
Corporation and was part of the team that developed the UNIVAC
I computer. At Eckert–Mauchly she began developing the compiler.
She believed that a programming language based on English was
possible. Her compiler converted English terms into machine
code understood by computers. By 1952, Hopper had finished her
program linker (originally called a compiler), which was written for
the A-0 System.[2][3][4][5] During her wartime service, she co-authored
three papers based on her work on the Harvard Mark I.
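
As a deliberately toy illustration of that idea of translating English terms into machine operations, here is a Python sketch; the keyword vocabulary and opcodes are invented for illustration and are not A-0's or FLOW-MATIC's actual forms.

```python
# Toy illustration of Hopper's idea: map English-like terms onto numeric
# "machine" operations. The vocabulary and opcodes here are invented.
OPCODES = {"ADD": 0x01, "SUBTRACT": 0x02, "PRINT": 0x10}

def translate(source):
    program = []
    for line in source.strip().splitlines():
        word, *args = line.split()             # English keyword, then operands
        program.append((OPCODES[word.upper()], [int(a) for a in args]))
    return program

print(translate("ADD 2 3\nPRINT 0"))
# -> [(1, [2, 3]), (16, [0])]
```

The controversial part in the early 1950s was not the mechanics but the premise: that a machine could usefully accept programs written in words rather than numeric codes.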
In 1954, Eckert–Mauchly chose Hopper to lead their department for
automatic programming, and she led the release of some of the first
compiled languages like FLOW-MATIC. In 1959, she participated in
the CODASYL consortium, which consulted Hopper to guide them
in creating a machine-independent programming language. This led
to the COBOL language, which was inspired by her idea of a
language being based on English words. In 1966, she retired from
the Naval Reserve, but in 1967 the Navy recalled her to active duty.
She retired from the Navy in 1986 and found work as a consultant
for the Digital Equipment Corporation, sharing her computing
experiences.
The U.S. Navy Arleigh Burke-class guided-missile
destroyer USS Hopper was named for her, as was the Cray
XE6 "Hopper" supercomputer at NERSC.[6] During her lifetime,
Hopper was awarded 40 honorary degrees from universities across
the world. A college at Yale University was renamed in her honor. In
1991, she received the National Medal of Technology. On
November 22, 2016, she was posthumously awarded
the Presidential Medal of Freedom by President Barack Obama.[7]
Geoffrey Everest Hinton CC FRS FRSC[11] (born 6 December 1947)
is an English Canadian cognitive psychologist and computer
scientist, most noted for his work on artificial neural networks. Since
2013, he has divided his time between Google (Google Brain) and
the University of Toronto.[12][13]
With David E. Rumelhart and Ronald J. Williams, Hinton was co-
author of a highly cited paper published in 1986 that popularized
the backpropagation algorithm for training multi-layer neural
networks,[14] although they were not the first to propose the
approach.[15] Hinton is viewed by some as a leading figure in
the deep learning community and is referred to by some as the
"Godfather of Deep Learning".[16][17][18][19][20] The dramatic image-
recognition milestone of AlexNet, designed by his student Alex
Krizhevsky[21] for the ImageNet challenge in 2012,[22] helped to
revolutionize the field of computer vision.[23] Hinton was awarded the
2018 Turing Award alongside Yoshua Bengio and Yann LeCun for
their work on deep learning.[24]
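
A minimal sketch of the backpropagation idea in Python with NumPy, training a tiny sigmoid network on XOR; the architecture, seed, and learning rate are arbitrary illustrative choices, not anything from the 1986 paper.

```python
import numpy as np

# Minimal backpropagation: a 2-4-1 sigmoid network learning XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(10000):
    h = sigmoid(X @ W1 + b1)               # forward pass, hidden layer
    out = sigmoid(h @ W2 + b2)             # forward pass, output layer
    d_out = (out - y) * out * (1 - out)    # chain rule at the output
    d_h = (d_out @ W2.T) * h * (1 - h)     # error propagated backwards
    W2 -= 0.5 * h.T @ d_out
    b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h
    b1 -= 0.5 * d_h.sum(axis=0)

print(out.round(2))                        # converges towards [[0],[1],[1],[0]]
```

The contribution the paper popularized is the backward pass: applying the chain rule layer by layer so that the same gradient-descent update reaches weights deep inside the network.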
David Andrew Patterson (born November 16, 1947) is
an American computer pioneer and academic who has held the
position of Professor of Computer Science at the University of
California, Berkeley since 1976. He announced his retirement in 2016
after nearly forty years of service, and became a distinguished engineer
at Google.[3][4] He is currently Vice Chair of the Board of Directors of
the RISC-V Foundation,[5] and the Pardee Professor of Computer
Science, Emeritus at UC Berkeley.
Patterson is noted for his pioneering contributions
to RISC processor design, having coined the term RISC and led
the Berkeley RISC project.[6] As of 2018, 99% of all new
chips use a RISC architecture.[7][8] He is also noted for leading the
research on RAID storage together with Randy Katz.[9]
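
The key mechanism behind RAID's redundancy is XOR parity: the parity block is the bitwise XOR of the data blocks, so any single lost block can be rebuilt from the rest. A minimal Python illustration (the block contents are arbitrary):

```python
import functools

# RAID-style parity in miniature: parity = XOR of all data blocks.
xor = lambda a, b: bytes(x ^ y for x, y in zip(a, b))

blocks = [b"\x01\x02", b"\xff\x00", b"\x10\x20"]
parity = functools.reduce(xor, blocks)

# Lose block 1, then rebuild it from the surviving blocks plus parity:
rebuilt = functools.reduce(xor, [blocks[0], blocks[2], parity])
assert rebuilt == blocks[1]
```

This single-parity scheme corresponds to the RAID levels that tolerate one disk failure; schemes tolerating more failures layer additional codes on the same idea.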
His books on computer architecture (co-authored with John L.
Hennessy) are widely used in computer science education. Along
with Hennessy, Patterson won the 2017 Turing Award for their work
in developing RISC.
