HISTORY OF PROGRAMMING LANGUAGES
PLANKALKUL
Konrad Zuse was a German civil and computer engineer. He invented the first
programmable computer to use binary code, the Z1, in 1938. During WWII he built
the Z2, an improved version of the Z1. Soon after, he constructed the Z3, the first
computer shown to be, in principle, a universal Turing machine and therefore
functional by modern standards. The fourth computer Zuse created, the Z4, gave him
the idea for Plankalkul. He realized that raw machine code was too complicated for
programmers to use easily, so he devised a language that would simplify the process
by using words that could be translated into machine code. He named this language
“Plankalkul”. Unfortunately, Zuse was never able to introduce Plankalkul to the
computing world, because without a compiler the language could not be used. It
nevertheless served as a model for languages that are still in use today.
Significance
Though it was never actually used, the significance of Plankalkul in computer history
is that it set an example for later programming languages such as Superplan, written
by Heinz Rutishauser, and even modern languages such as C++ and Java.
Without Zuse’s revolutionary ideas, programmers might be using machine code to
write programs, which would be much more complex and tedious. Thus Plankalkul
was a major invention in the history of computers.
FLOW-MATIC
(https://en.wikipedia.org/wiki/FLOW-MATIC)
Grace Hopper had found that business data processing customers were uncomfortable
with mathematical notation. In late 1953 she proposed that data processing
problems should be expressed using English keywords, but Rand management
considered the idea unfeasible. In early 1955, she and her team wrote a specification
for such a programming language and implemented a prototype. The FLOW-MATIC
compiler became publicly available in early 1958 and was substantially complete in
1959.
FORTRAN
(http://www.softwarepreservation.org/projects/FORTRAN/paper/p25-backus.pdf)
Before 1954 almost all programming was done in machine language or assembly
language.
Another factor which influenced the development of FORTRAN was the economics of
programming in 1954. The cost of programmers associated with a computer center
was usually at least as great as the cost of the computer itself. (This fact follows from
the average salary-plus-overhead and number of programmers at each center and
from the computer rental figures.) In addition, from one-quarter to one-half of the
computer's time was spent in debugging. Thus programming and debugging
accounted for as much as three-quarters of the cost of operating a computer; and obviously, as computers got
cheaper, this situation would get worse. This economic factor was one of the prime
motivations which led me to propose the FORTRAN project in a letter to my boss,
Cuthbert Hurd, in late 1953 (the exact date is not known but other facts suggest
December 1953 as a likely date). I believe that the economic need for a system like
FORTRAN was one reason why IBM and my successive bosses, Hurd, Charles
DeCarlo, and John McPherson, provided for our constantly expanding needs over the
next five years without ever asking us to project or justify those needs in a formal
budget.
LISP
(https://en.wikipedia.org/wiki/Lisp_(programming_language))
John McCarthy developed Lisp in 1958 while he was at the Massachusetts Institute
of Technology (MIT). McCarthy published its design in a paper in Communications of
the ACM in 1960, entitled "Recursive Functions of Symbolic Expressions and Their
Computation by Machine, Part I". He showed that with a few simple operators and a
notation for functions, one can build a Turing-complete language for algorithms.
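To give a flavor of that claim, here is a minimal, purely illustrative sketch (in Python, not McCarthy's original notation) of how a handful of operators plus a lambda notation already yields an evaluator for a tiny Lisp-like expression language; every name in it is invented for this example.

# Minimal, illustrative evaluator for a tiny Lisp-like language.
# Not McCarthy's eval; just a sketch of "a few operators + function notation".
import operator as op

def tokenize(src):
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    token = tokens.pop(0)
    if token == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(parse(tokens))
        tokens.pop(0)                      # drop the closing ")"
        return expr
    try:
        return int(token)
    except ValueError:
        return token                       # otherwise a symbol

ENV = {"+": op.add, "-": op.sub, "*": op.mul,
       "car": lambda x: x[0], "cdr": lambda x: x[1:],
       "cons": lambda a, b: [a] + b}

def evaluate(x, env=ENV):
    if isinstance(x, str):                 # symbol: look it up
        return env[x]
    if isinstance(x, int):                 # literal number
        return x
    if x[0] == "lambda":                   # (lambda (params) body)
        _, params, body = x
        return lambda *args: evaluate(body, {**env, **dict(zip(params, args))})
    fn = evaluate(x[0], env)               # application: evaluate, then apply
    return fn(*[evaluate(arg, env) for arg in x[1:]])

print(evaluate(parse(tokenize("((lambda (x y) (+ x (* 2 y))) 3 4)"))))  # prints 11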
ALGOL
(https://en.wikipedia.org/wiki/ALGOL_58)
There were proposals for a universal language by the Association for Computing
Machinery (ACM) and also by the German Gesellschaft für Angewandte Mathematik
und Mechanik ("Society of Applied Mathematics and Mechanics") (GAMM). It was
decided to organize a joint meeting to combine them. The meeting took place from
May 27 to June 2, 1958, at ETH Zurich and was attended by the following people:
Friedrich L. Bauer, Hermann Bottenbruch, Heinz Rutishauser, and Klaus
Samelson (from the GAMM)
John Backus, Charles Katz, Alan Perlis, and Joseph Henry Wegstein (from the ACM).
The language was originally proposed to be called IAL (International Algebraic
Language) but according to Perlis this was rejected as an "'unspeakable' and
pompous acronym". ALGOL was suggested instead, though not officially adopted
until a year later. The publication following the meeting still used the name IAL.
By the end of 1958 the ZMMD-group had built a working ALGOL 58 compiler for
the Z22 computer. ZMMD was an abbreviation for Zürich (where Rutishauser
worked), München (workplace of Bauer and Samelson), Mainz (location of the Z22
computer), Darmstadt (workplace of Bottenbruch).
ALGOL 58 saw some implementation effort at IBM, but the effort was in competition
with FORTRAN, and soon abandoned. It was also implemented at Dartmouth
College on an LGP-30, but that implementation soon evolved into Algol 60. An
implementation for the Burroughs 220 called BALGOL evolved along its own lines as
well, but retained much of ALGOL 58's original character.
ALGOL 58's primary contribution was to later languages; it was used as a basis
for JOVIAL, MAD, NELIAC and ALGO. It was also used during 1959 to
publish algorithms in CACM, beginning a trend of using ALGOL notation in
publication that continued for many years.
FACT
(https://en.wikipedia.org/wiki/FACT_(computer_language))
COBOL
(https://web.archive.org/web/20140107192608/http://home.comcast.net/
~wmklein/DOX/History.pdf)
In the late 1950s, computer users and manufacturers were becoming concerned
about the rising cost of programming. A 1959 survey had found that in any data
processing installation, the programming cost US$800,000 on average and that
translating programs to run on new hardware would cost $600,000. At a time when
new programming languages were proliferating at an ever-increasing rate, the same
survey suggested that if a common business-oriented language were used,
conversion would be far cheaper and faster.
COBOL is an industry language and is not
the property of any company or group of companies, or of any organization or group
of organizations.
RPG
(https://en.wikipedia.org/wiki/IBM_RPG)
RPG is one of the few languages created for punched card machines that are still in
common use today. This is because the language has evolved considerably over time.
It was originally developed by IBM in 1959. The name Report Program
Generator was descriptive of the purpose of the language: generation of reports
from data files, including matching record and sub-total reports.
APL
(https://en.wikipedia.org/wiki/APL_(programming_language))
Simula
(https://hannemyr.com/cache/knojd_acm78.pdf)
Kristen Nygaard started writing computer simulation programs in 1957. Nygaard saw
a need for a better way to describe the heterogeneity and the operation of a system.
To go further with his ideas on a formal computer language for describing a system,
Nygaard realized that he needed someone with more computer programming skills
than he had. Ole-Johan Dahl joined him in this work in January 1962. The decision to
link the language to ALGOL 60 was made shortly after. By May 1962 the main
concepts for a simulation language were set. "SIMULA I" was born, a special purpose
programming language for simulating discrete event systems.
Speakeasy
(https://en.wikipedia.org/wiki/Speakeasy_(computational_environment))
BASIC
(https://www.thoughtco.com/history-basic-programming-language-1991662)
In the 1960s, computing was done on gigantic mainframe machines, which required
special rooms with powerful air-conditioning to keep them cool. The mainframes
received their instructions on punch cards prepared by computer operators, and any
new task given to a mainframe required writing a new piece of software, which was
the realm of mathematicians and nascent computer scientists.
BASIC, a computer language written at Dartmouth College in 1963, would change
that.
Beginnings of BASIC
The language BASIC was an acronym for Beginner's All-Purpose Symbolic Instruction
Code. It was developed by Dartmouth mathematicians John George Kemeny and
Tom Kurtz as a teaching tool for undergraduates. BASIC was intended to be a
computer language for generalists to use to unlock the power of the computer in
business and other realms of academia. BASIC was traditionally one of the most
commonly used computer programming languages, considered an easy step for
students to learn before more powerful languages such as FORTRAN. Until very
recently, BASIC (in the form of Visual BASIC and Visual BASIC .NET) was the most
widely known computer language among developers.
By the mid-1980s, the mania for programming personal computers had subsided in
the wake of running professional software created by others. Developers also had
more options, such as the new computer languages of C and C++. But the
introduction of Visual Basic, written by Microsoft, in 1991, changed that. VB was
based on BASIC and relied on some of its commands and structure, and proved
valuable in many small business applications. BASIC .NET, released by Microsoft in
2001, matched the functionality of Java and C# with the syntax of BASIC.
PL/1
(https://en.wikipedia.org/wiki/PL/I#Early_history)
In the 1950s and early 1960s, business and scientific users programmed for different
computer hardware using different programming languages. Business users were
moving from Autocoders via COMTRAN to COBOL, while scientific users programmed
in General Interpretive Programme (GIP), Fortran, ALGOL, GEORGE, and others.
The IBM System/360 (announced in 1964 and delivered in 1966) was designed as a
common machine architecture for both groups of users, superseding all existing IBM
architectures. Similarly, IBM wanted a single programming language for all users. It
hoped that Fortran could be extended to include the features needed by commercial
programmers. In October 1963 a committee was formed composed originally of
three IBMers from New York and three members of SHARE, the IBM scientific users
group, to propose these extensions to Fortran. Given the constraints of Fortran, they
were unable to do this and embarked on the design of a "new programming
language" based loosely on ALGOL labeled "NPL". This acronym conflicted with that
of the UK's National Physical Laboratory and was replaced briefly by MPPL
(MultiPurpose Programming Language) and, in 1965, with PL/I (with a Roman
numeral "I"). The first definition appeared in April 1964.
IBM took NPL as a starting point and completed the design to a level that the first
compiler could be written: the NPL definition was incomplete in scope and in
detail. Control of the PL/I language was vested initially in the New York Programming
Center and later at the IBM UK Laboratory at Hursley. The SHARE and GUIDE user
groups were involved in extending the language and had a role in IBM's process for
controlling the language through their PL/I Projects. The experience of defining such
a large language showed the need for a formal definition of PL/I.
The goals for PL/I evolved during the early development of the language.
Competitiveness with COBOL's record handling and report writing capabilities was
needed. The "scope of usefulness" of the language grew to include system
programming and event-driven programming. The additional goals for PL/I were:
Performance of compiled code competitive with that of Fortran (but this was not
achieved).
Be extensible, for new hardware and new application areas
Improve the productivity and time scales of the programming process,
transferring effort from the programmer to the compiler
Be machine-independent and operate effectively across the main hardware and
operating system ranges
Usage
PL/I implementations were developed for mainframes from the late 1960s,
minicomputers in the 1970s, and personal computers in the 1980s and 1990s. Although
its main use has been on mainframes, there are PL/I versions for DOS, Microsoft
Windows, OS/2, AIX, OpenVMS, and Unix.
It has been widely used in business data processing and for system use for writing
operating systems on certain platforms. Very complex and powerful systems have
been built with PL/I:
The SAS System was initially written in PL/I; the SAS data step is still modeled on PL/I
syntax.
The pioneering online airline reservation system Sabre was originally written for the
IBM 7090 in assembler. The S/360 version was largely written using SabreTalk, a
purpose-built subset PL/I compiler for a dedicated control program.
MUMPS
(https://link.springer.com/chapter/10.1007/978-1-4612-3488-3_23)
C
(https://en.wikipedia.org/wiki/C_(programming_language)#Uses)
HISTORY
The origin of C is closely tied to the development of the Unix operating system,
originally implemented in assembly language on a PDP-7 by Dennis Ritchie and Ken
Thompson, incorporating several ideas from colleagues. Eventually, they decided to
port the operating system to a PDP-11. The original PDP-11 version of Unix was also
developed in assembly language. Thompson needed a programming language for
writing utilities. At first he tried to write a Fortran compiler, but he soon gave up the
idea and instead created a new language, B, his simplified version of BCPL. However,
few utilities were ever written in B because it was too slow, and B could not take
advantage of some of the PDP-11's features, such as byte addressability.
In 1972, Ritchie started to improve B, which resulted in the creation of a new
language, C. A C compiler and some utilities written in it were included in Version 2
Unix. In Version 4 Unix, released in November 1973, the Unix kernel was extensively
re-implemented in C.
USES
C++
(https://mathbits.com/MathBits/CompSci/Introduction/history.htm)
In the early 1970s, Dennis Ritchie of Bell Laboratories was engaged in a project to
develop a new operating system. Ritchie discovered that in order to accomplish his
task he needed the use of a programming language that was concise and that
produced compact and speedy programs. This need led Ritchie to develop the
programming language called C.
In the early 1980s, also at Bell Laboratories, another programming language was
created which was based upon the C language. This new language was developed
by Bjarne Stroustrup and was called C++. Stroustrup states that the purpose of C++
is to make writing good programs easier and more pleasant for the individual
programmer. When he designed C++, he added OOP (Object Oriented
Programming) features to C without significantly changing the C component. Thus
C++ is a "relative" (called a superset) of C, meaning that any valid C program is also a
valid C++ program.
APPLICATIONS OF C++
Real-World Applications of C++
1. Games:
C++ handles the complexities of 3D games, optimizes resource management, and
facilitates multiplayer with networking. The language is extremely fast, allows
procedural programming for CPU intensive functions and provides greater control
over hardware, because of which it has been widely used in development of gaming
engines. For instance, the science fiction game Doom 3 is cited as an example of a
game that used C++ well and the Unreal Engine, a suite of game development tools,
is written in C++.
2. Graphic User Interface (GUI) based applications:
Many highly used applications, such as Adobe ImageReady, Adobe Premiere,
Photoshop, and Illustrator, are written in C++.
3. Web Browsers:
With the introduction of specialized languages such as PHP and Java, the adoption of
C++ is limited for scripting of websites and web applications. However, where speed
and reliability are required, C++ is still preferred. For instance, a part of Google’s
back-end is coded in C++, and the rendering engines of several open-source projects,
such as the web browser Mozilla Firefox and the email client Mozilla Thunderbird,
are also written in C++.
4. Advanced Computations and Graphics:
C++ provides the means for building applications requiring real-time physical
simulations, high-performance image processing, and mobile sensor applications.
Maya 3D software, used for integrated 3D modeling, visual effects and animation, is
coded in C++.
5. Database Software:
C++ and C have been used to write MySQL, one of the most popular database
management systems. The software forms the backbone of a variety of database-
driven enterprises, such as Google, Wikipedia, Yahoo, and YouTube.
6. Operating Systems:
C++ forms an integral part of many of the prevalent operating systems including
Apple’s OS X and various versions of Microsoft Windows, and the erstwhile Symbian
mobile OS.
7. Enterprise Software:
C++ finds a purpose in banking and trading enterprise applications, such as those
deployed by Bloomberg and Reuters. It is also used in development of advanced
software, such as flight simulators and radar processing.
8. Medical and Engineering Applications:
Much advanced medical equipment, such as MRI machines, runs software written in
C++. It is also part of engineering applications, such as high-end
CAD/CAM systems.
9. Compilers:
A host of compilers, including Apple C++, Bloodshed Dev-C++, Clang C++, and MinGW,
support the C++ language. C and its successor C++ are leveraged for diverse
software and platform development requirements, from operating systems to
graphic designing applications. Further, these languages have assisted in the
development of new languages for special purposes like C#, Java, PHP, Verilog etc.
OBJECTIVE C
(https://en.wikipedia.org/wiki/Objective-C#History)
Objective-C was created primarily by Brad Cox and Tom Love in the early 1980s at
their company Stepstone. Both had been introduced to Smalltalk while at ITT
Corporation's Programming Technology Center in 1981. The earliest work on
Objective-C traces back to around that time. Cox was intrigued by problems of
true reusability in software design and programming. He realized that a language like
Smalltalk would be invaluable in building development environments for system
developers at ITT. However, he and Tom Love also recognized that backward
compatibility with C was critically important in ITT's telecom engineering milieu.
Cox began writing a pre-processor for C to add some of the abilities of Smalltalk. He
soon had a working implementation of an object-oriented extension to
the C language, which he called "OOPC" for Object-Oriented Pre-Compiler. Love
was hired by Schlumberger Research in 1982 and had the opportunity to acquire the
first commercial copy of Smalltalk-80, which further influenced the development of
their brainchild.
In order to demonstrate that real progress could be made, Cox showed that making
interchangeable software components really needed only a few practical changes to
existing tools. Specifically, they needed to support objects in a flexible manner, come
supplied with a usable set of libraries, and allow for the code (and any resources
needed by the code) to be bundled into one cross-platform format.
Love and Cox eventually formed a new venture, Productivity Products
International (PPI), to commercialize their product, which coupled an Objective-C
compiler with class libraries. In 1986, Cox published the main description of
Objective-C in its original form in the book Object-Oriented Programming, An
Evolutionary Approach. Although he was careful to point out that there is more to
the problem of reusability than just the language, Objective-C often found itself
compared feature for feature with other languages.
C#
C# is a higher-level language than C, adding object-oriented features that C lacks. It is
an object-oriented language developed by Microsoft.
PASCAL
(Niklaus Wirth: The Programming Language Pascal. 35–63, Acta Informatica, Volume
1, 1971.)
Much of the history of computer language design during the 1960s can be traced to
the ALGOL 60 language. ALGOL was developed during the 1950s with the explicit
goal to be able to clearly describe algorithms. It included a number of features
for structured programming that remain common in languages to this day.
Shortly after its introduction, in 1962 Wirth began working on his dissertation with
Helmut Weber on the Euler programming language. Euler was based on ALGOL's
syntax and many concepts but was not a derivative. Its primary goal was to add
dynamic lists and types, allowing it to be used in roles similar to Lisp. The language
was published in 1965.
By this time, a number of problems in ALGOL had been identified, notably the lack of
a standardized string system. The group tasked with maintaining the language had
begun the ALGOL X process to identify improvements, calling for submissions. Wirth
and Tony Hoare submitted a conservative set of modifications to add strings and
clean up some of the syntax. These were considered too minor to be worth using as
the new standard ALGOL, so Wirth wrote a compiler for the language, which became
known as ALGOL W.
The ALGOL X efforts would go on to choose a dramatically more complex
language, ALGOL 68. The complexity of this language led to considerable difficulty
producing high-performance compilers, and it was not widely used in the industry.
This left an opening for newer languages.
Pascal was influenced by the ALGOL W efforts, with the explicit goals of producing a
language that would be efficient both in the compiler and at run-time, allow for the
development of well-structured programs, and to be useful for teaching
students structured programming. A generation of students used Pascal as an
introductory language in undergraduate courses.
One of the early successes for the language was the introduction of UCSD Pascal, a
version that ran on a custom operating system that could be ported to different
platforms. A key platform was the Apple II, where it saw widespread use. This led to
Pascal becoming the primary high-level language used for development of
the Apple Lisa and, later, the Macintosh. Parts of the original Macintosh operating
system were hand-translated into Motorola 68000 assembly language from the
Pascal sources.
The typesetting system TeX by Donald E. Knuth was written in WEB, the
original literate programming system, based on DEC PDP-10 Pascal. Successful
commercial applications like Adobe Photoshop were written in Macintosh
Programmer's Workshop Pascal, while applications like Total Commander, Skype,
and Macromedia Captivate were written in Delphi (Object Pascal). Apollo Computer
used Pascal as the systems programming
language for its operating systems beginning in 1980.
Variants of Pascal have also been used for everything from research projects to PC
games and embedded systems. Newer Pascal compilers exist which are widely used.
.NET
(https://dotnet.microsoft.com/learn/dotnet/what-is-dotnet)
.NET is a free, cross-platform, open source developer platform for building many
different types of applications.
With .NET, you can use multiple languages, editors, and libraries to build for web,
mobile, desktop, gaming, and IoT.
Languages
You can write .NET apps in C#, F#, or Visual Basic.
F#
(https://fsharp.org/about/index.html)
Java
(https://books.google.co.in/books?
id=96gMhjaAviMC&printsec=frontcover&dq=Core+Java,+Volume+I+-
+Fundamentals+(9th+edition)+by+Cay+Horstmann+and+Gary+Cornell
%27&hl=en&sa=X&ved=0ahUKEwi-
xKHUjM_hAhWd4HMBHWKsBbkQ6AEIMDAB#v=onepage&q&f=false)
James Gosling, Mike Sheridan, and Patrick Naughton initiated the Java language
project in June 1991. Java was originally designed for interactive television, but it was
too advanced for the digital cable television industry at the time. The language was
initially called Oak after an oak tree that stood outside Gosling's office. Later the
project went by the name Green and was finally renamed Java, from Java
coffee. Gosling designed Java with a C/C++-style syntax that system and application
programmers would find familiar.
Sun Microsystems released the first public implementation as Java 1.0 in 1996. It
promised "Write Once, Run Anywhere" (WORA), providing no-cost run-times on
popular platforms. Fairly secure and featuring configurable security, it allowed
network- and file-access restrictions. Major web browsers soon incorporated the
ability to run Java applets within web pages, and Java quickly became popular.
PYTHON
(https://docs.python.org/2/faq/general.html)
Here’s a very brief summary of what started it all, written by Guido van Rossum:
I had extensive experience with implementing an interpreted language in the ABC
group at CWI, and from working with this group I had learned a lot about language
design. This is the origin of many Python features, including the use of indentation
for statement grouping and the inclusion of very-high-level data types (although the
details are all different in Python).
I had a number of gripes about the ABC language, but also liked many of its features.
It was impossible to extend the ABC language (or its implementation) to remedy my
complaints – in fact its lack of extensibility was one of its biggest problems. I had
some experience with using Modula-2+ and talked with the designers of Modula-3
and read the Modula-3 report. Modula-3 is the origin of the syntax and semantics
used for exceptions, and some other Python features.
I was working in the Amoeba distributed operating system group at CWI. We needed
a better way to do system administration than by writing either C programs or
Bourne shell scripts, since Amoeba had its own system call interface which wasn’t
easily accessible from the Bourne shell. My experience with error handling in
Amoeba made me acutely aware of the importance of exceptions as a programming
language feature.
It occurred to me that a scripting language with a syntax like ABC but with access to
the Amoeba system calls would fill the need. I realized that it would be foolish to
write an Amoeba-specific language, so I decided that I needed a language that was
generally extensible.
During the 1989 Christmas holidays, I had a lot of time on my hands, so I decided to
give it a try. During the next year, while still mostly working on it in my own time,
Python was used in the Amoeba project with increasing success, and the feedback
from colleagues made me add many early improvements.
In February 1991, after just over a year of development, I decided to post to USENET.
The rest is in the Misc/HISTORY file.
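As a small illustration of the features van Rossum mentions above — indentation for statement grouping, very-high-level built-in data types, and exceptions — here is a short sketch; the release data in it is just an example.

# Indentation, not braces or BEGIN/END keywords, groups the statements below.
releases = {"Python 0.9.0": 1991, "Python 2.0": 2000, "Python 3.0": 2008}  # a high-level dict type

for name, year in releases.items():
    if year < 2000:
        print(name, "is an early release")
    else:
        print(name, "came later")

# Exceptions (whose syntax van Rossum credits to Modula-3) make errors explicit.
try:
    print(releases["Python 1.5"])
except KeyError as missing:
    print("no such release recorded:", missing)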
HTML
(https://html.com/#What_is_HTML)
What is HTML?
Okay, so this is the only bit of mandatory theory. In order to begin to write HTML, it
helps if you know what you are writing.
HTML is the language in which most websites are written. HTML is used to create
pages and make them functional.
The code used to make them visually appealing is known as CSS and we shall focus
on this in a later tutorial. For now, we will focus on teaching you how to build rather
than design.
The History of HTML
HTML was first created by Tim Berners-Lee, Robert Cailliau, and others starting
in 1989. It stands for Hyper Text Markup Language.
Hypertext means that the document contains links that allow the reader to jump to
other places in the document or to another document altogether. The latest version
is known as HTML5.
A markup language is a way of annotating a document with tags that control how its
text is processed and presented.
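To make those two terms concrete, here is a small sketch (written in Python, like the other sketches in this document, with an invented file name and page content) that writes out a minimal HTML document: the tags are the markup, and the anchor element is the hypertext link.

# Illustrative only: the file name and page content are made-up examples.
page = """<!DOCTYPE html>
<html>
  <head><title>A tiny HTML page</title></head>
  <body>
    <h1>Hello, HTML</h1>  <!-- markup: a heading element -->
    <p>This paragraph contains a
       <a href="https://example.org">hypertext link</a>
       that jumps to another document.</p>
  </body>
</html>
"""

with open("tiny.html", "w", encoding="utf-8") as f:
    f.write(page)
print("Open tiny.html in a browser to see the rendered markup.")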
CSS
(https://www.w3.org/standards/webdesign/htmlcss)
What is CSS?
CSS is the language for describing the presentation of Web pages, including colors,
layout, and fonts. It allows one to adapt the presentation to different types of
devices, such as large screens, small screens, or printers. CSS is independent of HTML
and can be used with any XML-based markup language. The separation of HTML
from CSS makes it easier to maintain sites, share style sheets across pages, and tailor
pages to different environments. This is referred to as the separation of structure
(or: content) from presentation.
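As a rough sketch of that separation (again in Python, with invented file names, selectors, and colors), the HTML below carries only the structure while a separate stylesheet carries the presentation; swapping the stylesheet restyles the page without touching the content.

# Illustrative only: the selectors, colors, and file names are invented.
style = """body { font-family: sans-serif; }
h1   { color: darkblue; }
p    { color: #333333; max-width: 40em; }
"""

page = """<!DOCTYPE html>
<html>
  <head>
    <title>Structure vs. presentation</title>
    <link rel="stylesheet" href="style.css">  <!-- presentation kept separate -->
  </head>
  <body>
    <h1>Hello, CSS</h1>
    <p>The same HTML can be restyled for large screens, small screens,
       or print just by swapping the stylesheet.</p>
  </body>
</html>
"""

with open("style.css", "w", encoding="utf-8") as f:
    f.write(style)
with open("styled.html", "w", encoding="utf-8") as f:
    f.write(page)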
Javascript
(https://web.archive.org/web/20070916144913/http://wp.netscape.com/newsref/
pr/newsrelease67.html)
PHP
(https://www.php.net/manual/en/intro-whatis.php)
PERL
(https://perldoc.perl.org/perlintro.html#What-is-Perl%3f)
SQL
(https://www.britannica.com/technology/SQL)
SQL, in full Structured Query Language, is a computer language designed for eliciting
information from databases.
In the 1970s computer scientists began developing a standardized way to
manipulate databases, and out of that research came SQL. The late 1970s and early
’80s saw the release of a number of SQL-based products. SQL gained popularity
when the American National Standards Institute (ANSI) adopted the first SQL
standard in 1986. Continued work on relational databases led to improvements in
SQL, making it one of the most popular database languages in existence. Some
large software companies, such as Microsoft Corporation and Oracle Corporation,
produced their own versions of SQL, and an open-source version, MySQL, became
extremely popular.
SQL works by providing a way for programmers and other computer users to get
desired information from a database using something resembling normal English. On
the simplest level, SQL consists of only a few commands: Select, which grabs data;
Insert, which adds data to a database; Update, which changes information; and
Delete, which deletes information. Other commands exist to create, modify, and
administer databases.
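As a minimal sketch of those four commands in action, using Python's built-in sqlite3 module and an invented table, purely for illustration:

# Illustrative sketch of INSERT / UPDATE / DELETE / SELECT.
# The table and its contents are invented for this example.
import sqlite3

conn = sqlite3.connect(":memory:")        # throwaway in-memory database
conn.execute("CREATE TABLE languages (name TEXT, year INTEGER)")

# Insert: add data to the database
conn.executemany("INSERT INTO languages VALUES (?, ?)",
                 [("FORTRAN", 1957), ("LISP", 1960), ("COBOL", 1959)])

# Update: change existing information
conn.execute("UPDATE languages SET year = 1958 WHERE name = 'LISP'")

# Delete: remove information
conn.execute("DELETE FROM languages WHERE name = 'COBOL'")

# Select: grab data back out
for name, year in conn.execute("SELECT name, year FROM languages ORDER BY year"):
    print(name, year)

conn.close()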
SQL is used in everything from government databases to e-commerce sites on
the Internet. As the popularity of SQL grew, programmers and computer scientists
continued to optimize the way that relational databases work.
NO SQL
(https://searchdatamanagement.techtarget.com/definition/NoSQL-Not-Only-SQL)
NoSQL is an approach to database design that can accommodate a wide variety of data
models, including key-value, document, columnar, and graph formats. NoSQL, which
stands for "not only SQL," is an alternative to traditional relational databases in which
data is placed in tables and data schema is carefully designed before the database is
built. NoSQL databases are especially useful for working with large sets of distributed
data.
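As a very rough illustration of the key-value and document models mentioned above (plain Python dictionaries standing in for a real NoSQL store; all keys and records are invented):

# Illustrative only: ordinary dictionaries stand in for real NoSQL stores.

# Key-value model: one opaque value per key, looked up by exact key.
kv_store = {
    "session:42": "user=alice;expires=2024-01-01",
    "session:43": "user=bob;expires=2024-01-02",
}
print(kv_store["session:42"])

# Document model: each record is a self-describing document; fields need not
# match a fixed schema designed before the database is built.
documents = [
    {"_id": 1, "name": "alice", "languages": ["Python", "SQL"]},
    {"_id": 2, "name": "bob", "city": "Zurich"},   # different fields, no rigid schema
]
print([doc["name"] for doc in documents if "city" in doc])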
NoSQL vs. RDBMS
The NoSQL term can be applied to some databases that predated the relational
database management system, but it more commonly refers to the databases built
in the early 2000s for the purpose of large-scale database clustering in cloud and
web applications. In these applications, requirements for performance and scalability
outweighed the need for the immediate, rigid data consistency that the RDBMS
provided to transactional enterprise applications.
Notably, the NoSQL systems were not required to follow an established relational
schema. Large-scale web organizations such as Google and Amazon used NoSQL
databases to focus on narrow operational goals and employ relational databases as
adjuncts where high-grade data consistency is necessary.
Early NoSQL databases for web and cloud applications tended to focus on very
specific characteristics of data management. The ability to process very large
volumes of data and quickly distribute that data across computing clusters were
desirable traits in web and cloud design. Developers who implemented cloud and
web systems also looked to create flexible data schema -- or no schema at all -- to
better enable fast changes to applications that were continually updated.
SWIFT
(https://docs.swift.org/swift-book/)
Swift is a fantastic way to write software, whether it’s for phones, desktops, servers,
or anything else that runs code. It’s a safe, fast, and interactive programming
language that combines the best in modern language thinking with wisdom from the
wider Apple engineering culture and the diverse contributions from its open-source
community. The compiler is optimized for performance and the language is
optimized for development, without compromising on either.
Swift is friendly to new programmers. It’s an industrial-quality programming
language that’s as expressive and enjoyable as a scripting language. Writing Swift
code in a playground lets you experiment with code and see the results immediately,
without the overhead of building and running an app.
Swift defines away large classes of common programming errors by adopting
modern programming patterns:
Variables are always initialized before use.
Array indices are checked for out-of-bounds errors.
Integers are checked for overflow.
Optionals ensure that nil values are handled explicitly.
Memory is managed automatically.
Error handling allows controlled recovery from unexpected failures.
Swift code is compiled and optimized to get the most out of modern hardware. The
syntax and standard library have been designed based on the guiding principle that
the obvious way to write your code should also perform the best. Its combination of
safety and speed makes Swift an excellent choice for everything from “Hello, world!”
to an entire operating system.
Swift combines powerful type inference and pattern matching with a modern,
lightweight syntax, allowing complex ideas to be expressed in a clear and concise
manner. As a result, code is not just easier to write, but easier to read and maintain
as well.
Swift has been years in the making, and it continues to evolve with new features and
capabilities. Our goals for Swift are ambitious. We can’t wait to see what you create
with it.