Lesson 3 Notes - Early History of Multimedia

EARLY HISTORY OF MULTIMEDIA

A brief history of the use of multimedia to communicate ideas might begin with newspapers,
which were perhaps the first mass communication medium, using text, graphics, and images.
Before the still-image camera was invented, these graphics and images were generally
hand-drawn.

Joseph Nicéphore Niépce captured the first natural image from his window in 1826 using a
sliding wooden box camera. It was made using an 8-hour exposure on pewter coated with
bitumen. Later, Alphonse Giroux built the first commercial camera with a double-box design.
It had an outer box fitted with a landscape lens, and an inner box holding a ground-glass
focusing screen and image plate. Sliding the inner box allowed objects at different distances
to be brought into focus. Similar cameras were used for exposing wet silver-surfaced copper
plates, commercially introduced in 1839. In the 1870s, wet plates were replaced by the more
convenient dry plates. Figure 1.1 (image from the author's own collection) shows an example of a
nineteenth-century dry-plate camera, with bellows for focusing. By the end of the nineteenth
century, film-based cameras were introduced; these soon became dominant until they were
replaced by digital cameras.

Fig. 1.1 A vintage dry-plate camera. E&H T Anthony model Champion, circa 1890

Thomas Alva Edison’s phonograph, invented in 1877, was the first device that was able to
record and reproduce sound. It originally recorded sound onto a tinfoil sheet phonograph
cylinder.

Figure 1.2 shows an example of an Edison phonograph (Edison GEM, 1905; image from
the author's own collection).
The phonograph was later improved by Alexander Graham Bell. The most notable
improvements included wax-coated cardboard cylinders and a cutting stylus that moved from
side to side in a "zigzag" pattern across the record. Emile Berliner further transformed
phonograph cylinders into gramophone records. Each side of such a flat disk has a spiral groove
running from the periphery to near the center, which can be conveniently played by a turntable
with a tonearm and a stylus. These components were improved over time during the twentieth
century, eventually enabling sound reproduction of a quality very close to the original.
The gramophone record was one of the dominant audio recording formats throughout much
of the twentieth century. From the mid-1980s, phonograph use declined sharply because of the
rise of audio tapes, and later the Compact Disc (CD) and other digital recording formats.
Figure 1.3 shows the evolution of audio storage media, starting from the Edison cylinder
record, to the flat vinyl record, to magnetic tapes (reel-to-reel and cassette), and modern
digital CD.
Motion pictures were originally conceived of in the 1830s to observe motion too rapid for
perception by the human eye. Edison again commissioned the invention of a motion picture
camera in 1887. Silent feature films appeared from 1910 to 1927; the silent era effectively
ended with the release of The Jazz Singer in 1927.

Fig. 1.2 An Edison phonograph, model GEM. Note the patent plate in the bottom picture, which
suggests that the importance of patents had long been realized and also how serious Edison
was in protecting his inventions. Despite the warnings in the plate, this particular phonograph was
modified by the original owner, a good DIYer 100 years ago, to include a more powerful spring
motor from an Edison Standard model and a large flower horn from the Tea Tray Company

Fig. 1.3 Evolution of audio storage media. Left to right: an Edison cylinder record, a flat vinyl
record, a reel-to-reel magnetic tape, a cassette tape, and a CD
In 1895, Guglielmo Marconi conducted the first wireless radio transmission at Pontecchio,
Italy, and a few years later (1901), he detected radio waves beamed across the Atlantic.
Initially invented for telegraphy, radio is now a major medium for audio broadcasting. In 1909,
Marconi shared the Nobel Prize for Physics.
Television, or TV for short, was the new medium for the twentieth century. In 1884, Paul
Gottlieb Nipkow, a 23-year-old university student in Germany, patented the first
electromechanical television system, which employed a spinning disk with a series of holes
spiraling toward the center. The holes were spaced at equal angular intervals such that, in a
single rotation, the disk would allow light to pass through each hole and onto a light-sensitive
selenium sensor that produced electrical pulses. As an image was focused on the rotating
disk, each hole captured a horizontal “slice” of the whole image. Nipkow’s design would not be
practical until advances in amplifier tube technology, in particular, the cathode ray tube (CRT),
became available in 1907. Commercially available since the late 1920s, CRT-based TV
established video as a commonly available medium and has since changed the world of mass
communication.
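
To make the scanning idea concrete, the short Python sketch below (an illustration added here, not part of the original lesson; the image values and the scan_image name are invented for the example) shows how reading a picture one horizontal slice at a time turns a two-dimensional image into a one-dimensional sequence of brightness values, which is essentially what the rotating disk did mechanically.

    # Illustrative sketch of mechanical raster scanning (Nipkow-disk style):
    # a 2D brightness grid is read out one horizontal slice at a time,
    # producing the 1D signal that the selenium sensor would emit.

    def scan_image(image):
        """Flatten a 2D brightness grid into the 1D signal of one disk rotation."""
        signal = []
        for row in image:           # each "hole" sweeps one horizontal slice
            for brightness in row:  # light level at successive points along the slice
                signal.append(brightness)
        return signal

    # A toy 3x4 "scene" with brightness values in the range 0-255.
    scene = [
        [  0,  64, 128, 255],
        [ 32,  96, 160, 224],
        [ 16,  80, 144, 208],
    ]

    print(scan_image(scene))  # one rotation -> 12 sequential brightness values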

All the media mentioned above are in analog format: the time-varying feature (variable) of the
signal is a continuous representation of the input, i.e., analogous to the input audio, image, or
video signal. The connection between computers and digital media, i.e., media data represented
in a discrete binary format, actually emerged over only a short period (a brief sketch of how an
analog signal becomes digital data follows the timeline below):
1967 Nicholas Negroponte formed the Architecture Machine Group at MIT.
1969 Nelson and van Dam at Brown University created an early hypertext editor called FRESS.
The present-day Intermedia project by the Institute for Research in Information and
Scholarship (IRIS) at Brown is the descendant of that early system.
1976 The MIT Architecture Machine Group proposed a project entitled “Multiple Media.” This
resulted in the Aspen Movie Map, the first videodisk, in 1978.
1982 The Compact Disc (CD) was made commercially available by Philips and Sony; it soon
became the standard and most popular medium for digital audio data, replacing analog
magnetic tape.
1985 Negroponte and Wiesner co-founded the MIT Media Lab, a leading research institution
investigating digital video and multimedia.
1990 Kristina Hooper Woolsey headed the Apple Multimedia Lab, with a staff of
100. Education was a chief goal.
1991 MPEG-1 was approved as an international standard for digital video. Its further development led
to newer standards, MPEG-2, MPEG-4, and further MPEGs, in the 1990s.

(A historical aside: Reginald A. Fessenden, of Quebec, beat Marconi to human voice
transmission by several years, but not all inventors receive due credit. Nevertheless, Fessenden
was paid $2.5 million in 1928 for his purloined patents.)

1991 The introduction of PDAs in 1991 began a new period in the use of computers in general and
multimedia in particular. This development continued in 1996 with the marketing of the first
PDA with no keyboard.
1992 JPEG was accepted as the international standard for digital image compression, which remains
widely used today (e.g., by virtually every digital camera).
1992 The first audio multicast on the multicast backbone (MBone) was made.
1995 The Java language was created for platform-independent application development, and it was
widely used for developing multimedia applications.
1996 DVD video was introduced; high-quality, full-length movies were distributed on a single disk.
The DVD format promised to transform the music, gaming, and computer industries.
1998 Handheld MP3 audio players were introduced to the consumer market, initially with 32 MB of
flash memory.
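
Returning to the analog-versus-digital distinction noted before the timeline, the short Python sketch below (an illustration added here, not part of the original lesson; the digitize function and its parameters are invented for the example) shows the two steps behind every digital medium above: sampling a continuous signal at regular time instants, and quantizing each sample to one of a fixed number of binary levels.

    # Illustrative sketch of analog-to-digital conversion: sampling + quantization.
    import math

    def digitize(signal, duration_s, sample_rate_hz, bits):
        """Sample a continuous function signal(t) and quantize each sample to `bits` bits."""
        levels = 2 ** bits
        n_samples = int(duration_s * sample_rate_hz)
        samples = []
        for n in range(n_samples):
            t = n / sample_rate_hz                 # sampling: pick discrete time instants
            x = signal(t)                          # analog value, assumed in [-1, 1]
            q = round((x + 1) / 2 * (levels - 1))  # quantization: map to a discrete level
            samples.append(q)
        return samples

    # Example: a 440 Hz tone "recorded" at 8 kHz with 8 bits per sample.
    tone = lambda t: math.sin(2 * math.pi * 440 * t)
    print(digitize(tone, duration_s=0.001, sample_rate_hz=8000, bits=8))

CD audio, for instance, follows exactly this pattern, with 44,100 samples per second and 16 bits per sample.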

Hypermedia, WWW, and Internet

The early studies laid a solid foundation for the capture, representation, compression, and
storage of each type of media. Multimedia, however, is not simply about putting different
media together; rather, it focuses on integrating them so as to enable rich interaction
among them, and also between media and human beings.
In 1945, as part of MIT’s postwar deliberations on what to do with all those scientists
employed on the war effort, Vannevar Bush wrote a landmark article describing what
amounts to a hypermedia system, called “Memex.” Memex was meant to be a universally
useful and personalized memory device that even included the concept of associative links—
it really is the forerunner of the World Wide Web. After World War II, 6,000 scientists who
had been hard at work on the war effort suddenly found themselves with time to consider
other issues, and the Memex idea was one fruit of that new freedom.
In the 1960s, Ted Nelson started the Xanadu project and coined the term hypertext. Xanadu
was the first attempt at a hypertext system—Nelson called it a “magic place of literary
memory.”
We may think of a book as a linear medium, basically meant to be read from beginning to
end. In contrast, a hypertext system is meant to be read nonlinearly, by following links that
point to other parts of the document, or indeed to other documents. Figure 1.4 illustrates this
familiar idea.
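
As a toy illustration of this nonlinear structure (added here, not part of the original lesson; the page names and link choices are invented for the example), a hypertext can be modeled as a set of nodes, each with its own outgoing links, so that a "reading" is simply one path through the links rather than a front-to-back traversal:

    # Illustrative only: a hypertext modeled as nodes with outgoing links.
    pages = {
        "intro":     {"text": "A book is read linearly...",       "links": ["hypertext", "memex"]},
        "memex":     {"text": "Bush's associative memory device", "links": ["hypertext"]},
        "hypertext": {"text": "Text read by following links",     "links": ["xanadu"]},
        "xanadu":    {"text": "Nelson's 'magic place of literary memory'", "links": []},
    }

    def follow(start, choices):
        """Trace one reader's nonlinear path: at each page, jump to the chosen link."""
        path = [start]
        page = start
        for choice in choices:
            page = pages[page]["links"][choice]  # a jump, not the "next page"
            path.append(page)
        return path

    # Two readers starting at the same page can take entirely different routes.
    print(follow("intro", [1, 0, 0]))  # intro -> memex -> hypertext -> xanadu
    print(follow("intro", [0, 0]))     # intro -> hypertext -> xanadu
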
Douglas Engelbart, greatly influenced by Vannevar Bush's "As We May Think,"
demonstrated the On-Line System (NLS), another early hypertext program, in 1968.
Engelbart's group at Stanford Research Institute aimed at "augmentation, not automation,"
to enhance human abilities through computer technology. NLS consisted of such critical ideas
as an outline editor for idea development, hypertext links, teleconferencing, word processing,
and email, and made use of the mouse pointing device, windowing software, and help
systems.

Hypermedia, a term also first introduced by Ted Nelson, goes beyond text only. It includes a
wide array of media, such as graphics, images, and especially the continuous media (sound
and video), and links them together. The World Wide Web (WWW, or simply the Web) is the
best, and also the largest, example of a hypermedia application.
Amazingly, this most predominant networked multimedia application has its roots in
nuclear physics! In 1990, Tim Berners-Lee proposed the World Wide Web to CERN
(European Center for Nuclear Research) as a means for organizing and sharing their work
and experimental results. With approval from CERN, he started developing a hypertext
server, browser, and editor on a NeXTStep workstation. His team invented the Hypertext
Markup Language (HTML) and the Hypertext Transfer Protocol (HTTP) for this purpose,
too.

Multimedia in the New Millennium


Entering the new millennium, we have witnessed a fast evolution toward a new generation
of social, mobile, and cloud computing for multimedia processing and sharing. Today, the
role of the Internet has evolved from its original use as a communication tool to providing
easier and faster sharing of an infinite supply of information, and the multimedia content
itself has also been greatly enriched. High-definition and even 3D/multiview videos can be
readily captured and browsed on personal computing devices, and conveniently stored and
processed with remote cloud resources. More importantly, users are now actively engaged
as part of a social ecosystem, rather than passively receiving media content. The revolution
is being driven further by the deep penetration of 3G/4G wireless networks and smart mobile
devices. With highly intuitive interfaces and exceptionally rich multimedia functionalities,
these devices have been seamlessly integrated with online social networking for instant
media content generation and sharing.

Below, we list some important milestones in the development of multimedia in the new
millennium. We believe that most readers of this textbook are familiar with them, as we are
all living in this Internet age and witnessing its dramatic changes; many readers, particularly
of the younger generation, may be even more familiar than the authors with such multimedia
services as YouTube, Facebook, and Twitter.

2000 The size of the WWW was estimated at over one billion pages. Sony unveiled the first
Blu-ray Disc prototypes in October 2000, and the first prototype player was released in April
2003 in Japan.
2001 The first peer-to-peer file sharing (mostly MP3 music) system, Napster, was shut down by
court order, but many new peer-to-peer file sharing systems, e.g., Gnutella, eMule, and
BitTorrent, were launched in the following years. CoolStreaming was the first large-scale
peer-to-peer streaming system deployed in the Internet, attracting over one million users
in 2004. Later years saw the booming of many commercial peer-to-peer TV systems, e.g.,
PPLive, PPStream, and UUSee, particularly in East Asia. NTT DoCoMo in Japan launched
the first commercial 3G wireless network on October 1. 3G then started to be deployed
worldwide, promising broadband wireless mobile data transfer for multimedia data.
2003 Skype was released for free peer-to-peer voice over the Internet.
2004 Web 2.0 was recognized as a new way in which software developers and end-users use the
Web (and not as a technical specification for a new Web). The idea is to promote user
collaboration and interaction so as to generate content in a “virtual community,” as opposed
to simply passively viewing content. Examples include social networking, blogs, wikis, etc.
Facebook, the most popular online social network, was founded by Mark Zuckerberg. Flickr,
a popular photo hosting and sharing site, was created by Ludicorp, a Vancouver-based
company founded by Stewart Butterfield and Caterina Fake.
2005 YouTube was created, providing an easy portal for video sharing, which was purchased by
Google in late 2006. Google launched the online map service, with satellite imaging, real-
time traffic, and Streetview being added later.
2006 Twitter was created, and rapidly gained worldwide popularity, reaching 500 million registered
users in 2012, who posted 340 million tweets per day. In 2012, Twitter offered the Vine mobile
app, which enables its users to create and post short video clips of up to 6 s. Amazon launched
its cloud computing platform, Amazon Web Services (AWS). The most central and well-known
of these services are Amazon EC2 and Amazon S3. Nintendo introduced the Wii home video
game console, whose remote controller can detect movement in three dimensions.
2007 Apple launched the first-generation iPhone, running the iOS mobile operating system. Its
touch screen enabled very intuitive operations, and the associated App Store offered
numerous mobile applications. Google unveiled the Android mobile operating system, along
with the founding of the Open Handset Alliance: a consortium of hardware, software, and
telecommunication companies devoted to advancing open standards for mobile devices. The
first Android-powered phone was sold in October 2008, and Google Play, Android's primary
app store, was soon launched. In the following years, tablet computers using iOS, Android,
and Windows with larger touch screens joined the ecosystem, too.
2009 The first LTE (Long Term Evolution) network was set up in Oslo, Norway, and Stockholm,
Sweden, marking an important step toward 4G wireless networking. James Cameron's film
Avatar created a surge of interest in 3D video.
2010 Netflix, which used to be a DVD rental service provider, migrated its infrastructure to the
Amazon AWS cloud computing platform and became a major online streaming video provider.
Master copies of digital films from movie studios are stored on Amazon S3, and each film is
encoded into over 50 different versions based on video resolution and audio quality, using
machines in the cloud. In total, Netflix has over 1 petabyte of data stored on Amazon's cloud.
Microsoft introduced Kinect, a horizontal bar with full-body 3D motion capture, facial
recognition, and voice recognition capabilities, for its game console Xbox 360.
2012 HTML5 subsumes the previous version, HTML4, which was standardized in 1997. HTML5
is a W3C “Candidate Recommendation.” It is meant to provide support for the latest
multimedia formats while maintaining consistency for current web browsers and devices, along
with the ability to run on low-powered devices such as smartphones and tablets.
2013 Sony released its PlayStation 4, a video game console that is to be integrated with Gaikai, a
cloud-based gaming service that offers streaming video game content. 4K resolution TV
started to be available in the consumer market.

