
Module 1

Introduction to multimedia

Objectives

At the end of this module you should be able to do the following:

 State the different media modalities and give examples of how a piece of
information can be represented using different modalities
 Make distinctions between different definitions of the terms "media" and
"multimedia", describe what motivates the different perspectives, and discuss the
difficulties in arriving at a comprehensive definition of multimedia
 Discuss Packer's characterization of multimedia and how it differs from other
definitions of multimedia
 Construct your definition of multimedia

What is (or are) media

Before launching into a study of multimedia instructional design and evaluation strategies, it seems reasonable to first come to grips with what we mean by "multimedia" and "media". Definitions of media (and its singular form, medium) are as diverse as the disciplines in which they are used. Consider the definitions in Table 1.1.

(Note: Although "media" is strictly speaking the plural form of medium, media can be
used as a singular noun.)

There is plenty of variety among these definitions; however, there is general agreement that a medium carries, represents, or transmits something, but is not that thing itself. And what is that thing? In biology, it would be microorganisms or organic compounds, as indicated in Definition 3 of Table 1.1. For instructional designers and
course developers, it would be information. As makers of educational multimedia
materials, we are constantly trying to figure out what is the most effective way to use
media to convey information. This is not to say that the information only goes from
teacher to student. Regardless of what approaches you tend to take in your practice as
an instructional designer, you would probably agree that the practices of teaching and
learning require information to be conveyed bidirectionally. 

Media modalities

Since this isn't a media studies course but rather a course that deals with problems and
strategies specific to educational material creation and evaluation, we are interested in
specific aspects of media related to design and production. Table 1.2 lists areas of media
design and production that are relevant to this course. Note that, with the exception
of media encoding type, each area references one or more definitions of media given
in Table 1.1. Each of these areas plays an important role in the design and evaluation of
multimedia educational material.

Table 1.2. Areas in media design and production

Each entry below gives a category used to distinguish areas of study and practice, examples of that category, and the definitions from Table 1.1 that it references.

 Media modality: text, audio, video, graphics, animation (Definition 4)
 Media encoding type: digital or analog (none)
 Media storage material: optical, magnetic, other (Definition 7, first half of Definition 5)
 Media transmission strategy: electromagnetic, optical, physical (Definition 2, second half of Definition 5)
 Mass media classifications: TV, radio, print, Internet, telephone/mobile (Definition 1, Definition 6)

Table 1.1. Definitions of "medium" and "media" given by academic dictionaries


Dictionary of Media Studies (2006)
 Definition 1 ("medium"): A means of mass communication, for example television, radio or newspapers.
 Definition 2 ("medium"): The physical means of transmitting a message through a channel of communication.
 Definition 6 ("media"): The various means of mass communication considered as a whole, including television, radio, magazines and newspapers, together with the people involved in their production.

Hutchinson Dictionary of Science (2006)
 Definition 3 ("medium"): In bacteriology, an environment in which micro-organisms can be cultured. Common mediums include agar, broth, and gelatine, often with added salts and trace elements.
 Definition 7 ("media"): In computing, the collective name for materials on which data can be recorded. For example, paper is a medium that can be used to record printed data; a floppy disk is a medium for recording magnetic data.

Encyclopedic Dictionary of Semiotics, Media, and Communications (2000)
 Definition 4 ("medium"): Any means, agency, or instrument of communication.
 Definition 5 ("medium"): The physical means by which a sign or text is encoded (put together) and through which it is transmitted (delivered, actualized). (This source gives no separate definition of "media".)
Keep Table 1.2 in mind as we turn our attention to the term "multimedia" and what it could
mean.

Defining multimedia

Arriving at a definition of multimedia is not easy. As an initial try, let us look at the topics
being studied by people who identify themselves as multimedia experts. We look for
people who say, "I am a multimedia expert, and if you want to know what multimedia is,
look at what I'm studying." For instance, the peer-reviewed journal, Journal of Multimedia,
considers articles written on the following topics:

 Multimedia Analysis, Processing, and Retrieval
 Signal Processing for Media Applications
 Image and Video Processing, Speech, Audio and Music Processing
 Multimedia Security, Modelling, Coding, and Compression
 Components and Technologies for Multimedia Systems
 Multimedia Tools, Architecture, Systems, and Applications
 Real-time Multimedia System, 3D and Motion, Multiple camera System
 Visualization, Interactive Media and Games, 3D-TV, Stereo Systems,
 Multimedia Content Management, Digital Watermarking, DRM
 Human Factor, Interface, and Interaction
 Multimedia Databases and File Systems
 Multimedia System Integration
 Multimedia Communication and Networking
 Internet Telephony, Peer-to-peer Streaming, Audio/video Streaming
 Multimedia Content Distribution, Wireless Multimedia
 Multimedia Servers, Operating Systems, Middleware and QoS
 Multimedia Education, E-learning, Entertainment, Collaborative Systems
 Artificial Intelligence in Multimedia Technologies
 Standards and Related Issues

This extensive list helps us understand the broad range of academic and industry
interests in multimedia, and we can see that disciplines such as engineering, computing
science, psychology, law, and (yes) education all have a stake in multimedia. What that
list allows us to do is look at definitions proposed by those who identify themselves as
authorities in multimedia. Consider the following passage from Steve
Heath’s Multimedia and Communications Technology (1999):

If there is a term or phrase that has appeared in more diverse publications than
any other over the last few years, it must be multimedia. The number of
definitions for it are as numerous as the number of companies working on it. If
this is the case, what is multimedia?

In essence, it is the use or presentation of data in two or more forms. The
combination of audio and video in the humble television was probably the first
multimedia application, but it has been the advent of the PC, with its ability to
manipulate data from different sources and offer this directly to the consumer or
subscriber, that has sparked the current interest.

As a result, multimedia for many people conjures up the image of a PC with a
SoundBlaster card playing interactive games or searching through an
encyclopedia with all the information supplied on a CD-ROM. While this is
undoubtedly a valid multimedia application, it is only part of the story. This
definition is pragmatic and based on reality; other prophecies for the industry
have ranged from the televisions becoming PCs, and vice versa, and that one
day, everyone will have a combined television-PC-phone-fax, capable of doing
everything you could possibly want and more. Oh and yes, you will never buy
software anymore, as you will be able to access it from your televisual-PC-
phone-fax from the network and only pay for what you need!

The reality is somewhere between the extremes. Undoubtedly, with the ever-
improving ability of the PC to provide TV quality audio and video, the television
and PC are becoming very close. Add the ability to provide graphical overlays
and the difference is very small indeed. With cable TV companies providing
telephone connections and the increasing combination of PC with a modem to
access the Internet and thus provide an intelligent telephone, the forecasts for
the universal widget are a logical progression. (Heath, 1999)

Compare that passage with this excerpt from Ralf Steinmetz & Clara Nahrstedt’s
book, Multimedia Systems (2004):

Multimedia is probably one of the most overused terms of the 90s... The field is
at the crossroads of several major industries: computing, telecommunications,
publishing, consumer audio-video electronics, and television/movie/broadcasting.
Multimedia not only brings new industrial players to the game, but adds a new
dimension to the potential market… Similarly, not only the segment of
professional audio-video is concerned, but also the consumer audio-video
market, and the associated TV, movie, and broadcasting sectors.

As a result, it is no surprise when discussing and establishing multimedia as a
discipline to find difficulties in avoiding fuzziness in scope, multiplicity of
definitions, and non-stabilized terminology. When most people refer to
multimedia, they generally mean the combination of two or more continuous
media, that is, media that have to be played during some well-defined time
interval, usually with some user interaction. In practice, the two media are
normally audio and video, that is, sound plus moving pictures. (Steinmetz &
Nahrstedt, 2004)

Finally, consider this passage from Ze-Nian Li & Mark Drew’s book, Fundamentals of
Multimedia (2005):

People who use the term "multimedia" often seem to have quite different, even
opposing, viewpoints. A PC vendor would like us to think of multimedia as a PC
that has sound capability, a DVD-ROM drive, and perhaps the superiority of
multimedia-enabled microprocessors that understand additional multimedia
instructions. A consumer entertainment vendor may think of multimedia as
interactive cable TV with hundreds of digital channels, or a cable-TV-like service
delivered over a high-speed internet connection. 

A computer science student [...] likely has a more application-oriented view of
what multimedia consists of: applications that use multiple modalities to their
advantage, including text, images, drawings (graphics), animation,
video, sound (including speech), and, most likely, interactivity of some kind. The
popular notion of "convergence" is one that inhabits the college campus as well
as the culture at large. (Li & Drew, 2005)

Activity 1.1: Reflect on the material presented

Before continuing, answer the following questions for yourself:

1. Based on the preceding excerpts, can you summarize why it has been
difficult to arrive at a consensus on what multimedia actually is?
2. On which points do these three sets of authors agree?

Multimodality and interactivity

There are at least two points to note from the preceding excerpts. First, all three sets of
authors suggest that “multimedia” was mobilized as a buzzword in the 1990s by
technology and entertainment industries to push their respective (and largely profit-
driven) agendas. The authors seem to suggest that there was no clear agreement about
what properly lies under the umbrella of "multimedia", since the stakeholders in the term
differed in their approaches to integrating and innovating on mass media types, media
transmission strategies, and media storage technologies.

However, we also do see a common thread among all three excerpts that is particularly
useful for our attempt to understand multimedia from an instructional design
perspective. Consider some selected excerpts from these passages, presented in Table
4.

Table 1.3. Selected excerpts from the passages

(Heath, 1999): "... the use or presentation of data in two or more forms."

(Steinmetz & Nahrstedt, 2004): "... the combination of two or more continuous media, that is, media that have to be played during some well-defined time interval, usually with some user interaction. In practice, the two media are normally audio and video, that is, sound plus moving pictures."

(Li & Drew, 2005): "... a more application-oriented view of what multimedia consists of: applications that use multiple modalities to their advantage, including text, images, drawings (graphics), animation, video, sound (including speech), and, most likely, interactivity of some kind."

Table 1.3 draws out what these authors consider to be multimedia when they focus on
media modalities. From this admittedly small selection of expert opinions, we can see
that multimedia can be characterized by at least two general features:

1. Simultaneity of media modalities: Multimedia presents data simultaneously in
two or more modalities (such as audio+video, text+images, animation+sound).
2. Interactivity: Users can affect the state of a multimedia product through their
actions, which are mediated by some sort of interface.

(Note that Steinmetz and Nahrstedt's definition requires "continuous media" that have to
be "played during some well-defined time interval", suggesting that multimedia also has
to be time-based. For the purposes of this course, we can safely ignore this restriction.)

Experts in the area of instructional media use the term multimodal to describe the
simultaneous use of two or more media modalities. Multimodal media is not
necessarily interactive; this is important to keep in mind.

To see multimodality in action, consider the subject of Newton's Laws of Motion. The
ideas behind these laws can be explained in any number of ways:
 Video: We could watch a recording of Professor Walter Lewin of the Massachusetts
Institute of Technology giving a lecture in front of a class of undergraduate
students.
 Text: We could read the transcript of Prof. Lewin's lecture. Or we could pick up
any college physics textbook and read about Newton's Laws there.
 Animation: We could watch a video that shows simulated interactions between
physical objects. We could even play with an interactive web application that
shows how Newton's Second Law works.
 Audio: We could buy an audio CD or download an MP3 and listen to a physics
expert discuss Newton's Laws while we're cooking or walking down the street.

Immersion, interactivity, narrativity, hypertextuality

A broader perspective on multimedia was proposed by Randall Packer in an article from 1999, and in the following activity you will read up on and restate his views, and then integrate them with your own prior understanding of multimedia.
Activity 1.2: Read Randall Packer's ideas about multimedia

1. Read Packer’s 1999 article, Just What Is Multimedia, Anyway? This is available


on http://www.zakros.com/bios/ReadingA.pdf. After reading the article and before
continuing reading this module, answer the following questions for yourself:

 According to Packer, what are the essential features of multimedia?
Describe them in your own words.
 In your own words, describe Wagner's concept of Gesamtkunstwerk. How
does it apply to contemporary multimedia?
 Based on the readings and your own ideas, devise three versions of your
personal definition of multimedia:
o One that takes less than 20 seconds to explain
o One that takes less than 10 seconds to explain
o One that takes less than 5 seconds to explain
 Which of the following do you think are examples of multimedia? Which are
not? Don't be surprised if you find yourself struggling to justify your
answers. It only goes to illustrate the "fuzziness in scope, multiplicity of
definitions, and non-stabilized terminology" (Steinmetz & Nahrstedt,
2004) in the study and practice of multimedia.
o A television set (i.e., the thing that sits in your living room)
o A noontime television show (i.e., the moving pictures accompanied
by sound that you watch on a TV set)
o The website of a government agency
o The World Wide Web
o The Internet
o An ultrasound machine
o A personal computer
o A programming language
o A first-person shooter game that runs on your personal computer
o An interactive display in a museum about the history of Indonesia
o A vending machine that dispenses train tickets
o A portable MP3 player playing a lecture downloaded from the
Massachusetts Institute of Technology's Open Course Ware site
o A robot butler that serves you coffee in the morning
o A Sony PlayStation
o An iPod
o A mobile phone

2. Read From Wagner to Virtual Reality (2000), a website by Randall Packer and
Ken Jordan, on http://www.artmuseum.net/w2vr/contents.html. This website
highlights artists that the authors consider important to the history of multimedia.
Don't feel obliged to read through the entire site (although you are very welcome
to do so). A more useful approach would be to only read articles that you find
interesting, and to follow the path that appeals to you most.

 What can you say about the structure of the website? Does it follow a
particular structural pattern? Is it linear? Radial? Hierarchical?
 How does the structure of the website facilitate, hinder, or otherwise affect
your understanding of the history of multimedia?

After reading Packer's article, you will notice that in addition to interactivity and the
simultaneous presentation of media modalities, Packer proposes that multimedia is
characterized by immersion, an integration of
disciplines, narrativity, and hypertextuality/hyperlinkedness. Immersion and disciplinary
integration work together to envelop a user in the multi-sensory experience of a
multimedia environment. Creating a sense of a narrative (or perhaps several parallel
ones) in the multimedia environment invites the user to create meaning and therefore
process the content on cognitive and affective levels. Hyperlinkedness allows users (at
least in theory) to find their way through the environment using a navigational logic that
suits them.

Multimedia as tool and outcome

In the introduction of the course, I cited the four stages identified by JISC at which
design decisions need to be made in planning, creating, and delivering a course. At
each of these stages, technology can play an important role in one or both of two ways:

 Digital technologies as tool: Digital technologies can facilitate the design and
production process.
 Digital technologies as outcome: Digital technologies can be the object of the
design and production process.

We also defined multimedia and distinguished it from digital technology in general. So whereas a programming language such as Java is an example of digital technology, it is not an example of multimedia. However, Java can be used to create multimedia products. This distinction is not always clear. Consider a program for creating slideshows and presentations, such as Microsoft PowerPoint. The user interface (UI) for Microsoft PowerPoint is shown in Figure 1.1:

Figure 1.1. Microsoft Powerpoint's UI


Reviewing the properties of multimedia, you see that Microsoft PowerPoint can be used
to generate multimedia presentations:

 Immersion: The presentations can make use of a combination of text, graphics,
animation, audio, and video.
 Interactivity: The presentations can be programmed to be interactive and
responsive to the users (although it takes a fair bit of work and expert knowledge
to do this). Download and run this slideshow to see an example of interactive
elements in a PowerPoint
presentation: http://sites.google.com/a/upou.edu.ph/edde-221/files/Piano.pps?
attredirects=0&d=1
 Hypertextuality/hyperlinkedness: The presentations can contain links that can
take the viewer to another point in the presentation or even somewhere on the
Web.
 Narrativity: The presentations are often created to convey a specific story.

By these observations, we can easily conclude that the PowerPoint program is a tool for
creating multimedia. However, it is also not unreasonable to consider the PowerPoint
program to be a multimedia product in itself:

 Immersion: The UI uses a combination of text and graphics to reveal the
functionality of the program to the user. The UI even uses animation, which you
can see by summoning the now-infamous Office Assistant, known as "Clippy",
from the Help menu.
 Interactivity: The user can alter their view of the UI by selecting any of the
options presented in the menu, toolbars, and editing panes. The blank page in
the center changes its appearance based on the way the user applies text and
formatting options to it.
 Hypertextuality/hyperlinkedness: When selected, certain items in the menu
take the user to another part of the program, such as the Preferences section, or
even to somewhere on the Internet.
 Narrativity: The placement and design of the visual elements recall the layout of
a desk: the "blank page" in the center of the user's visual field invites the user to
interact with it, while various tools (such as "inkpots", "pens", "scissors", and
"glue") are located on the periphery; taken altogether, the visual elements imply a
narrative of production where the user is invited to take this blank page and
transform it into something new and interesting.

Can you see yet another reason why it is not easy to get a firm grasp of what
multimedia is? The line between the tools of production and the products themselves is
a blurred one. However, this slippage is also a useful one, since it allows you to build
(digital) tools for enabling learners to build (digital) products.

Multimedia, new media, rich media

When the term multimedia first started gaining currency, it was inconceivable to transmit and receive over the Internet the quantities and types of data that we do today. Barfield recounts the history of the term and how the Internet changed the way information was delivered:

'Multimedia' used to mean the design of systems authored with tools such as Macromedia Director and distributed using cd-roms as a carrier medium. In the mid-1990s the developments surrounding the [I]nternet and the [W]eb... meant that the focus of multimedia development shifted from the static physical carriers like the cd-rom to dynamic and updatable delivery methods on the web. Even now this shift is getting more and more pronounced, pushed along by the developments on the web, the increase in bandwidth available and the explosion in access by the public. Distribution based on cd-roms will always have a niche in the market, but the main focus of multimedia will be online content on the web... Many classical multimedia courses are introducing their students to the Web and including Web design and construction as part of the curriculum. This trend will gather pace as courses restructure to follow developments on the Internet. [emphasis added]

The term new media is now commonly applied to media delivered through the Internet,
while rich media refers specifically to bandwidth-heavy content such as audio and video.

It is, however, also true that access to the Web is not equal, and if you look to the most
marginalized populations of the global society, you will find increasingly spotty Internet
penetration. However, it is often now claimed that mobile phones are revolutionizing the
way people and communities are linked together in the developing world, allowing more
people to have some kind of access (no matter how indirectly) to data located on the
Web. Nowadays, it is becoming increasingly unrealistic not to consider students' capability to access the World Wide Web (in one way or another) when you plan a multimedia-based instructional intervention.

Summary

In this module, we looked at various experts' attempts to define multimedia, a term that
is still very much contested. First we looked at what we meant by media. Then we looked at how
multimedia is not merely the same as multiple media. Throughout the discussion, I hope
you got a glimpse of how personal, professional, academic, and corporate interests
shape the field of study of multimedia. We then highlighted media modalities as salient
to our study of educational multimedia design and evaluation, and spent some time
looking at Randall Packer's ideas around multimedia, which I claim are useful for our
study of educational multimedia design and evaluation. Finally, we looked at a particular
example of a slippage in definitions: multimedia can be both a digital product and a tool
that is used to create a digital product.

Throughout this course, you will find that I will be using the terms "multimedia", "digital
tools", "technology", and "software" more or less interchangeably. If I do, it is mostly
for reasons of style. I do not want to leave you with the impression that these terms are
identical. Of these terms, "technology" is the broadest. Technology can be seen as a
way of doing something coupled with the tools needed to actually do it. "Digital tools"
can refer to both hardware (such as desktop computers, laptops, mobile devices, and
other tangible electronic objects) and software (which are made out of code and run on
hardware) that rely on data to be encoded in binary form.

I hope that the previous discussion helps provide you with a more workable road map
that you can use to understand what we mean when we talk about multimedia. Note
that the previous discussion really has said nothing about how to design and evaluate
multimedia for instructional purposes. The taxonomies I have outlined here have
nothing to say about the pedagogical value of multimedia. We postpone that discussion for
Module 3, when we classify digital tools and products in a way that should be more
useful for us. Before that, Module 2 looks at issues specific to each media modality as
well as some issues around interactivity.

References and Suggested Reading

 A. and C. Black Publishers. (2006). Dictionary of Media Studies. London: A & C Black.
 Danesi, M. (2000). Encyclopedic Dictionary of Semiotics, Media, and
Communications.  (P. Coleman, Ed.). Toronto: University of Toronto Press.
 Heath, S. (1999). Multimedia and Communications Technology. Focal Press.
 Helicon. (2006). The Hutchinson Dictionary of Science. Abingdon, England:
Helicon.
 Li, Z., & Drew, M. S. (2005). Chapter 2: Multimedia Authoring and Tools.
In Fundamentals of Multimedia (United States Ed., p. 576). Prentice-Hall of India.
 Packer, R. (1999). Just What Is Multimedia, Anyway? IEEE MultiMedia, 6(1), 11-
13.
 Packer, R., & Jordan, K. (2000). Multimedia - From Wagner to Virtual
Reality. Artmuseum.net. Retrieved April 21, 2008, from
http://www.artmuseum.net/w2vr/contents.html
 Steinmetz, R., & Nahrstedt, K. (2004). Multimedia systems. Springer.

Module 2

Digital Media

Objectives

At the end of this module, you should be aware of the technical possibilities and limitations
of each of the media modalities described in the module. You should be familiar with the
basic principles that underlie the way all digital media is created so that as standards
and conventions change, you are able to make sense of these changes. You should be
familiar with which applications you can use for creating and editing these media
modalities.

Introduction

In Module 1, I made the case that media modalities (which we defined to be the term
that collectively refers to text, audio, video, graphics, and animation) are salient features
of multimedia that we need to pay attention to. In this module, we'll be taking a look at
each modality. The treatment will necessarily be brief and incomplete, and I encourage
you to check out the bibliography at the end of this module if you are interested in
digging deeper into any of the issues that are raised.

Before I present to you the sections for each of the digital media modalities, I wanted to
point out some issues or features shared by all of these modalities.

All media is stored as a series of bits


A bit stands for "binary digit", and it can take only one of two values: 0 or 1. In this module you will see how complex information can be stored as a series of bits. Larger files have more bits than smaller files.

Bits are assembled in groups of eight. Eight bits form a byte. 1024 bytes form a
kilobyte. 1024 kilobytes form a megabyte. To learn more about units for measuring
digital data, refer to section 12.2 of the UK Open University's course, Introducing ICT
Systems: http://openlearn.open.ac.uk/mod/resource/view.php?id=182500. 
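
To make these units concrete, here is a minimal Python sketch (Python is my choice of illustration language, not a tool required by this course) that shows how a short piece of text becomes a series of bytes and bits, and how a hypothetical file size converts into kilobytes and megabytes using the 1024-based units described above.

    # How a short piece of text is stored as bytes and bits
    text = "Newton's Laws"
    data = text.encode("utf-8")                       # the text as a series of bytes
    bits = "".join(format(b, "08b") for b in data)    # each byte written out as 8 bits

    print(len(data), "bytes")                         # number of bytes needed to store the text
    print(bits[:24], "...")                           # the first three bytes, shown as bits

    # Converting a (hypothetical) file size using the 1024-based units above
    size_in_bytes = 3_500_000
    print(size_in_bytes / 1024, "kilobytes")
    print(size_in_bytes / (1024 * 1024), "megabytes")
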
File sizes and bandwidth restrictions can be problematic; compression can help

Media files can take up a lot of disk space. For example, an hour-long video can take up
as much as 100 megabytes of disk space. Sometimes you cannot avoid working with large
files. If you do, consider the following:

 If you are transmitting media across a local network or over the Internet,
transmission time will suffer with large files.
 You should always be backing up your files, and backing up large files can take
time (especially on older and slower machines)
 Editing large files can eat up your computer's resources (CPU time and memory),
especially on older computers.

Most digital media can be compressed to save disk space and bandwidth. There are
two general types of compression mechanisms. Lossy compression discards data
irretrievably. This leads to some loss of quality, but depending on how the compression
is performed, this loss may not be noticeable. Lossless compression reduces file sizes
without discarding essential information. This is possible because the way that data is
recorded as bits is not always efficient. Generally, lossy compression can produce more
dramatic file size reductions. Save your media in file formats that use lossless
compression whenever possible, but also remember that lossy compression (if
managed well) can provide significant benefits without affecting the experience of the
user.
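
As an illustration of the difference, the following Python sketch uses the standard-library zlib module to apply lossless compression to a deliberately repetitive block of data and then restore it byte for byte; the data and sizes are invented for the example.

    import zlib

    # A deliberately repetitive block of data compresses very well without losing anything
    original = b"ABCD" * 10_000                # 40,000 bytes of redundant data
    compressed = zlib.compress(original)       # lossless compression
    restored = zlib.decompress(compressed)     # decompression restores every byte

    print(len(original), "bytes before compression")
    print(len(compressed), "bytes after lossless compression")
    print(restored == original)                # True: no information was discarded
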

Be careful of noise or digital artifacts


Noise is unwanted data or data that carries no useful information, while digital
artifacts are imperfections that result from the recording, editing, compression, and
reproduction of digital media. Both are common across the media modalities.

In digital photography, noise is a grainy effect that occurs because of sub-optimal lighting conditions, poor choice of shutter speed, imperfections in the camera hardware, or a combination of these factors. Figure 2.1 compares a noisy image with a non-noisy image. In sound recording, noise can occur because of imperfections in the recording equipment, but also because ambient sound is not filtered out during the
recording process. Often, this ambient sound is a sustained, undifferentiated hum that
has a particular kind of "flavor". Or maybe the better term is "color", because noise is
often categorized according to colors, such as white noise, purple noise, and grey
noise. You can listen to Brown noise here. (See this discussion on Wikipedia on noise
colors for more information about audio noise. Also, whitenoisemp3s.com offers various
background sounds for download; these hour-long tracks include "brown noise",
"refrigerator hum", and "air conditioner hum".)
Figure 2.1. The photo on the left exhibits more noise than the photo on the right.
Photo from Wikimedia commons, licensed under Creative Commons Attribution-Share
Alike 2.5 Generic license
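
If you would like to generate an undifferentiated noise sample yourself rather than download one, the short Python sketch below writes one second of white noise to a WAV file using only the standard library; the one-second duration, sample rate, and file name are arbitrary choices for the illustration.

    import random
    import struct
    import wave

    SAMPLE_RATE = 44100                        # samples per second
    DURATION_SECONDS = 1                       # length of the clip

    with wave.open("white_noise.wav", "wb") as out:
        out.setnchannels(1)                    # mono
        out.setsampwidth(2)                    # 16-bit samples
        out.setframerate(SAMPLE_RATE)
        for _ in range(SAMPLE_RATE * DURATION_SECONDS):
            sample = random.randint(-32768, 32767)     # uniformly random amplitude, i.e. white noise
            out.writeframes(struct.pack("<h", sample))
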

Let's talk about digital artifacts now. A common artifact in JPEG images is the presence
of blocky areas, such as shown in Figure 2.2. JPG is a lossy compression scheme and
it throws away fine details. When those details are discarded, you get blockiness.
Artifacts in digital sound take various forms, but a common one involves a metallic
quality in the sound of highly compressed MP3 files. Video exhibits a large number of
possible artifacts; see Basith & Done (1996) for a detailed discussion of them.

Figure 2.2. Digital artifacts in JPEG compression. The top image is uncompressed while
the bottom image is compressed.
Images from the Wikimedia Commons, and are licensed under
a Creative Commons Attribution ShareAlike 3.0 license
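
You can reproduce this kind of blockiness yourself by re-saving any detailed photograph at a very low JPEG quality setting. The sketch below uses the third-party Pillow imaging library as one convenient way to do this (any image editor's quality or export slider does the same job); the file names are placeholders.

    from PIL import Image   # Pillow, a third-party imaging library

    img = Image.open("photo.png").convert("RGB")     # any detailed source image
    img.save("photo_low_quality.jpg", quality=10)    # aggressive lossy compression
    img.save("photo_high_quality.jpg", quality=90)   # mild lossy compression

    # Opening photo_low_quality.jpg should reveal the blocky artifacts described
    # above, especially around sharp edges and fine detail.
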
 

Be careful of proprietary formats


This is particularly true of digital video, but you might run into problems even with text-
based files. For example, Microsoft's newest version of Microsoft Word saves
documents into the .docx format by default. This document format cannot be opened by
programs other than the latest version of Microsoft Word. This presents a
problem for the many, many computer users who do not have this version installed (legally,
at least). The safest thing to do is to save a document in an earlier Word format using
the Save as... functionality.

Let's now take a look at each digital media modality. We'll also look at hypermedia and
interactivity as part of this discussion.

Activity 2.1. Read these sections before proceeding with this module

2.1. Digital Text


2.2. Digital Images
2.3. Digital Audio
2.4. Digital animation
2.5. Digital Video
2.6. Hypermedia and interactivity

Putting it all together

After reading the sections that individually dealt with each media modality, the next
question you might be asking yourself is: how do I put it all together?

As a browser-based hypertext document on the Web

Consider this course, EDDE 221. This course is presented to you as a series of
webpages written in XHTML, or Extensible Hypertext Markup Language. In order to
quickly put together these pages, I used an online service provided by Google called
Google Sites. Using Google Sites, I was able to embed videos and images alongside
written text. In effect, I am doing all my writing and editing online. The advantage is that
I don't need to keep my offline document synchronized with the version online. The
downside is that when I don't have access to the Web, I don't have access to my
document.

You can also edit your hypertext document offline and then upload it to your webhosting
server using FTP, which stands for File Transfer Protocol. A good (and free) FTP
program is Filezilla, while you can get free webhosting from a number of sites. I keep a
list of sites that provide free hosting on http://delicious.com/dmaranan/webhosting. You
have more control over the look and feel of your hypertext document if you go this route,
but you will need to be more proficient with authoring skills, such as HTML authoring,
formatting using CSS, and using scripting languages such as JavaScript.
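
Uploading the pages you edited offline can itself be scripted. As a minimal sketch, the following Python snippet uses the standard-library ftplib module to copy a single HTML file to a web host; the host name, credentials, and file name are placeholders you would replace with the details from your own hosting provider.

    from ftplib import FTP

    HOST = "ftp.example-host.com"      # placeholder: your web host's FTP address
    USER = "your_username"             # placeholder credentials
    PASSWORD = "your_password"

    ftp = FTP(HOST)
    ftp.login(USER, PASSWORD)
    with open("index.html", "rb") as page:            # the hypertext document edited offline
        ftp.storbinary("STOR index.html", page)       # copy it to the server
    print(ftp.nlst())                                 # list the files now on the server
    ftp.quit()
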

As a browser-based hypertext document that can be accessed offline


One advantage of putting your document on the Web is that it can be accessed by your
target audience from anywhere in the world... as long as they have an Internet
connection. This may not be the case for your audience, in which case it may be more
useful to produce a document that they can access even when they're offline. A good
authoring tool to use would be eXeLearning (www.exelearning.org). This tool allows you
to produce interactive hypertext documents where you can embed not only sound and
video, but also instructional activities such as quizzes and interactive tools such as an
image magnifier.

As some other kind of document exchange format


HTML and XHTML are hypertext formats that are open standards. This means that
anyone can write software applications that can be used both to create and to read
HTML and XHTML documents without paying royalties to anyone, because no one
owns the standard. In 2008, PDF became another open standard. This was good news
because PDF is a very good document exchange format. Generally speaking, a PDF
document is guaranteed to look the same when you open it in any operating system,
using a wide variety of PDF readers. (Adobe Acrobat Reader is just one PDF
reader; Sumatra PDF and Foxit are two other examples.) What you may not know is
that you can use your PDF editor to embed rich media content, such as video, into your
PDF file.

Note that you can transform practically any document into a PDF file. Transforming a
Microsoft PowerPoint or Open Office Impress presentation into a PDF file
guarantees that your intended audience will be able to view your document, even if you
are not sure that they have PowerPoint or Impress installed on their computer.

As some other multimedia format


One of the most common multimedia formats is Flash. Developing an application in
Flash is not particularly difficult, but to do it well is time-consuming. To see what you can
do with Flash, try playing the well-known Flash-based game
Samorost: http://www.amanita-design.net/samorost-1. Since you are probably just
beginning to develop instructional multimedia, you'll probably not find it necessary to
develop instructional materials with Flash... yet!

As a standalone multimedia application


You might find that none of the existing file formats or media display/playback software
is sufficient to meet your needs. In this case, you might have to develop a multimedia
application from scratch using sophisticated programming tools such as C++ or Java.
Since you are probably just beginning to develop instructional multimedia, you probably
won't need to do this either. However, you can take other courses that cover multimedia
programming either at UP Open University or elsewhere.

Choosing the right tools

In the sections focusing on specific media modalities, you will notice that I place an
emphasis on free and open source software in this module, but they may not be able to
cover all your needs. Choosing the right software is part of technology selection, which
is an important part of the design process; we discuss technology selection in Module 4.

"Hybrid" forms

The previous discussions around each type of media modality should not deter you from
experimenting with what I loosely called "hybrid" forms. For example, consider Video
2.1. This style of video tutorial was popularized by Common Craft
(www.commoncraft.com). It's video, but it feels a lot like an animation. Implementing this
tutorial using the usual digital animation and graphics tools (like Flash and Photoshop)
would have taken far more time than it did to generate it in this "low-tech" way. If you
already know how to digitally edit video footage, you can create animations with
Common Craft's technique.

Video 2.1. A typical Common Craft-style of video/animation


Example of a Common Craft-type of video. Please click the link to view the video:
https://www.youtube.com/watch?time_continue=3&v=muVUA-sKcc4&feature=emb_title

Another kind of "hybrid" form is the audio book: instead of relying on vision-centric
methods to present your content to the user, you could choose spoken text as the
primary delivery mechanism for your target audience, particularly if you've ascertained
through a needs-assessment process that this method is optimal for your target
audience. Unlike written text, which allows your audience to quickly scan a document
and figure out where they left off, audio is received by the user in a more linear fashion.
Cross-referencing between learning modules is more difficult to do. However, employing
an audio-centric strategy might allow your user to work on other tasks while listening to
educational material, though whether they can do both without compromising either
depends on factors that may be beyond your control.

References and suggested reading

 Anglin, G. J., Vaez, H., & Cunningham, K. L. (2004). Visual representations and
learning: The role of static and animated graphics. In D. H. Jonassen
(Ed.), Handbook of research on educational communications and
technology (2nd ed., pp. 865–916). Lawrence Erlbaum Associates. Retrieved
from http://institute.nsta.org/scipack_research/AECT_Animated_Graphics_33.pdf
 "Sam and Anita". (n.d.). Comic Sans. Vimeo. Retrieved June 11, 2009,
from http://vimeo.com/1994310
 Barfield, L. (2004). Design for New Media: Interaction Design for Multimedia and
the Web (1st ed.). Addison Wesley.
 Barron, A. E. (2004). Auditory instruction. In Handbook of research on
educational communications and technology (Vol. 2, pp. 949–978). Retrieved
from http://jan.ucc.nau.edu/~etc-c/etc667/2006/readings/Barron-2004-
AuditoryInstruction.pdf
 Bartram, L. (2009). IAT 814: Knowledge Visualization Lecture Slides. Simon
Fraser University.
 Basith, S. A. (1996, May 24). Digital Video : An Introduction. Information
Systems Engineering Department of Computing and Department of Electrical
and Electronic Engineering, Imperial College of London. Retrieved January 30,
2010, from http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol1/sab/article1.html
 Basith, S. A., & Done, S. R. (1996, June 14). Digital Video, MPEG and
Associated Artifacts. Information Systems Engineering Department of Computing
and Department of Electrical and Electronic Engineering, Imperial College of
London. Retrieved February 2, 2010,
from http://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol4/sab/report.html
 Brief introduction to typography. (n.d.). Western Illinois University. Retrieved
June 11, 2009, from http://www.wiu.edu/art/courses/handouts/type.htm
 Gillespie, J. (2000). Typography. Web Page Design for Designers. Retrieved
November 12, 2008, from http://www.wpdfd.com/issues/23/typography/
 Golan Levin, Kamal Nigam, & Jonathan Feinberg. (2006, February 14). The
Dumpster. Retrieved May 3, 2008,
from http://www.tate.org.uk/netart/bvs/thedumpster.htm
 Hede, A. (2002). Integrated Model of Multimedia Effects on Learning. Journal of
Educational Multimedia and Hypermedia, 11(2), 177.
 Hornak, E. (n.d.). Introduction to Typography. Introduction to Desktop Publishing
- Rochester Institute of Technology. Retrieved June 11, 2009,
 Huang, J. (2010). Digitization of Sound. CMPT 365, Simon Fraser University,
Spring Semester. Retrieved from http://www.sfu.ca/~jha48/notes/6_1.pdf
 Hundhausen, C., & Douglas, S. (2000). Using visualizations to learn algorithms:
should students construct their own, or view an expert's? In Visual Languages,
2000. Proceedings. 2000 IEEE International Symposium on (pp. 21-28).
Presented at the Visual Languages, 2000. Proceedings. 2000 IEEE International
Symposium on. doi:10.1109/VL.2000.874346
 Johnson, R. (2009, January 3). The Gutenburg Diagram in Design. Web Design
Marketing Podcast & Blog. Retrieved June 11, 2009,
from http://www.3point7designs.com/blog/2009/01/03/the-gutenburg-diagram-in-
design/
 Kenney, A. R., Rieger, O. Y., & Entlich, R. (2003, February 20). Moving Theory
Into Practice : Digital Imaging Tutorial. Cornell University Library/Research
Department. Retrieved June 30, 2009,
from http://www.library.cornell.edu/preservation/tutorial/intro/intro-01.html
 Long, B., & Schenk, S. (2002). The digital filmmaking handbook. Cengage
Learning.
 Lowe, R. (2004). Interrogation of a dynamic visualization during
learning. Learning and Instruction, 14(3), 257-274.
doi:10.1016/j.learninstruc.2004.06.003
 Lynch and Horton. (2004, March 5). Typography. Web Style Guide. Retrieved
November 12, 2008, from http://webstyleguide.com/type/index.html
 Lynch, P. J., & Horton, S. (2009). Web Style Guide. Yale University Press.
Retrieved from http://www.webstyleguide.com/wsg3/8-typography/index.html
 MIT News Office. (n.d.). Using the inverse pyramid structure, from "Writing News:
A Quick Primer". Retrieved June 11, 2009,
from http://web.mit.edu/newsoffice/write-news.html#4
 Nelson, T. H., Smith, R. A., & Mallicoat, M. (2007). Back to the future: hypertext
the way it used to be. In Proceedings of the eighteenth conference on Hypertext
and hypermedia (p. 228). Retrieved
from http://xanadu.com/XanaduSpace/btf.htm
 Nelson, T. H. (1965). Complex information processing. In Proceedings of the
1965 20th national conference on - (pp. 84-100). Presented at the the 1965 20th
national conference, Cleveland, Ohio, United States. doi:10.1145/800197.806036
 Poynton, C. (1996). Chapter 1, Basic Principles. In A technical introduction to
digital video. New York: J. Wiley. Retrieved
from http://www.poynton.com/PDFs/TIDV/Basic_principles.pdf
 Robinson, D. H., & Schraw, G. (2008). Recent innovations in educational
technology that facilitate student learning. IAP. Retrieved from
http://books.google.com/books?id=6DukxLc-8qcC&lr=&source=gbs_navlinks_s
 Ryan, T. A., & Schwartz, C. B. (1956). Speed of perception as a function of mode
of representation. The American Journal of Psychology, 69(1), 60–69.
 Savage, T. M., & Vogel, K. (2008a). Text. In An Introduction to Digital
Multimedia (1st ed.). Jones & Bartlett Publishers. Retrieved
from http://sites.google.com/a/upou.edu.ph/edde-221/files/Savage
%26Vogel_Intro.Digital.Multimedia.Text.PWD.pdf?attredirects=0&d=1
 Savage, T. M., & Vogel, K. (2008b). Animation. In An Introduction to Digital
Multimedia (1st ed., pp. 199-208). Jones & Bartlett Publishers. Retrieved
from http://sites.google.com/a/upou.edu.ph/edde-221/files/Savage
%26Vogel_Intro.Digital.Multimedia.Text.PWD.pdf?attredirects=0&d=1
 Schnotz, W., & Rasch, T. (2002). Enabling, Facilitating, and Inhibiting Effects in
Learning from Animated Pictures. Proceedings of the International Workshop on
Dynamic Visualizations and Learning. Retrieved from http://www.iwm-
kmrc.de/workshops/visualization/schnotz.pdf
 Sweller, J. (2002). Visualisation and Instructional Design. Proceedings of the
International Workshop on Dynamic Visualizations and Learning, 1501-1510.
 Tam, K. (2006). Digital typography : a primer. Retrieved
from http://keithtam.net/documents/keithtam_digital_type_primer.pdf
 Tuovinen, J. E. (2001). Cognition research basis for instructional multimedia.
In Design and management of multimedia information systems (pp. 323-335). IGI
Publishing. Retrieved from http://books.google.com/books?id=-XF--
zCwhiEC&lpg=PP1&pg=PA146#v=onepage&q=&f=false
 United States Library of Congress. (2009). Alphabetical List of Audio Formats.
Retrieved January 29, 2010,
from http://www.digitalpreservation.gov/formats/fdd/browse_list.shtml
 Uttal, D. H., & O'Doherty, K. (2008). Comprehending and Learning from
'Visualizations': A Developmental Perspective. In J. K. Gilbert, M. Reiner, & M.
Nakleh (Eds.), Visualization: Theory and Practice in Science Education, Models
and Modeling in Science Education (Vol. 3, pp. 53-72). Springer. Retrieved
from http://books.google.com/books?id=35Ik6jgavIIC&source=gbs_navlinks_s
 Warde, B. (1956). The crystal goblet, or printing should be invisible. The Crystal
Goblet: Sixteen Essays on Typography. New York. Retrieved
from http://www.d.umn.edu/~jjacobs1/PhD/papers/Warde/THE%20CRYSTAL
%20GOBLET.pdf
Module 3

Uses of Multimedia

Introduction

In this module, we enumerate some multimedia tools and multimedia products and
classify them according to some useful typologies. Before we do, recall how the course
introduction emphasized that EDDE 221 would concentrate on design and evaluation of
multimedia for learning objects instead of the role that multimedia plays in the three
other stages of planning and delivering instructional material. However, you should keep
in the back of your mind that multimedia does play a role along multiple levels, and
many of the tools and products discussed in this module are significant in other stages
of design and delivery of instructional material.

In Module 1, we distinguished the tools for producing multimedia from multimedia
products themselves, but we also saw how multimedia products can be used to produce
other multimedia products. Module 1 also emphasized the distinction between "true"
multimedia and the use of the term "multimedia" to cover various kinds of digital media.
In this module and for most of the course, little or no distinctions will be made between
multimedia tools and products. Further, we will make little or no distinctions between
true multimedia and particular kinds of digital media.

Digital tools and products in education

Classifying tools and products by media used


A bewildering array of digital tools and products faces today's instructional material
developers. One way to get a handle on all of the choices is to classify these tools and
products into useful categories.

The most obvious classification scheme organizes tools and products according to the
various types of media that they use. Table 1 does precisely this and lists the digital
tools and products mentioned in Appendix 3 of Beetham & Sharpe (2007), which we
discuss at more length in the next section. See Table 1 before proceeding.

However, this classification scheme is of limited usefulness to instructional material
developers, because it doesn't provide enough guidance on effective use. When should
audio be used? Why should learners be required to keep a blog? In which
circumstances does a wiki support learning? As Laurillard (2003) notes, "[a]
classification system that starts by classifying what there is will fail to address a
pedagogical ideal."
Classifying tools and products by services they provide: Roblyer's Typology
In a detailed and comprehensive (but, unfortunately, visually ill-designed) manual for
technology integration, M.D. Roblyer prefaces a discussion on instructional software by
providing more recent alternatives to the term computer-assisted instruction (CAI),
which was used in the "early days... when instructional software was used primarily to
tutor students [instead of being used to] support, rather than actually deliver, instruction"
(Roblyer, 77:2005). More recent variants include computer-based instruction, computer-
based learning, computer-assisted learning, or software learning tools. Roblyer also
makes a distinction between specialized instructional software and the "basic three
software tools": word processing, spreadsheet, and database management software
(Roblyer, 120:2005). Table 2 summarizes chapter 3 and 4 of Roblyer's book that
discusses each type of software corresponding to her typology.

Table 2. Roblyer's classification of instructional software, based on Chapters 3 and 4 of Roblyer (2005). Each entry gives the type of instructional software, its description, some sub-types, and an example.

Drill-and-practice software functions
Description: "Provides exercises in which students work example items, usually one at a time, and receive feedback for their correctness."
Sub-types: flash card activities, branching drills, extensive feedback activities
Example: MacGAMUT® (www.macgamut.com)

Tutorial software functions
Description: "An entire instructional sequence similar to a teacher's classroom instruction on a topic. This instruction usually is expected to be a self-instructional unit, rather than a supplement to other instruction."
Sub-types: linear tutorials, branching tutorials
Example: BasicAlgebraShapeup® (www.meritsoftware.com/software/basic_algebra_shape_up/index.php)

Simulation software functions
Description: "A simulation is a computerized model of a real or imagined system that is designed to teach how the system works. Unlike tutorial and drill-and-practice activities in which the teaching structure is built into the package, learners usually must choose tasks to do and the order in which to do them."
Sub-types: simulations that teach something (physical simulations, iterative simulations); simulations that teach how to do something (procedural simulations, situational simulations)
Example: SimCity® (simcitysocieties.ea.com)

Instructional game software functions
Description: "Instructional games are software designed to increase motivation by adding game rules and/or competition to learning activities."
Example: TheBloodTypingGame (www.nobelprize.org/educational_games/medicine/landsteiner/landsteiner.html)

Problem-solving software functions
Description: "Although simulations and instructional games are often used to help teach problem-solving skills, problem-solving software is designed especially for this purpose."
Sub-types: content-area skills, content-free skills
Example: AlienRescue (alienrescue.edb.utexas.edu/about.php)

Integrated learning systems
Description: "ILSs are systems that offer computer-based instruction and other resources to support instruction, along with summary reports of student progress through the instruction; all are provided through networked or online sources."
Example: PLATO® (www.plato.com)

Most of these tools should be familiar to you. Integrated learning systems as Roblyer
has defined them, however, may not be. ILSs are similar to course management
software such as Moodle, but broader in scope. They can come prepackaged with
instructional content, including instructional objectives, lessons integrated into standard
curricula, educational software for each grade level, and a management system (Bailey
and Lumley, 1991, cited in Roblyer, 2005). A look through the PLATO website, for
example, tells you that PLATO software solutions tailor the ILS they deliver to their
USA-based clients to match state and national standards, including standardized tests.

Roblyer also gives a detailed overview in Chapter 4 of her book of how to use what she
calls the “basic three” software (spreadsheet software, word processing software, and
database software) for instructional purposes. In Chapter 5, she discusses software that
is not directly used in instruction but can be used to support teaching and learning in
other ways, including the following:

 Data collection and analysis software
 Graphics software
 Planning and organizing software
 Research and reference software
 Software for supporting specific content areas

You can visit the companion website of Roblyer's book by going to the URL listed in the
Bibliography section. The website contains many additional resources and links that you
might find useful for integrating technology in your practice.
Classifying tools and products by the learning experiences they support (Laurillard's
media forms)
In her 2003 book, Rethinking University Teaching: A Conversational Framework for the
Effective Use of Learning Technologies, Diana Laurillard discusses how educational
technology can support learning through the lens of what she calls a conversational
framework. Under this framework, learning is refined through a continuous process of
dialogue, which can take place between teacher and student or through "the student's
own internal dialogue" (Laurillard, 2003: 88). A diagram from the book, representing the
activities embedded in the framework, is reprinted as Figure 1.

Figure 1. The Conversational Framework (from Laurillard 2003)


Extensive discussion of this framework is beyond the scope of the course. What is
important for us to note is that out of this typology, Laurillard classifies digital tools and
products as narrative, interactive, communicative, adaptive, or productive. (To
avoid confusion, "media types" will refer to text, audio, graphics, animation, and video,
while "Laurillard media forms" will refer to her typology.) Each of the Laurillard media
forms supports a particular set of learning experiences, as outlined in Table 3:

Table 3. Five principal media forms with the learning experiences they support and the
methods used to deliver them (from Laurillard, 2003: 90)

 Narrative media forms support attending and apprehending; methods/technologies: print, TV, video, DVD.
 Interactive media forms support investigating and exploring; methods/technologies: library, CD, DVD, Web resources.
 Communicative media forms support discussing and debating; methods/technologies: seminar, online conference.
 Adaptive media forms support experimenting and practising; methods/technologies: laboratory, field trip, simulation.
 Productive media forms support articulating and expressing; methods/technologies: essay, product, animation, model.
Note that in Laurillard's typology, particular methods/technologies can be used to deliver a variety of media types. For instance, print-based forms, CDs, and Web resources can all deliver text. Furthermore, significant overlaps can be found between the methods enumerated. Essays, for instance, can be drafted using, submitted through, and assessed over the Web. Sophisticated digital and online tools for creating models are available to today's learners. Note also that when Rethinking University Teaching was published in 2003, the possibility of online and digital spaces running parallel to, and imitating, the physical world had not been articulated and explored as fully as it is now. It is now entirely reasonable to use the Web as a method for facilitating almost all the learning experiences noted by Laurillard. Visiting a virtual museum can be an important adjunct, or even a reasonable substitute, to a trip to a physical museum. New kinds of digital literacies enable an increasingly large number of people to express their ideas online (see, for example, McCann 2008). The convergence of various technologies is partly responsible for this shift, typified by gaming consoles that can browse the Web, mobile phones that can display PDF documents, and laptops that can be fitted with SIM cards to enable them to send and receive SMS messages.

Beetham & Sharpe expand Laurillard's table in Appendix 3 of their book, Rethinking Pedagogy for a Digital Age: Designing and Delivering E-Learning (2007). In addition to listing more digital tools and products, Beetham & Sharpe summarize key points in Laurillard's discussion of media forms. An evaluation of narrative media forms, for example, should take into account not only learners' ability to understand representations of information but also their ability to create new representations.

Activity 3.1: Before continuing, read “Appendix 3: A typology of digital tools and resources for learning” from Rethinking Pedagogy for a Digital Age: Designing and Delivering E-Learning by Helen Beetham and Rhona Sharpe (2007) at www.jisc.ac.uk/uploaded_documents/Technologies%20typology.doc. Then answer the following questions for yourself:

 How does this Appendix fill in knowledge gaps left in Table 2?
 Beetham & Sharpe list both advantages and risks posed by each
Laurillard media form. Which of these have you encountered in your own
practice?
 Recall the digital tools and products you have used in developing
instructional material in the past. Looking at the mapping between the
Laurillard media forms (which the authors list as "technology type") and
digital tools and products ("electronic and mobile examples"), do you feel
that you have effectively used digital tools and products to facilitate
learning experiences? How would you revise choices you have made in
selecting and deploying multimedia for instructional material design and
delivery, based on the information presented in this Appendix?

There are several points to be made about Laurillard's classification scheme and
Beetham & Sharpe's use of it in their own typology. Of all the media forms they discuss,
Beetham & Sharpe's summary of the advantages, risks, and examples of
communicative media forms is the most detailed, comprehensive, and straightforward.
The other forms, however, seem constantly to be on the verge of slipping into each other's areas of responsibility, so to speak, a slippage caused mostly by the ever-changing definitions of interaction and information.
Narrative media and productive media
In Laurillard's original formulation, narrative media is not only linear (which is what makes it narrative) and often time-based; information also flows in only one direction, from teacher to student. (Laurillard is highly critical of narrative media that prevent users from controlling the speed and manner of learning; live lectures are the worst, in her analysis, because the student is forced to sit through them.) Beetham & Sharpe have relaxed Laurillard's definition of narrative media and merely require the media to be representative of something. They have also introduced a further distinction within narrative media: narrative media for reception (where information flows from teacher to student) and narrative media for production (where information flows from student to teacher). "Production", in the sense of narrative media, pertains to the
ability to produce media, and should not be confused with Laurillard's productive media
form, which Beetham & Sharpe have instead applied to refer to systems that manipulate
data. In fact, "productive media" seems to be a misnomer for what could be more simply
called data manipulation media or technologies.
Interactive media and adaptive media
Laurillard's original definition of interactive media is predicated on interactivity as the
ability to navigate hypermedia and hyperlinked documents. The obvious challenge of using hypermedia is avoiding the "information overload" that results from presenting the student with too many links (Beetham & Sharpe 2007; Parlangeli et al. 1999). But the more subtle danger
hinges on the student's reliance on received structures in hypermedia and hyperlinked
documents. Laurillard observes pointedly:

"[W]e do not typically create the links. We follow the links created for us... The
presentational qualities of hypermedia are better suited to the focused, goal-oriented
gathering of information and ideas by the student who has their own narrative in mind...
[W]ithin an educational experience provided by a non-linear narrative medium, such as
hypermedia, we must take care to help learners maintain their own narrative line"
(Laurillard, 2003)

However, Beetham & Sharpe have taken interactivity to be closer to the idea
of information retrieval and have located navigation as a concern in narrative media
forms. This makes some sense because navigation through an information space, after
all, is always a critical issue regardless of whether the information space is linear,
hierarchical, or heterogeneously connected (i.e., rhizomatic qua Deleuze and Guattari). This is why Beetham & Sharpe have classified webpages (which are almost always hyperlinked in nature) as narrative rather than interactive in form.

Though Beetham & Sharpe define interaction in a broader sense than does Laurillard (for them, interaction "return[s] information based on user input" (Beetham & Sharpe, 2007)), both rely on a specific definition of information, in that the information returned, in essence, has to be text- or graphics-based and has to satisfy some kind of problem or query. But information does not have to be restricted as such. To illustrate, consider the virtual-reality-based surgery simulations through haptic interfaces described by Arbabtafti et al. (2008), among others. A haptic interface provides force feedback to a user; in this case,
a user learns to wield a specially-equipped "surgery blade" that responds to a virtual
model of a bone and that provides the user with the appropriate amount of resistance as
she cuts through virtual skeletal structures. Using this highly-specialized learning tool,
surgeons learn to perform bone surgery in a safe and realistic manner.

This simulation (which is what this tool is) would be classified as an example of adaptive
media. But if you think about it, the system certainly returns information (in this case,
force feedback) in response to user input (user movement). However, this tool cannot
be considered narrative or representational. It should be noted that in the fields of user
interface design and computing science, what Laurillard and Beetham & Sharpe call
adaptive media can be collectively referred to as immersive (or virtual) environments.
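To make the idea of information-as-force-feedback concrete, here is a minimal sketch of a generic haptic feedback rule. This is not Arbabtafti et al.'s voxel-based method; it simply assumes a spring-like model in which resistance grows with how far the virtual blade has penetrated the virtual bone, and the function name and stiffness value are hypothetical.

    # Illustrative sketch only, not Arbabtafti et al.'s method: a spring-like model in which
    # the force returned by the haptic device grows with the depth of the virtual cut.
    def haptic_feedback_force(penetration_depth_mm, stiffness_n_per_mm=2.5):
        """Return the resisting force (in newtons) for a given penetration depth."""
        if penetration_depth_mm <= 0.0:
            return 0.0  # the tool is not touching the virtual bone, so no resistance
        return stiffness_n_per_mm * penetration_depth_mm

    # The deeper the cut, the stronger the resistance the learner feels through the device.
    for depth_mm in (0.0, 0.5, 2.0):
        print(depth_mm, "mm ->", haptic_feedback_force(depth_mm), "N")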

Consider also interactive assessment systems that "adapt" to student performance.
Moodle, for example, supports adaptive questions "that can change themselves in
response to a student's answer" (Moodle.org, n.d.). The online documentation for
Moodle quotes the IMS Question and Test Interoperability specification in defining
adaptive questions:

"An adaptive item is an item that adapts either its appearance, its scoring (Response
Processing) or both in response to each of the candidate's attempts. For example, an
adaptive item may start by prompting the candidate with a box for free-text entry but, on
receiving an unsatisfactory answer, present a simple choice interaction instead and
award fewer marks for subsequently identifying the correct response. Adaptivity allows
authors to create items for use in formative situations which both help to guide
candidates through a given task while also providing an outcome that takes into
consideration their path." (Ibid.)

Systems that support adaptive quizzes cannot easily be placed in the same category
as virtual worlds and simulations. But they are clearly more sophisticated than simple
computer-assisted assessment systems.
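As a rough illustration of the adaptive-item behavior quoted above, here is a minimal sketch of an item that first prompts for free-text entry and, on an unsatisfactory answer, falls back to a simple choice interaction worth fewer marks. This is not Moodle's actual implementation; the function name and mark values are hypothetical.

    # Minimal sketch of an adaptive item (not Moodle's actual code): free-text first,
    # then a multiple-choice fallback that awards fewer marks. Values are hypothetical.
    def run_adaptive_item(correct_answer, choices, attempts):
        """Return the marks awarded after at most two attempts."""
        # Attempt 1: free-text entry, full marks if correct.
        if attempts and attempts[0].strip().lower() == correct_answer.lower():
            return 1.0
        # Attempt 2: simple choice interaction; a correct pick earns fewer marks.
        if len(attempts) > 1 and choices[int(attempts[1])] == correct_answer:
            return 0.5
        return 0.0

    marks = run_adaptive_item("photosynthesis", ["respiration", "photosynthesis"], ["osmosis", "1"])
    print(marks)  # 0.5: the candidate needed the fallback interaction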

A renaming of types
The previous discussion suggests an alternative set of nomenclature, listed in Table 4,
to Laurillard's classification scheme as applied specifically to Beetham & Sharpe's
typology. Laurillard's original nomenclature is confusing because the terms alternately describe what the technologies do (adapt to users, produce new data, interact with user queries), what the technologies produce in the act of using them (narratives), and what
the users can do with the technologies (communicate with other people).

Table 4. Proposed alternative nomenclature to be used in conjunction with Beetham & Sharpe's typology

Features | Laurillard media form | Alternative nomenclature
Systems that support representation (and navigation of those representations) in various forms | Narrative | Representational media
Systems that support communication between individuals and groups | Communicative | Communicative (or communication) technologies
Systems that return information based on user input | Interactive | Data retrieval systems + interactive assessment tools
Systems that manipulate data | Productive | Data manipulation tools
Systems that adapt continuously to user input (because they simulate real or complex conditions) | Adaptive | Immersive environments

Authoring + Assessment Tools

There is a category of instructional multimedia tools that is worth mentioning specifically: virtual learning environments or standalone software packages that allow you both to author mixed-media content and to create student assessment activities. The
virtual learning environment Moodle (www.moodle.org) and the software eXeLearning
(www.exelearning.org) are but two examples. You can download eXeLearning for a
variety of operating systems and experiment with developing instructional media with it.

A cross-typology list of examples of digital tools and products

Up to this point, we have been classifying technologies that could be used to deliver educational content. We have not looked at specific examples of each of them. However, Table 1 provides a brief discussion of each example mentioned in Beetham & Sharpe, and a few others not mentioned in their typology.

Partly for fun, but also partly to illustrate a particularly interesting tool, the information in
Table 1 has been incorporated in an interactive, hyperlinked, loosely-hierarchical
mindmap. It is recommended that you explore this mindmap and note for yourself the
advantages and risks associated with organizing information in such a manner, from the standpoint both of someone creating a mindmap and of someone reading another person's mindmap.

Activity 3.2: Modifying or creating a typology for digital tools

Now that this module has provided you with a typology (Beetham & Sharpe's application
of Laurillard's media forms) and a set of examples (listed in both Table 1 and
Appendix 1)...

 ... can you apply another existing typology or create a new typology that can
be used to classify the set of examples provided?
 ... can you add more digital tools and products to the typology discussed in
this module?

Bibliography and further reading
 Beetham, Helen, and Rhona Sharpe. 2007. Rethinking Pedagogy for a Digital Age: Designing and Delivering E-Learning. 1st ed. London: Routledge.
 Laurillard, Diana. 2002. Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies. 2nd ed. London: Routledge/Falmer.
 McCann, U. 2008. “Universal McCann Social Media Tracker Wave 3.” Universal McCann, New York. Available at http://www.universalmccann.com/Assets/2413%20-%20Wave%203%20complete%20document%20AW%203_20080418124523.pdf
 Barrett, H. 2004. “Differentiating electronic portfolios and online assessment management systems.” In Proceedings of the 2004 Annual Conference of the Society for Information Technology in Teacher Education. Available at http://electronicportfolios.com/portfolios/SITE2004paper.pdf
 Arbabtafti, M. et al. 2008. “Haptic and Visual Rendering of Virtual Bone Surgery: A Physically Realistic Voxel-Based Approach.” In IEEE International Workshop on Haptic Audio Visual Environments and Games (HAVE 2008), pp. 30–35. Available at http://abbas-rahimi.com/index_files/Haptic_and_Visual_Rendering.pdf
 Parlangeli, Oronzo, Enrica Marchigiani, and Sebastiano Bagnara. 1999. “Multimedia systems in distance education: effects of usability on learning.” Interacting with Computers 12(1): 37-49. Available at http://sites.google.com/a/upou.edu.ph/edde-221/files/Parlangelietal.-1999-Multimediasystemsindistanceeducationeffects.pdf?attredirects=0
 Moodle.org. n.d. “Adaptive questions.” Available at http://docs.moodle.org/en/Adaptive_questions
 Roblyer, M. D. 2005. Integrating Educational Technology into Teaching. 4th ed. Prentice Hall. Companion website available at http://wps.prenhall.com/chet_roblyer_integrate_4/

 Authoring Tool (Digital and Video cameras)


Definition

Multimedia authoring is the process of assembling different types of media content, such as text, audio, images, animation, and video, into a single stream of information with the help of various software tools available in the market. Multimedia authoring tools provide an integrated environment for joining together the different elements of a multimedia production. They provide the framework for organizing and editing the components of a multimedia project, and they enable the developer to create interactive presentations by combining text, audio, video, graphics, and animation.

Features of Authoring Tools

 Editing Features- Most authoring environments and packages provide the capability to create, edit, and transform the different kinds of media that they support. For example, Macromedia Flash comes bundled with its own sound editor. This eliminates the need to buy dedicated software to edit sound data. Authoring systems therefore include editing tools to create, edit, and convert multimedia components such as animation and video clips.
 Organizing Features- The process of organizing, designing, and producing multimedia involves navigation diagrams, storyboarding, and flowcharting. Some authoring tools provide a system of visual flowcharting or an overview facility to show your project's structure at a macro level. Navigation diagrams help to organize a project. Many web-authoring programs like Dreamweaver include tools that create helpful diagrams and links among the pages of a website.
 Visual programming with icons or objects- This is the simplest and easiest authoring approach. For example, if you want to play a sound, you simply click on its icon.
 Programming with a scripting language- Authoring software offers the ability to write scripts that build features not supported by the software itself. With scripts you can perform computational tasks, sense and respond to user input, create characters and animation, launch other applications, and control external multimedia devices.
 Document Development tools- Some authoring tools offer direct importing of pre-formatted text, indexing facilities, complex text search mechanisms, and hypertext linking tools.
 Interactivity Features- Interactivity empowers the end users to control the content and flow of information of the project. Authoring tools may provide one or more levels of interactivity.
 Simple branching- Offers the ability to go to another section of the multimedia production.
 Conditional branching- Supports branching based on the result of an IF-THEN decision or event.
 Playback Features- When you are developing a multimedia project, you will continually be assembling elements and testing to see how the assembly looks and performs. An authoring system should therefore have a playback facility.
 Supporting CD-ROM or Laser Disc Sources- This feature allows overall control of CD-ROM drives and laser disc players so that audio, video, and computer files can be integrated; such sources are directly controlled by the authoring program.
 Supporting Video for Windows- Video stored on the hard disk is often the right medium for a project, and authoring software should be able to incorporate standard video formats such as Video for Windows.
 Hypertext- Hypertext capabilities can be used to link graphics, some animation, and other text. The Windows help system is an example of hypertext. Such systems are very useful when a large amount of textual information is to be represented or referenced.
 Cross-Platform Capability- Some authoring programs are available on several platforms and provide tools for transforming and converting files and programs from one platform to the other.
 Run-time Player for Distribution- Run-time software is often included in authoring software to simplify the distribution of your final product by packaging playback software with the content. Some advanced authoring programs provide special packaging and run-time distribution for use with devices such as CD-ROM.
 Internet Playability- Because the Web has become a significant delivery medium for multimedia, authoring systems typically provide a means to convert their output so that it can be delivered within the context of HTML or DHTML.

Authoring Tools Classification

Card or Page based authoring tools
In these authoring systems, elements are organized as pages of a book or a stack of cards. Thousands of pages or cards may be available in the book or stack. These tools are best used when the bulk of your content consists of elements that can be viewed individually, for example the pages of a book or cards in a card file. You can jump from page to page because all pages can be interrelated. In the authoring system you can organize pages or cards in a sequential manner. Every page of the book may contain many media elements like sounds, videos, and animations.
One page may have a hyperlink to another page that comes at a much later stage, and by clicking on it you effectively skip several pages in between.
Some examples of card or page based tools are:
 HyperCard (Mac)
 ToolBook (Windows)
 PowerPoint (Windows)
 SuperCard (Mac)
Advantages
Following are the advantages of card based authoring tools.
 Easy to understand.
 One screen is equal to one card or one page.
 Easy to use, as these tools provide templates.
 Short development time.
Disadvantages
Following are the disadvantages of card based authoring tools.
 Some run only on one platform.
 Tools are not as powerful as equivalent standalone tools.
Icon based or Event driven authoring tools
Icon-based tools give a visual programming approach to organizing and presenting
multimedia. First you build a structure or flowchart of events, tasks and decisions by
dragging appropriate icons from a library. Each icon does a specific task, for example playing a sound or opening an image. The flowchart graphically displays the project's logic. When the structure is built you can add your content: text, graphics, animation, video, and sound. A nontechnical multimedia author can also build sophisticated applications without scripting using icon-based authoring tools. Some
examples of icon based tools are:
 Authorware Professional (Mac/Windows)
 Icon Author (Windows)
Advantages:
Following are the advantages of icon/event based authoring tools.
 Clear Structure.
 Easy editing and updating
Disadvantages:
Following are the disadvantages of icon/event based authoring tools.
 Difficult to learn.
 Expensive.
Time based authoring tools
Time-based authoring tools allow the designer to arrange the various elements and events of the multimedia project along a well-defined timeline. By timeline, we simply mean the passage of time. As time advances from the starting point of the project, the events begin to occur, one after another. The events may include media file playback as well as transitions from one portion of the project to another. The speed at which these transitions occur can also be accurately controlled. These tools are best used for projects in which the information flow is directed from beginning to end, much like a movie. Some examples of time-based tools are:
 Macromedia's Director
 Macromedia Flash
Advantages
Following are the advantages of time based authoring tools.
 Good for creating animation.
 Branching, user control, interactivity facilities.
Disadvantages
Following are the disadvantages of time based authoring tools.
 Expensive
 Large file size
 Steep learning curve to understand various features.
Object-Oriented authoring tools:
Object-oriented authoring tools support an environment based on objects. Each object has the following two characteristics:
1. State or Attributes - The state or attributes refer to the built-in characteristics of an object. For example, a color TV has the following attributes:
o Color receiver
o Volume control
o Picture control
o 128 channels
o Remote control unit
2. Behavior or Operations - The behavior or operations of an object refer to its actions. For example, a TV can behave in any of the following ways at a given point in time:
o Switched on
o Switched off
o Displays picture and sound from
 A TV cable connection
 A TV transmitter
 A DVD
 A VCR
In these systems, multimedia elements and events are often treated as objects that live in a hierarchical order of parent and child relationships. These objects use messages passed among them to do things according to the properties assigned to them. For example, a video object will likely have a duration property (how long the video plays) and a source property (the location of the video file). This video object will likely accept commands from the system such as play and stop. Some examples of object-oriented tools are:
o mTropolis (Mac/Windows)
o Apple Media Tool (Mac/Windows)
o Media Forge (Windows)

 Image File Format and Sources of Photos and Graphics


An image consists of a rectangular array of dots called pixels. The size of the image is specified in terms of width x height, in numbers of pixels. The physical size of the image, in inches or centimeters, depends on the resolution of the device on which the image is displayed. Resolution is usually measured in DPI (dots per inch). An image will appear smaller on a device with a higher resolution than on one with a lower resolution. For color images, one needs enough bits per pixel to represent all the colors in the image. The number of bits per pixel is called the depth of the image.

Image data types

Images can be created using different data representation techniques, called data types, such as monochrome and colored images. A monochrome image is created using a single color, whereas a colored image is created using multiple colors. Some important image data types are the following:
 1-bit images- An image is a set of pixels. Note that a pixel is a picture element in a digital image. In 1-bit images, each pixel is stored as a single bit (0 or 1). A bit has only two states: on or off, white or black, true or false. Therefore, such an image is also referred to as a binary image, since only two states are available. A 1-bit image is also known as a 1-bit monochrome image because it contains one color: black for the off state and white for the on state.
A 1-bit image with resolution 640 x 480 needs a storage space of 640 x 480 bits = (640 x 480) / 8 bytes = (640 x 480) / (8 x 1024) KB = 37.5 KB.
The clarity or quality of a 1-bit image is very low.
 8-bit Gray level images- Each pixel of an 8-bit gray level image is represented by a single byte (8 bits). Therefore each pixel of such an image can hold 2^8 = 256 values between 0 and 255, and each pixel has a brightness value on a scale from black (0 for no brightness or intensity) to white (255 for full brightness or intensity). For example, a dark pixel might have a value of 15 and a bright one might be 240.
A grayscale digital image is an image in which the value of each pixel is a single
sample, which carries intensity information. Images are composed exclusively of
gray shades, which vary from black being at the weakest intensity to white being
at the strongest. Grayscale images carry many shades of gray from black to
white. Grayscale images are also called monochromatic, denoting the presence
of only one (mono) color (chrome). An image is represented by a bitmap. A bitmap is a simple matrix of the tiny dots (pixels) that form an image and are displayed on a computer screen or printed.
An 8-bit image with resolution 640 x 480 needs a storage space of 640 x 480 bytes = (640 x 480) / 1024 KB = 300 KB. Therefore an 8-bit image needs 8 times more storage space than a 1-bit image.
 24-bit color images - In a 24-bit color image, each pixel is represented by three bytes, usually representing RGB (Red, Green and Blue). Usually true color is defined to mean 256 shades each of Red, Green, and Blue, for a total of 16,777,216 color variations. It provides a method of representing and storing graphical image information in an RGB color space such that a large number of colors, shades, and hues can be displayed in an image, as in high-quality photographic images or complex graphics.
Many 24-bit color images are stored as 32-bit images, with the extra byte for each pixel used to store an alpha value representing special-effect information.
A 24-bit color image with resolution 640 x 480 needs a storage space of 640 x 480 x 3 bytes = (640 x 480 x 3) / 1024 KB = 900 KB without any compression. Likewise, a 32-bit color image with resolution 640 x 480 needs a storage space of 640 x 480 x 4 bytes = 1,200 KB without any compression.
Disadvantages
o Require large storage space
o Many monitors can display only 256 different colors at any one time.
Therefore, in this case it is wasteful to store more than 256 different
colors in an image.
 8-bit color images - 8-bit color graphics is a method of storing image information in a computer's memory or in an image file, where one byte (8 bits) represents each pixel. The maximum number of colors that can be displayed at once is 256. 8-bit color graphics come in two forms. In the first form, the image stores not the full 24-bit color value but an 8-bit index into a color map for each pixel. Therefore, such 8-bit image formats consist of two parts: a color map describing the colors present in the image and the array of index values for each pixel in the image. In most color maps each color is usually chosen from a palette of 16,777,216 colors (24 bits: 8 red, 8 green, 8 blue).
In the other form, the 8 bits use 3 bits for red, 3 bits for green, and 2 bits for blue. This second form is often called 8-bit true color, as it does not use a palette at all. When a 24-bit full-color image is turned into an 8-bit image, some of the colors have to be eliminated, a process known as color quantization.
An 8-bit color image with resolution 640 x 480 needs a storage space of 640 x 480 bytes = (640 x 480) / 1024 KB = 300 KB without any compression. (These storage figures are recomputed in the short code sketch after this list.)
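The storage figures worked out above can be recomputed with a few lines of Python; this is just the arithmetic from the preceding examples, wrapped in a hypothetical helper function.

    # Recomputes the uncompressed storage requirements worked out above for a
    # 640 x 480 image at different bit depths (1 KB = 1024 bytes).
    def image_storage_kb(width, height, bits_per_pixel):
        total_bits = width * height * bits_per_pixel
        return total_bits / 8 / 1024

    for depth in (1, 8, 24, 32):
        print(f"{depth}-bit 640x480 image: {image_storage_kb(640, 480, depth):g} KB")
    # Prints 37.5 KB, 300 KB, 900 KB and 1200 KB, matching the figures above.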

Color lookup tables

A color look-up table (LUT) is a mechanism used to transform a range of input colors into another range of colors. A color look-up table converts the logical color numbers stored in each pixel of video memory into physical colors, represented as RGB triplets, which can be displayed on a computer monitor. Each pixel of the image stores only an index value or logical color number. For example, if a pixel stores the value 30, the meaning is to go to row 30 in the color look-up table (LUT). The LUT is often called a palette.
Characteristics of a LUT are the following:
 The number of entries in the palette determines the maximum number of colors which can appear on screen simultaneously.
 The width of each entry in the palette determines the number of colors which the wider, full palette can represent.
A common example would be a palette of 256 colors; that is, the number of entries is 256, and thus each entry is addressed by an 8-bit pixel value. Each color can be chosen from a full palette with a total of 16.7 million colors; that is, each entry is 24 bits wide (8 bits per channel), giving 256 levels for each of the red, green and blue components, or 256 x 256 x 256 = 16,777,216 colors in total.
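A small sketch may help make the look-up mechanism concrete: each pixel stores only an index, and the display color comes from the corresponding row of the palette. The palette entries below are made up purely for illustration.

    # Sketch of indexed color: pixels store index values; the LUT (palette) maps each
    # index to the RGB triplet actually shown on screen. Palette values are made up.
    palette = {
        0: (0, 0, 0),          # black
        30: (255, 128, 0),     # orange
        255: (255, 255, 255),  # white
    }

    indexed_pixels = [0, 30, 30, 255]                   # logical color numbers per pixel
    rgb_pixels = [palette[i] for i in indexed_pixels]   # physical colors for the display
    print(rgb_pixels)
    # A pixel storing the value 30 is shown using whatever triplet sits in row 30 of the LUT,
    # so changing that one palette entry recolors every pixel that references it.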

Image file formats

 GIF- Graphics Interchange Format- The GIF format was created by CompuServe. It supports 256 colors. The GIF format is the most popular on the Internet because of its compact size. It is ideal for small icons used for navigational purposes and simple diagrams. GIF creates a table of up to 256 colors from a pool of 16 million. If the image has fewer than 256 colors, GIF can render the image without any loss of quality. When the image contains more colors, GIF uses algorithms to match the colors of the image with an optimum palette of 256 colors. Better algorithms search the image to find the optimum set of 256 colors.
Thus the GIF format is lossless only for images with 256 colors or fewer. In the case of a rich, true-color image, GIF may lose 99.998% of the colors. GIF files can be saved with a maximum of 256 colors. This makes it a poor format for photographic images.
GIFs can be animated, which is another reason they became so successful. Most animated banner ads are GIFs. GIFs allow single-bit transparency; that is, when you are creating your image, you can specify which color is to be transparent. This provision allows the background colors of the web page to show through the image.
 JPEG- Joint Photographic Experts Group- The JPEG format was developed by the Joint Photographic Experts Group. JPEG files are bitmapped images that store information as 24-bit color. This is the format of choice for nearly all photographic images on the Internet. Digital cameras save images in JPEG format by default. It has become the main graphics file format for the World Wide Web, and any browser can support it without plug-ins. In order to make the file small, JPEG uses lossy compression. It works well on photographs, artwork and similar materials but not so well on lettering, simple cartoons or line drawings. For photographs, JPEG images work much better than GIFs. Though JPEG images can be interlaced, the format lacks many of the other special abilities of GIFs, like animation and transparency; JPEGs really are best suited to photos.
 PNG- Portable Network Graphics- PNG is the only lossless format that web browsers support. PNG supports 8-bit, 24-bit, 32-bit and 48-bit data types. One version of the format, PNG-8, is similar to the GIF format, but PNG is superior to GIF: it produces smaller files and offers more options for colors. It also supports partial transparency. PNG-24 is another flavor of PNG, with 24-bit color support, allowing ranges of color akin to a high-color JPEG. PNG-24 is in no way a replacement format for JPEG, because it is a lossless compression format; this means that the file size can be rather big compared to a comparable JPEG. PNG also supports up to 48 bits of color information.
 TIFF- Tagged Image File Format- The TIFF format was developed by the Aldus Corporation in the 1980s and was later supported by Microsoft. The TIFF file format is a widely used bitmapped file format. It is supported by many image editing applications, software used by scanners, and photo retouching programs.
TIFF can store many different types of images, ranging from 1-bit and grayscale images to 8-bit color and 24-bit RGB images. TIFF files originally used lossless compression; today TIFF files can also use lossy compression according to the requirement. Therefore, it is a very flexible format. This file format is suitable when the output is printed. Multi-page documents can be stored as a single TIFF file, which is one reason this file format is so popular. The TIFF format is now used and controlled by Adobe.
 BMP- Bitmap- The bitmap file format (BMP) is a very basic format supported by most Windows applications. BMP can store many different types of images: 1-bit images, grayscale images, 8-bit color images, 24-bit RGB images, and so on. BMP files are typically uncompressed and are therefore not suitable for the Internet, although they can be compressed using lossless data compression algorithms.
 EPS- Encapsulated Postscript- The EPS format is a vector-based graphics format. EPS is popular for saving image files because it can be imported into nearly any kind of application. This file format is suitable for printed documents. The main disadvantage of this format is that it requires more storage compared to other formats.
 PDF- Portable Document Format- The PDF format supports vector graphics with embedded pixel graphics and many compression options. It is used when your document is ready to be shared with others or for publication, and it is platform independent. If you have Adobe Acrobat you can print from any document to a PDF file. From Illustrator you can save a file as PDF.
 EXIF- Exchangeable Image File- Exif is an image format for digital cameras. A variety of tags are available to facilitate higher-quality printing, since information about the camera and picture-taking conditions can be stored and used by printers for possible color-correction algorithms. It also includes a specification of the file format for audio that accompanies digital images.
 WMF- Windows MetaFile- WMF is the vector file format for the MS-Windows operating environment. It consists of a collection of graphics device interface function calls to the MS-Windows graphics drawing library. Metafiles are both small and flexible, but these images can be displayed properly only by their proprietary software.
 PICT- PICT images are useful in Macintosh software development, but you should avoid them in desktop publishing. Avoid using the PICT format in electronic publishing; PICT images are prone to corruption.
 Photoshop- This is the native Photoshop file format created by Adobe. You can import this format directly into most desktop publishing applications.

 Sources of Videos
TYPES OF VIDEO SIGNALS
Video signals can be organized in three different ways: Component video, Composite
video, and S - video.
Component Video
Component video is a video signal that has been split into two or more component channels. In popular use, it refers to a type of component analog video (CAV) information that is transmitted or stored as three separate signals. Component video can be contrasted with composite video (NTSC, PAL or SECAM), in which all the video information is combined into a single line-level signal that is used in analog television. Like composite video, component-video cables do not carry audio and are often paired with audio cables.
When used without any other qualification, the term component video generally refers to analog YPbPr component video with sync on luma.
Composite Video
Composite video (1 channel) is an analog video transmission (no audio) that carries
standard definition video typically at 480i or 576i resolution. Video information is
encoded on one channel, in contrast with slightly higher quality S-video (2 channels)
and even higher quality component video (3 channels).
Composite video is usually in standard formats such as NTSC, PAL, and SECAM and is
often designated by the CVBS initialism, meaning "Color, Video, Blanking, and Sync."
S-Video
Separate Video (2 channels), more commonly known as S-Video or Y/C, is an analog video transmission (no audio) that carries standard definition video, typically at 480i or 576i resolution. Video information is encoded on two channels: luma (luminance, intensity, "Y") and chroma (colour, "C"). This separation is in contrast with slightly lower quality composite video (1 channel) and higher quality component video (3 channels). It is often referred to by JVC (who introduced the DIN connector) as both an S-VHS connector and as Super Video.
The four-pin mini-DIN connector is the most common of several S-Video connector types. Other connector variants include seven-pin locking "dub" connectors used on many professional S-VHS machines, and dual "Y" and "C" BNC connectors, often used for S-Video patch panels. Early Y/C video monitors often used phono (RCA) connectors that were switchable between Y/C and composite video input. Though the connectors are different, the Y/C signals for all types are compatible.

Analog video is a video signal transferred by an analog signal. When combined into one channel, it is called composite video, as is the case, among others, with NTSC, PAL and SECAM.
Analog video may be carried in separate channels, as in two-channel S-Video (Y/C) and multi-channel component video formats.
Analog video is used in both consumer and professional television production applications. However, digital video signal formats with higher quality have been adopted, including serial digital interface (SDI), FireWire (IEEE 1394), Digital Visual Interface (DVI) and High-Definition Multimedia Interface (HDMI).
Most TV is still sent and received as an analog signal. Once the electrical signal is
received, we may assume that brightness is at least a monotonic function of voltage, if
not necessarily linear, because of gamma correction.
An analog signal f(t) samples a time-varying image. So-called progressive scanning traces through a complete picture (a frame) row-wise for each time interval. A high-resolution computer monitor typically uses a time interval of 1/72 second.
In TV and in some monitors and multimedia standards, another system, interlaced scanning, is used. Here, the odd-numbered lines are traced first, then the even-numbered lines. This results in "odd" and "even" fields; two fields make up one frame.
In fact, the odd lines (starting from 1) end up at the middle of a line at the end of the odd field, and the even scan starts at a half-way point. The following figure shows the scheme used. First the solid (odd) lines are traced, P to Q, then R to S, and so on, ending at T; then the even field starts at U and ends at V. The scan lines are not horizontal because a small voltage is applied, moving the electron beam down over time.
Interlaced raster scan
Interlacing was invented because, when standards were being defined, it was difficult to transmit the amount of information in a full frame quickly enough to avoid flicker. The doubled number of fields presented to the eye reduces perceived flicker.
Because of interlacing, the odd and even lines are displaced in time from each other. This is generally not noticeable except when fast action is taking place onscreen, when blurring may occur. For example, in the video in the following figure, the moving helicopter is blurred more than the still background.
Since it is sometimes necessary to change the frame rate, resize, or even produce stills from an interlaced source video, various schemes are used to de-interlace it. The simplest de-interlacing method consists of discarding one field and duplicating the scan lines of the other field, which results in the information in one field being lost completely. Other, more complicated methods retain information from both fields.
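The simplest method just described, discarding one field and doubling the lines of the other, can be sketched in a few lines; the frame here is just a list of scan lines, and the function name is hypothetical.

    # Sketch of the simplest de-interlacing method described above: keep one field
    # (every other scan line, counting from zero) and duplicate each kept line.
    def deinterlace_by_line_doubling(frame):
        result = []
        for i in range(0, len(frame), 2):   # take every other scan line (one field)
            result.append(frame[i])
            result.append(frame[i])         # duplicate it in place of the discarded line
        return result[:len(frame)]

    frame = ["line0", "line1", "line2", "line3"]
    print(deinterlace_by_line_doubling(frame))  # ['line0', 'line0', 'line2', 'line2']
    # The information carried by the discarded field is lost, exactly as noted above.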
CRT displays are built like fluorescent lights and must flash 50 to 70 times per second
to appear smooth. In Europe, this fact is conveniently tied to their 50 Hz electrical
system, and they use video digitized at 25 frames per second (fps); in North America,
the 60 Hz electric system dictates 30 fps.
The jump from Q to R and so on is called the horizontal retrace, during which the
electronic beam in the CRT is blanked. The jump from T to U or V to P is called
the vertical retrace.
Interlaced scan produces two fields for each frame: (a) the video frame; (b) Field 1; (c) Field 2; (d) difference of fields

Since voltage is one-dimensional (it is simply a signal that varies with time), how do we know when a new video line begins? That is, what part of an electrical signal tells us that we have to restart at the left side of the screen?
The solution used in analog video is a small voltage offset from zero to indicate black and another value, such as zero, to indicate the start of a line. Namely, we could use a "blacker-than-black" zero signal to indicate the beginning of a line.
The following figure shows a typical electronic signal for one scan line of NTSC composite video. "White" has a peak value of 0.714 V; "Black" is slightly above zero at 0.055 V; whereas Blank is at zero volts. As shown, the time duration for blanking pulses in the signal is used for synchronization as well, with the tip of the Sync signal at approximately -0.286 V. In fact, the problem of reliable synchronization is so important that special signals to control sync take up about 30% of the signal!
Electronic signal for one NTSC scan line

The vertical retrace and sync ideas are similar to the horizontal one, except that they
happen only once per field.
NTSC Video
NTSC, named for the National Television System Committee, is the analog television
system that is used in most of North America, parts of South America (except Brazil,
Argentina, Uruguay, and French Guiana), Myanmar, South Korea, Taiwan, Japan, the
Philippines, and some Pacific island nations and territories.
Most countries using the NTSC standard, as well as those using other analog television
standards, are switching to newer digital television standards, of which at least four
different ones are in use around the world. North America, parts of Central America, and
South Korea are adopting the ATSC standards, while other countries are adopting or
have adopted other standards.
The first NTSC standard was developed in 1941 and had no provision for color
television. In 1953 a second modified version of the NTSC standard was adopted, which
allowed color television broadcasting compatible with the existing stock of black - and -
white receivers. NTSC was the first widely adopted broadcast color system and
remained dominant where it had been adopted until the first decade of the 21st century,
when it was replaced with digital ATSC. After nearly 70 years of use, the vast majority of over-the-air NTSC transmissions in the United States were turned off on June 12, 2009, and on August 31, 2011 in Canada and most other NTSC markets.
Digital broadcasting permits higher - resolution television, but digital standard definition
television in these countries continues to use the frame rate and number of lines of
resolution established by the analog NTSC standard; systems using the NTSC frame
rate and resolution (such as DVDs) are still referred to informally as "NTSC". NTSC
baseband video signals are also still often used in video playback (typically of
recordings from existing libraries using existing equipment) and in CCTV and
surveillance video systems.
Video raster, including retrace and sync data
Samples per line for various analog video formats

Different video formats provide different numbers of samples per line, as listed in the table above. Laser disks have about the same resolution as Hi-8. (In comparison, mini DV 1/4-inch tapes for digital video are 480 lines by 720 samples per line.)
Interleaving Y and C signals in the NTSC spectrum
PAL Video
PAL (Phase Alternating Line) is a TV standard originally invented by German scientists. It uses 625 scan lines per frame, at 25 frames per second (or 40 msec/frame), with a 4:3 aspect ratio and interlaced fields. Its broadcast TV signals are also used in composite video. This important standard is widely used in Western Europe, China, India and many other parts of the world.

PAL uses the YUV color model with an 8 MHz channel, allocating a bandwidth of 5.5 MHz to Y and 1.8 MHz each to U and V. The color subcarrier frequency is fsc ≈ 4.43 MHz. To improve picture quality, chroma signals have alternate signs (e.g., +U and -U) in successive scan lines; hence the name "Phase Alternating Line". This facilitates the use of a (line-rate) comb filter at the receiver: the signals in consecutive lines are averaged so as to cancel the chroma signals (which always carry opposite signs), separating Y and C and obtaining high-quality Y signals.
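The comb-filter idea can be sketched numerically: if consecutive lines carry Y + C and Y - C (the chroma sign alternates), then averaging the two lines cancels the chroma and recovers Y, while subtracting them recovers C. This is a conceptual sketch only, assuming the picture content barely changes between the two lines; it is not a broadcast-accurate PAL decoder, and the sample values are made up.

    # Conceptual sketch of a line-rate comb filter, assuming consecutive lines share the
    # same luma and carry chroma with opposite signs.
    luma_true = [100, 120, 140]
    chroma_true = [10, -8, 6]

    line_a = [y + c for y, c in zip(luma_true, chroma_true)]  # Y + C
    line_b = [y - c for y, c in zip(luma_true, chroma_true)]  # Y - C (sign alternated)

    luma = [(a + b) / 2 for a, b in zip(line_a, line_b)]    # chroma cancels, leaving Y
    chroma = [(a - b) / 2 for a, b in zip(line_a, line_b)]  # luma cancels, leaving C

    print(luma)    # [100.0, 120.0, 140.0]
    print(chroma)  # [10.0, -8.0, 6.0]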
SECAM Video

SECAM, which was invented by the French, is the third major broadcast TV standard. SECAM stands for Systeme Electronique Couleur Avec Memoire. SECAM also uses 625 scan lines per frame, at 25 frames per second, with a 4:3 aspect ratio and interlaced fields. The original design called for a higher number of scan lines (over 800), but the final version settled for 625.
SECAM and PAL are similar, differing slightly in their color coding scheme. In SECAM, U and V signals are modulated using separate color subcarriers at 4.25 MHz and 4.41 MHz, respectively. They are sent in alternate lines; that is, only one of the U or V signals will be sent on each scan line.
Table Comparison of the analog broadcast TV systems.

DIGITAL VIDEO
Digital video comprises a series of orthogonal bitmap digital images displayed in rapid
succession at a constant rate. In the context of video these images are called frames.
We measure the rate at which frames are displayed in frames per second (FPS).
Since every frame is an orthogonal bitmap digital image, it comprises a raster of pixels. If it has a width of W pixels and a height of H pixels, we say that the frame size is W x H.
Pixels have only one property, their color. The color of a pixel is represented by a fixed
number of bits. The more bits the more subtle variations of colors can be reproduced.
This is called the color depth (CD) of the video.
An example video can have a duration (T) of 1 hour (3600 sec), a frame size of 640 x 480 (W x H), a color depth of 24 bits, and a frame rate of 25 fps. This example video has the following properties (recomputed in the short code sketch below):
 pixels per frame = 640 * 480 = 307,200
 bits per frame = 307,200 * 24 = 7,372,800 = 7.37 Mbits
 bit rate (BR) = 7.37 * 25 = 184.25 Mbits/sec
 video size (VS) = 184 Mbits/sec * 3600 sec = 662,400 Mbits = 82,800 Mbytes = 82.8 Gbytes
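The same arithmetic can be packaged as a small helper; the function below simply re-runs the calculation above for the 1-hour, 640 x 480, 24-bit, 25 fps example.

    # Uncompressed bit rate and total size for the example video described above.
    def uncompressed_video(width, height, color_depth_bits, fps, duration_sec):
        bits_per_frame = width * height * color_depth_bits
        bit_rate_bps = bits_per_frame * fps              # bits per second
        total_bits = bit_rate_bps * duration_sec
        return bit_rate_bps / 1e6, total_bits / 8 / 1e9  # (Mbit/s, gigabytes)

    mbps, gigabytes = uncompressed_video(640, 480, 24, 25, 3600)
    print(f"bit rate  : {mbps:.2f} Mbit/s")   # about 184.32 Mbit/s
    print(f"video size: {gigabytes:.1f} GB")  # about 82.9 GB (the 82.8 GB above uses a rounded bit rate)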
The advantages of digital representation for video are many. It permits
 Storing video on digital devices or in memory, ready to be processed (noise
removal, cut and paste, and so on) and integrated into various multimedia
applications
 Direct access, which makes nonlinear video editing simple
 Repeated recording without degradation of image quality
 Ease of encryption and better tolerance to channel noise

In earlier Sony or Panasonic recorders, digital video was in the form of composite video.
Modern digital video generally uses component video, although RGB signals are first
converted into a certain type of color opponent space, such as YUV. The usual color
space is YCbCr.
Chroma Subsampling
Since humans see color with much less spatial resolution than black and white, it makes
sense to decimate the chrominance signal. Interesting but not necessarily informative
names have arisen to label the different schemes used. To begin with, numbers are
given stating how many pixel values, per four original pixels, are actually sent. Thus the
chroma subsampling scheme "4:4:4" indicates that no chroma subsampling is used.
Each pixel's Y, Cb, and Cr values are transmitted, four for each of Y, Cb, and Cr.
The scheme "4:2:2" indicates horizontal subsampling of the Cb and Cr signals by a
factor of 2. That is, of four pixels horizontally labeled 0 to 3, all four 7s are sent, and
every two Cbs and two Crs are sent, as {CbO, Y0)(Cr0, Yl)(Cb2, Y2)(Cr2, Y3)(Cb4,
Y4), and so on.
The scheme "4:1:1" subsamples horizontally by a factor of 4. The scheme "4:2:0"
subsamples in both the horizontal and vertical dimensions by a factor of 2.
Theoretically, an average chroma pixel is positioned between the rows and columns, as
shown in the below figure. We can see that the scheme 4:2:0 is in fact another kind of
4:1:1 sampling, in the sense that we send 4, 1, and 1 values per 4 pixels. Therefore, the
labeling scheme is not a very reliable mnemonic!
Scheme 4:2:0, along with others, is commonly used in JPEG and MPEG.
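Counting samples per group of four pixels makes the savings of each scheme explicit; the sketch below only tallies the pixel counts quoted above and ignores how real codecs filter and position the chroma samples.

    # Y, Cb and Cr samples transmitted per 4 original pixels for the schemes described above.
    schemes = {
        "4:4:4": (4, 4, 4),
        "4:2:2": (4, 2, 2),
        "4:1:1": (4, 1, 1),
        "4:2:0": (4, 1, 1),  # subsampled by 2 both horizontally and vertically
    }

    for name, (y, cb, cr) in schemes.items():
        total = y + cb + cr
        print(f"{name}: {total} samples per 4 pixels ({total / 12:.0%} of full-resolution data)")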
CCIR Standards for Digital Video
The CCIR is the Consultative Committee for International Radio. One of the most
important standards it has produced is CCIR - 601, for component digital video. This
standard has since become standard ITU - R - 601, an international standard for
professional video applications. It is adopted by certain digital video formats, including
the popular DV video.
The NTSC version has 525 scan lines, each having 858 pixels (with 720 of them visible, not in the blanking period). Because the NTSC version uses 4:2:2, each pixel can be represented with two bytes (8 bits for Y and 8 bits alternating between Cb and Cr). The CCIR 601 (NTSC) data rate (including blanking and sync but excluding audio) is thus approximately 216 Mbps (megabits per second):
525 x 858 x 30 x 2 bytes x 8 bits/byte ≈ 216 Mbps
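The 216 Mbps figure can be verified directly with the same arithmetic, as in the short sketch below.

    # Verifying the CCIR 601 (NTSC) data rate quoted above: 525 lines x 858 samples per
    # line x 30 frames per second x 2 bytes per pixel x 8 bits per byte.
    lines, samples_per_line, fps, bytes_per_pixel = 525, 858, 30, 2
    bits_per_second = lines * samples_per_line * fps * bytes_per_pixel * 8
    print(f"{bits_per_second / 1e6:.0f} Mbps")  # 216 Mbps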
During blanking, digital video systems may make use of the extra data capacity to carry
audio signals, translations into foreign languages, or error-correction information.
The following table shows some of the digital video specifications, all with an aspect
ratio of 4:3. The CCIR 601 standard uses an interlaced scan, so each field has only half
as much vertical resolution (e.g., 240 lines in NTSC).
Table Digital video specifications
CIF stands for Common Intermediate Format, specified by the International Telegraph
and Telephone Consultative Committee (CCITT), now superseded by the International
Telecommunication Union, which oversees both telecommunications (ITU-T) and radio frequency matters (ITU-R) under one United Nations body. The idea of CIF, which is about the same as VHS quality, is to specify a format for a lower bitrate. CIF uses a progressive (noninterlaced) scan. QCIF stands for Quarter-CIF, and is for an even lower bitrate. All the CIF/QCIF resolutions are evenly divisible by 8, and all except 88 are divisible by 16; this is convenient for block-based video coding in H.261 and H.263.
CIF is a compromise between NTSC and PAL, in that it adopts the NTSC frame rate
and half the number of active lines in PAL. When played on existing TV sets, NTSC TV
will first need to convert the number of lines, whereas PAL TV will require frame-rate
conversion.
High Definition TV (HDTV)
The introduction of wide-screen movies brought the discovery that viewers seated near the screen enjoyed a level of participation (sensation of immersion) not experienced with conventional movies. Apparently the exposure to a greater field of view, especially the involvement of peripheral vision, contributes to the sense of "being there". The main thrust of High Definition TV (HDTV) is not to increase the "definition" in each unit area, but rather to increase the visual field, especially its width.
First-generation HDTV was based on an analog technology developed by Sony and NHK in Japan in the late 1970s. HDTV was successfully used to broadcast the 1984 Los Angeles Olympic Games in Japan. Multiple sub-Nyquist Sampling Encoding (MUSE) was an improved NHK HDTV with hybrid analog/digital technologies that was put in use in the 1990s. It has 1,125 scan lines, interlaced (60 fields per second), and a 16:9 aspect ratio. It uses satellite broadcasting, which is quite appropriate for Japan, which can be covered with one or two satellites.
The Direct Broadcast Satellite (DBS) channels used have a bandwidth of 24 MHz. In general, terrestrial broadcast, satellite broadcast, cable, and broadband networks are all feasible means for transmitting HDTV as well as conventional TV. Since uncompressed HDTV will easily demand more than 20 MHz of bandwidth, which will not fit in the current 6 MHz or 8 MHz channels, various compression techniques are being investigated. It is also anticipated that high-quality HDTV signals will be transmitted using more than one channel, even after compression.
Table: Advanced Digital TV Formats Supported by ATSC
In 1987, the FCC decided that HDTV standards must be compatible with the existing
NTSC standard and must be confined to the existing Very High Frequency (VHF) and
Ultra High Frequency (UHF) bands. This prompted a number of proposals in North
America by the end of 1988, all of them analog or mixed analog/digital.
In 1990, the FCC announced a different initiative: its preference for full-resolution
HDTV. They decided that HDTV would be simultaneously broadcast with existing NTSC
TV and eventually replace it. The development of digital HDTV immediately took off in
North America.
Witnessing a boom of proposals for digital HDTV, the FCC made a key decision to go
all digital in 1993. A "grand alliance" was formed that included four main proposals, by
General Instruments, MIT, Zenith, and AT&T, and by Thomson, Philips, Sarnoff and
others. This eventually led to the formation of the Advanced Television Systems
Committee (ATSC), which was responsible for the standard for TV broadcasting of
HDTV. In 1995, the U.S. FCC Advisory Committee on Advanced Television Service
recommended that the ATSC digital television standard be adopted.
The standard supports the video scanning formats shown in the table. In the table, "I" means interlaced scan and "P" means progressive (noninterlaced) scan. The frame rates supported are both integer rates and the NTSC rates, that is, 60.00 or 59.94, 30.00 or 29.97, 24.00 or 23.98 fps.
For video, MPEG-2 is chosen as the compression standard; it uses Main Level to High Level of the Main Profile of MPEG-2. For audio, AC-3 is the standard. It supports the so-called 5.1-channel Dolby surround sound: five surround channels plus a subwoofer channel.
The salient difference between conventional TV and HDTV [4, 6] is that the latter has a much wider aspect ratio of 16:9 instead of 4:3. (Actually, it works out to be exactly one-third wider than current TV.) Another feature of HDTV is its move toward progressive
(noninterlaced) scan. The rationale is that interlacing introduces serrated edges to
moving objects and flickers along horizontal edges.
The FCC has planned to replace all analog broadcast services with digital TV
broadcasting by the year 2006. Consumers with analog TV sets will still be able to
receive signals via an 8-VSB (8-level vestigial sideband) demodulation box. The services provided will include
 Standard Definition TV (SDTV): the current NTSC TV or higher
 Enhanced Definition TV (EDTV): 480 active lines or higher (the third and fourth rows of the table)
 High Definition TV (HDTV): 720 active lines or higher. So far, the popular choices are 720P (720 lines, progressive, 30 fps) and 1080I (1,080 lines, interlaced, 30 fps or 60 fields per second). The latter provides slightly better picture quality but requires much higher bandwidth.

 Animation, video and digital movies editing tools


(Computer Hardware for Animation and Multimedia)
Animations are graphic scenes played back sequentially and rapidly, and these tools adopt an object-oriented approach to animation. They enable you to edit and assemble video clips captured from cameras, animations and other sources. The completed clip, with added transitions and visual effects, can then be played back. Adobe Premiere and Media Shop Pro are two good examples of these tools.
Integrated Design Software. Multimedia authoring tools are tools which organize and edit your multimedia project. These tools are required to design the user interface for presenting the project to the learner. In other words, these tools are used to assemble various elements to make a single presentation. You can compose comprehensive videos and animations with these tools. There are four basic types of authoring tools, viz. page-based tools (like ToolBook, Visual Basic), icon-based tools (like Authorware), time-based tools (like Macromedia Director) and object-oriented tools (like Media Forge).

Page-based tools organize elements as pages of a book. These tools are used when
the content of the project consists of elements that can be viewed individually, and they
arrange those elements in a user-defined sequential form. Icon-based tools organize
elements as objects and display a flow diagram of activities along with branching paths.
Time-based tools organize the elements along a timeline and play back the sequentially
organized graphic frames at a user-set speed and time. Object-oriented tools organize the
elements in a hierarchical order as related "objects" and make these objects perform
according to the properties assigned to them (a minimal sketch of this idea follows below).
We will give here a brief description of two such tools: Authorware (icon-based) and
Macromedia Director (time-based).
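The object-oriented idea can be pictured with a small, purely hypothetical Python sketch: each media element is an object, and its behaviour at playback time follows from the properties assigned to it. The MediaObject class and its property names are invented for illustration and do not correspond to any real authoring tool.

# Hypothetical sketch of the object-oriented authoring idea:
# elements are objects whose behaviour follows from assigned properties.
class MediaObject:
    def __init__(self, name, x=0, y=0, visible=True, loop=False):
        self.name = name
        self.x, self.y = x, y        # assigned position properties
        self.visible = visible       # assigned visibility property
        self.loop = loop             # assigned playback property

    def perform(self):
        if not self.visible:
            return
        mode = "looping" if self.loop else "once"
        print(f"{self.name} plays {mode} at ({self.x}, {self.y})")

# Assemble a tiny presentation from related objects.
scene = [
    MediaObject("title_text", x=10, y=5),
    MediaObject("intro_animation", x=0, y=40, loop=True),
    MediaObject("hidden_note", visible=False),
]
for obj in scene:
    obj.perform()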

Macromedia Authorware has a visual interface in which you simply drag and drop icons to
create an application. You do not need to be a programmer to use this software, as it has an
interactive design. Authorware provides direct support for graphics and animations made in
Flash, and it can capture and integrate animations and video made in different programmes
such as Flash and QuickTime. It can integrate sound into your project in order to enhance
the effect. It has an antialiasing feature which smooths out the edges of text and graphics.
Authorware has built-in templates which give you flexibility and convenience while
developing your project. You can learn about basic authoring, editing and publishing with
the help of a multimedia tutorial which is built into the software.
Macromedia Director is a multimedia authoring application capable of producing
animations, presentations and movies. It provides a wide range of possibilities for
integrating different multimedia elements and supports inputs from programs like
Shockwave, Photoshop and Premiere. It has applications in building professional
multimedia presentations, and you can also integrate Real Audio and Real Video in Director
projects. Compatibility of Director with other packages means that you can use your
favorite tools and software to create content for your project and then bring that content
into Director for authoring.

Choosing Multimedia software


Multimedia is making a difference by providing ways of delivering learning materials that
are less expensive and more convenient. The key to any learning process is that it must be
relevant and it must keep the learner engaged, and educational multimedia is no exception;
this is evident from the growing use of graphics, illustrations, animations and sound in
educational material. It is therefore essential to choose software which enables you to
execute your project with the minimum possible effort and maximum possible productivity.
Multimedia software offers a vast range of features: you can choose among several hundred
colors, dozens of fonts, a wide variety of color-coordinated templates and many other
options. Before starting to select software, you should begin with an outline of the project
and decide what is expected from it. Table-2 gives a ready reckoner for selecting software.

Hardware is the first thing you need to begin a multimedia project. It is necessary to
interpret your commands, queries and responses into computer activity. You have read
about hardware components, viz. system devices, memory and storage devices, input
devices, output devices and communication devices. Fortunately, there is an abundance of
good hardware answers to almost every problem, and these areas are fast converging;
tomorrow you may well see further innovative steps in this direction which offer you even
better choices.

Similarly, in software too, entire suites of integrated production tools are now available.
The need is to use them judiciously to create good projects. Powerful features are
continuously being added to the software, allowing developers to work more smoothly and
conveniently between applications. The emergence of these integration features has
resulted in collaboration and unison of multiple tools; integration enables you to reuse
graphics from previous work and save time on rebuilding them. In short, the options
available are enormous. All you have to do is choose the right hardware and software to
complete your multimedia projects. In the next section we will discuss learner
characteristics in order to develop good programmes for them.
