
UNIT- I

1.1 Multimedia
Multimedia is an interactive medium that provides
multiple ways to present information to the user in a
powerful manner. It enables interaction between
users and digital information.
It is a medium of communication. Some of the sectors
where multimedia is used extensively are education,
training, reference material, business presentations,
advertising and documentaries.
Multimedia is a form of communication that combines
different content forms such as text, audio, images,
animations, or video into a single interactive presentation.

Fig. 1.1 - Multimedia


1.2 Components / Elements of Multimedia:

Fig. 1.2 Elements of Multimedia


The following are typical multimedia elements:
1) Text - Text appears in all multimedia projects to
some extent. To suit the presentation of the
multimedia program, text may be displayed in a
variety of font styles and sizes.

Fig. 1.3. Text


2) Graphics - Graphics make the multimedia
program appealing. People frequently find it
difficult to read long passages of text on screen.
As a result, visuals are frequently used instead of
text to convey ideas, give context, etc. Graphics
can be of two different types:
o Bitmap - Bitmap images are real-world pictures
captured using devices such as digital cameras
or scanners. Bitmap pictures are often not easily
modifiable, and their memory use is high.
o Vector Graphics - Vector graphics are drawn by
the computer and require only a small amount of
memory. These images can be easily modified.

Fig.1.4 Graphics
3) Animation - A static picture can be animated to
appear to be in motion. A continuous succession
of static images shown in order is all that makes
up an animation. Effective attention-getting may
be achieved by the animation. Additionally,
animation adds levity and appeal to a
presentation. In multimedia applications, the
animation is fairly common.

Fig.1.5 Animation
4) Audio - Speech, music, and sound effects may
all be required in a multimedia application.
They are referred to as the audio or sound
component of multimedia. Speech is a powerful
educational tool. Audio can be either analog or
digital: analog audio refers to the original sound
signal, while digital audio is sound stored on a
computer. Digital audio is therefore used for
sound in multimedia applications.

Fig.1.6 Audio
5) Video - The term "video" describes a moving
image that is supported by sound, such as a
television image. A multimedia application's
video component conveys a lot of information
quickly. For displaying real-world items in
multimedia applications, digital video is helpful.
Of all multimedia elements, video places the highest
demands on computer memory and bandwidth,
especially when delivered over the internet. Like other
data, digital video files can be stored on a computer
without loss of quality, transported over a computer
network, and easily edited.

Fig.1.7 Video

1.3 Applications of Multimedia


The typical areas where multimedia is applied are
listed below.
1) For entertainment purposes - Multimedia
marketing may significantly improve the promotion of
new items. Multimedia has opened doors for both
advertising and marketing staff by making
communication more economical. Flying banner
presentations, video transitions, animations, and audio
effects are just a few of the components utilized to
create a multimedia-based advertisement that appeals
to the customer in a brand-new way and encourages the
purchase of the goods.

2) For education purposes - There are currently a lot


of educational computer games accessible. Take a look
at an illustration of an educational app that plays
children's rhymes. In addition to merely repeating
rhymes, the youngster may create drawings, scale
items up or down, and more. There are many more
multimedia products on the market that provide
children with a wealth of in-depth knowledge and
playing options.
3) For business purposes - There are several
commercial uses for multimedia. Multimedia and
communication technologies have made it possible to
share information among international work groups.
team members can work remotely and for a variety of
businesses. A global workplace will result from this.
The following facilities should be supported by the
multimedia network:
o Office needs
o Records management
o Employee training
o Electronic mail
o Voice mail

4) For marketing purposes - Multimedia marketing


may significantly improve the promotion of new items.
Multimedia has opened doors for both advertising and
promotion staff by making communication more
economical. Flying banner presentations,
video transitions, animations, and audio effects are just
a few of the components utilized to create a multimedia-
based advertisement that appeals to the customer in a
brand-new way and encourages the purchase of the
goods.

5) For banking purposes - Another public setting


where multimedia is being used more and more
recently is banks. People visit banks to open savings
and current accounts, make deposits and withdrawals,
learn about the bank's various financial plans, apply for
loans, and other things. Each bank wants to notify its
consumers with a wealth of information. It can employ
multimedia in a variety of ways to do this. The bank also
has a PC monitor in the clients' rest area that shows
details about its numerous programs. Online and
internet banking have grown in popularity recently.
These heavily rely on multimedia. As a result, banks are
using multimedia to better serve their clients and
inform them of their appealing financing options.

1.3.1 Multimedia in Education


Fig.1.8 Multimedia in Education
Teachers use multimedia to supplement traditional
lectures and practical demonstrations in the classroom.
Multimedia provides the benefit of interactive learning.
Multimedia differs from television in this sense.

Multimedia is utilized in education to create popular


reference books like encyclopedias and guidebooks as
well as computer-based training courses (often
referred to as CBTs). Text, pictures, music, and
animation are all used in CBTs.

1.3.2 Entertainment with Multimedia


Fig.1.9 Entertainment with Multimedia
One of the biggest multimedia industries, the
entertainment business has grown as a result of
technological advancements in the creation of games,
movies, ads, and other forms of entertainment. The
entertainment sector makes extensive use of
multimedia, particularly to create special effects for
films and cartoons. Avatar, Avengers, and The Jungle
book are a few examples.

A common hobby of many people is playing


multimedia games, and the same is true of software
that can be downloaded online or through CD-ROMs.
Multimedia aspects are also used in some video games.
Interactive multimedia refers to multimedia
programs that let users take part actively rather than
merely passively consuming information.

1.3.3 Multimedia in the Business

Fig.1.10 Multimedia in the Business


In business, multimedia is an extremely powerful
presentation and sales tool. Conferences, training,
advertising, promotion, product demos, modeling,
databases, portfolios, text messaging, network
communications, voicemail messages, and
teleconferencing are just a few examples of business
uses for multimedia.

Nowadays, there is a large industry devoted to internet


marketing, hosting websites, and website coding.
Using several technologies, including email, texting,
MMS, teleconferencing, and chat, among others, a
company can grow its customer base.

With the help of multimedia business, even simple


office programs like a word processor or spreadsheet
software become effective tools. To emphasize key
points in the documents, images, animation, and sound
can be added to these applications.

1.3.4 Research Using Multimedia

Fig.1.11Research Using Multimedia


The science of evaluating information that includes
audio, video, text, and other modalities can be referred
to as multimedia analytics. As we've seen,
illustrations from textbooks fail to accurately depict
complex features or multi-step processes.
Many scientific fields, including health, geology,
engineering, and surface analysis, use multimedia
modeling of research processes.

1.3.5 Multimedia in Training

Fig.1.12 Multimedia in Training


To propose fresh ideas or explain cutting-edge
technology, multimedia demonstrations are excellent.
This helps cut down on training and design time for
businesses. People easily comprehend and utilize
multimedia. The largest and most established
application of multimedia technology has been
corporate training. It is particularly appropriate when
complicated, frequently modified, and modified
processes, goods, and services are involved.
1.3.6 Uses of Multimedia in Different Places
Nowadays, multimedia has become a very important
part of every company carrying out their work. The
following are the important applications of multimedia:
• Creative Industries
• Commerce
• Entertainment
• Education
• Educational Technology
• Social Work Purposes
• Communication
• Journalism
• Engineering
• Medicine
• Research
• Interior Designing
Examples of multimedia learning include watching a
PowerPoint presentation, watching a pre-recorded
lecture, reading a physics textbook, video podcasts,
audio slideshows and animated video, presentations,
training, marketing, advertising, product demos,
catalogues, networked communication and voicemail.
1.4 Multimedia Hardware
Most of the computers now-a-days come equipped with
the hardware components required to develop/view
multimedia applications. Following are the various
categories of hardware required for multimedia
applications.
• Processor- The heart of any multimedia
computer is its processor. Today a Core i5 or
higher processor is recommended for a
multimedia computer.
• CPU is considered as the brain of the computer.
• CPU performs all types of data processing
operations.
• It stores data, intermediate result and
instructions (program).
• It controls the operations of all parts of computer.

Fig.1.13 Processor
Memory and Storage Devices - You need memory for
storing the various files used during production:
original audio and video clips, edited pieces and final
mixed pieces. You also need memory for backups of
your project files.

Primary Memory- Primary memory holds only those


data and instructions on which computer is currently
working. It has limited capacity and data gets lost when
power is switched off. It is generally made up of
semiconductor device. These memories are not as fast
as registers. The data and instructions required to be
processed earlier reside in main memory. It is divided
into two subcategories RAM and ROM.

Fig.1.14 Primary Memory


Cache Memory- Cache memory is a very high speed
semiconductor memory, which can speed up CPU. It
acts as a buffer between the CPU and main memory. It
is used to hold those parts of data and program which
are most frequently used by CPU. The parts of data and
programs are transferred from disk to cache memory
by operating system, from where CPU can access them.

Secondary Memory: This type of memory is also


known as external memory or non-volatile. It is slower
than main memory. These are used for storing
Data/Information permanently. CPU directly does not
access these memories; instead they are accessed via
input-output routines. Contents of secondary memories
are first transferred to main memory and then CPU can
access it. For example, disk, CD-ROM, DVD, etc.

Fig.1.15 Secondary Memory


Input Devices - Following are the various types of
input devices which are used in multimedia systems.
Keyboard- Most common and very popular input
device is keyboard. The keyboard helps in inputting
the data to the computer. The layout of the keyboard is
like that of traditional typewriter, although there are
some additional keys provided for performing some
additional functions. Keyboards are of two sizes 84 keys
or 101/102 keys, but now 104 keys or 108 keys
keyboard is also available for Windows and Internet.
The keys are following:

1) Typing Keys - These keys include the letter keys
(A-Z) and digit keys (0-9), which generally follow
the same layout as that of typewriters.
2) Numeric Keypad - It is used to enter numeric data
or for cursor movement. Generally, it consists of a
set of 17 keys that are laid out in the same
configuration used by most adding machines and
calculators.
3) Function Keys - The twelve function keys are
present on the keyboard, arranged in a row along
the top of the keyboard. Each function key has a
unique meaning and is used for some specific
purpose.
4) Control Keys - These keys provide cursor and
screen control. They include the four directional
arrow keys as well as Home, End, Insert, Delete,
Page Up, Page Down, Control (Ctrl), Alternate
(Alt) and Escape (Esc).
5) Special Purpose Keys - The keyboard also contains
some special purpose keys such as Enter, Shift,
Caps Lock, Num Lock, Space bar, Tab, and Print
Screen.

Mouse - Mouse is most popular Pointing device. It is a


very famous cursor-control device. It is a small palm-
sized box with a round ball at its base that senses the
movement of the mouse and sends corresponding
signals to the CPU when its buttons are pressed.
Generally, it has two buttons, called the left and right
buttons, and a scroll wheel in the middle. A mouse can
be used to control the position of the cursor on screen,
but it cannot be used to enter text into the computer.

Fig.1.16 Mouse & Keyboard


Joystick - Joystick is also a pointing device, which is
used to move cursor position on a monitor screen. It is
a stick with a spherical ball at both its lower and upper
ends. The lower spherical ball moves in a socket.
The joystick can be moved in all four directions. The
function of joystick is similar to that of a mouse. It is
mainly used in Computer Aided Designing (CAD) and
playing computer games.

Fig.1.17 Joystick
Light Pen - Light pen is a pointing device, which is
similar to a pen. It is used to select a displayed menu
item or draw pictures on the monitor screen. It consists
of a photocell and an optical system placed in a small
tube. When light pen's tip is moved over the monitor
screen and pen button is pressed, its photocell sensing
element detects the screen location and sends the
corresponding signal to the CPU.
Fig.1.18 Lightpen

Track Ball - Track ball is an input device that is mostly


used in notebook or laptop computer, instead of a
mouse. This is a ball which is half inserted into the
device; by moving fingers on the ball, the pointer can
be moved. Since the whole device is not moved, a track
ball requires less space than a mouse. A track ball
comes in various shapes, such as a ball, a button and a
square.

Fig.1.19 Trackball

Scanner - Scanner is an input device, which works


more like a photocopy machine. It is used when some
information is available on a paper and it is to be
transferred to the hard disc of the computer for further
manipulation. Scanner captures images from the source
which are then converted into the digital form that can
be stored on the disc. These images can be edited
before they are printed.

Fig.1.20 Scanner

Digitizer - Digitizer is an input device, which converts


analog information into a digital form. Digitizer can
convert a signal from the television camera into a series
of numbers that could be stored in a computer. They
can be used by the computer to create a picture of
whatever the camera had been pointed at. Digitizer is
also known as Tablet or Graphics Tablet because it
converts graphics and pictorial data into binary inputs.
A graphics tablet used as a digitizer is suited to fine
drawing and image-manipulation work.
Fig.1.21 Digitizer

Magnetic Ink Card Reader (MICR) - MICR input


device is generally used in banks because of the large
number of cheques to be processed every day. The
bank's code number and cheque number are printed
on the cheques with a special type of ink that contains
particles of magnetic material that are machine
readable. This reading process is called Magnetic Ink
Character Recognition (MICR). The main advantage of
MICR is that it is fast and less error prone.

Fig.1.22 MICR
Optical Character Reader (OCR) - OCR is an input
device used to read a printed text. OCR scans text
optically character by character, converts them into a
machine readable code and stores the text on the
system memory.

Fig.1.23 OCR

Bar Code Readers - Bar Code Reader is a device used


for reading bar coded data (data in form of light and
dark lines). Bar coded data is generally used in
labelling goods, numbering books, etc. It may be a
hand-held scanner or may be embedded in a stationary
scanner. A Bar Code Reader scans a bar code image and
converts it into an alphanumeric value, which is then
fed to the computer to which the bar code reader is
connected.
Fig.1.24 Barcode Reader

Optical Mark Reader (OMR) - OMR is a special type


of optical scanner used to recognize the type of mark
made by pen or pencil. It is used where one out of a few
alternatives is to be selected and marked. It is specially
used for checking the answer sheets of examinations
having multiple choice questions.

Fig.1.25 OMR

Voice Systems - Following are the audio input and
output devices used in multimedia systems.
Microphone- Microphone is an input device to input
sound that is then stored in digital form. The
microphone is used for various applications like adding
sound to a multimedia presentation or for mixing music.

Fig.1.26 Microphone

Speaker- Speaker is an output device to produce sound


which is stored in digital form. The speaker is used for
various applications like playing the sound of a
multimedia presentation or a movie.

Fig.1.27 Speaker
Digital Camera - Digital camera is an input device to
input images that is then stored in digital form. The
digital camera is used for various applications like
adding images to a multimedia presentation or for
personal purposes.

Fig.1.28 Digital Camera

Digital Video Camera - Digital Video camera is an


input device to input images/video that is then stored
in digital form. The digital video camera is used for
various applications like adding videos to a multimedia
presentation or for personal purposes.
Fig.1.29 Digital Video camera

Output Devices - Following are few of the important


output devices, which are used in Computer Systems:

Monitors - Monitor commonly called as Visual Display


Unit (VDU) is the main output device of a computer. It
forms images from tiny dots, called pixels, that are
arranged in a rectangular form. The sharpness of the
image depends upon the number of the pixels. There
are two kinds of viewing screen used for monitors:
Cathode-Ray Tube (CRT) Monitor- In a CRT, the
display is made up of small picture elements, called
pixels for short. The smaller the pixels, the better the
image clarity or resolution. It takes more than one
illuminated pixel to form a whole character, such as the
letter 'e' in the word help. A finite number of characters
can be displayed on a screen at once. The screen can
be divided into a series of character boxes - fixed
location on the screen where a standard character can
be placed. Most screens are capable of displaying 80
characters of data horizontally and 25 lines vertically.

Fig.1.30 CRT Monitor

Flat-Panel Display Monitor- The flat-panel display


refers to a class of video devices that have reduced
volume, weight and power requirement compared to
the CRT. You can hang them on walls or wear them on
your wrists. Current uses for flat-panel displays include
calculators, video games, monitors, laptop computer,
graphics display. The flat-panel displays are divided
into two categories:
Emissive Displays- The emissive displays are devices
that convert electrical energy into light. Examples are
plasma panel and LED (Light-Emitting Diodes).
Non-Emissive Displays- The Non-emissive displays
use optical effects to convert sunlight or light from some
other source into graphics patterns. An example is the
LCD (Liquid Crystal Display).

Fig.1.31 LED display

1.4.1 Printers - Printer is the most important output


device, which is used to print information on paper.
Dot Matrix Printer- One of the most popular printers
in the market is the dot matrix printer because of its
ease of printing and economical price. Each character
is printed as a pattern of dots; the print head consists of
a matrix of pins of size 5×7, 7×9, 9×7 or 9×9 that strike
the ribbon to form a character, which is why it is called
a Dot Matrix Printer.
Fig.1.32 Dot Matrix Printer

Daisy Wheel- The print head lies on a wheel, and the
pins corresponding to characters are arranged like the
petals of a daisy (a flower), which is why it is called a
Daisy Wheel Printer. These printers are generally used
for word processing in offices that require a few letters
to be sent here and there with very good print quality.

Fig.1.33 Daisy wheel Printer

Line Printers- Line printers are printers, which print


one line at a time.
Fig.1.34 Line Printer
Laser Printers- These are non-impact page printers.
They use laser lights to produce the dots needed to
form the characters to be printed on a page.

Fig.1.35 Laser Printer


Inkjet Printers- Inkjet printers are non-impact
character printers based on a relatively new
technology. They print characters by spraying small
drops of ink onto paper. Inkjet printers produce high
quality output with presentable features. They make
less noise because no hammering is done and these
have many styles of printing modes available. Colour
printing is also possible. Some models of Inkjet printers
can produce multiple copies of printing also.

Fig.1.36 Inkjet Printer

Screen Image Projector - Screen image projector or


simply projector is an output device used to project
information from a computer on a large screen so that a
group of people can see it simultaneously. A presenter
first makes a PowerPoint presentation on the computer.
Now a screen image projector is plugged to a computer
system and presenter can make a presentation to a
group of people by projecting the information on a
large screen. Projector makes the presentation more
understandable.
Fig.1.37 Screen image projector

Speakers and Sound Card - Computers need both a


sound card and speakers to hear audio, such as music,
speech and sound effects. Most motherboards provide
an on-board sound card. This built-in sound card is fine
for most purposes. The basic function of a sound card is
to convert digital sound signals to analog for the
speakers and to make the sound louder or softer.

Fig.1.38 Speaker and Soundcard

1.6 Multimedia Software


Multimedia software tells the hardware what to do. For
example, multimedia software tells the hardware to
display the color blue, play the sound of cymbals
crashing, etc. To produce these media elements
(movies, sound, text, animation, graphics, etc.), there is
various software available in the market, such as Paint
Brush, Photo Finish, Animator, Photoshop, 3D Studio,
Corel Draw, Sound Blaster, IMAGINET, Apple
HyperCard, Photo Magic and Picture Publisher.
Multimedia Software Categories
Following are the various categories of Multimedia
software

Device Driver Software- This software is used to
install and configure multimedia peripherals.
• Media Players- Media players are applications
that can play one or more kind of multimedia file
format.

Media Conversion Tools- These tools are used for
encoding/decoding multimedia content and for
converting one file format to another.
Multimedia Editing Tools- These tools are used for
creating and editing digital multimedia data.

Multimedia Authoring Tools- These tools are used for
combining different kinds of media formats and
delivering them as multimedia content.

Multimedia Application:
Multimedia applications are created with the help of
following mentioned tools and packages.
Sound, text, graphics, animation and video are integral
parts of multimedia software.
edit these media elements, there are various software
tools available in the market. The categories of basic
software tools are:

Text Editing Tools- These tools are used to create


letters, resumes, invoices, purchase orders, user
manuals and other documents. MS-Word is a good
example of a text tool. It has the following features:
• Creating new file, opening existing file, saving
file and printing it.
• Insert symbol, formula and equation in the file.
• Correct spelling mistakes and grammatical
errors.
• Align text within margins.
• Insert page numbers on the top or bottom of the
page.
• Mail-merge the document and make letters
and envelopes.
• Making tables with variable number of columns
and rows.
Painting and Drawing Tools- These tools generally
come with a graphical user interface with pull down
menus for quick selection. You can create almost all
kinds of possible shapes and resize them using these
tools. Drawing file can be imported or exported in
many image formats like .gif, .tif, .jpg, .bmp, etc. Some
examples of drawing software are Corel Draw,
Freehand, Designer, Photoshop, Fireworks, Paint,
etc. This software has the following features:
• Tools to draw a straight line, rectangular area,
circle etc.
• Different colour selection option.
• Pencil tool to draw a shape freehand.
• Eraser tool to erase part of the image.
• Zooming for magnified pixel editing.

Image Editing Tools- Image editing tools are used to


edit or reshape the existing images and pictures. These
tools can be used to create an image from scratch as
well as images from scanners, digital cameras, clipart
files or original artwork files created with painting and
drawing tools. Examples of Image editing or
processing software are Adobe Photoshop and Paint
Shop Pro.

Sound Editing Tools- These tools are used to integrate


sound into a multimedia project very easily. You can cut,
copy, paste and edit segments of a sound file by using
these tools. The presence of sound greatly enhances
the effect of a mostly graphic presentation, especially
in a video. Examples of sound editing software tools
are: Cool Edit Pro, Sound Forge and Pro Tools. These
software have following features:
• Record your own music, voice or any other
audio.
• Record sound from CD, DVD, Radio or any other
sound player.
• You can edit, mix the sound with any other audio.
• Apply special effects such as fade, equalizer,
echo, reverse and more.

Video Editing Tools- These tools are used to edit, cut,


copy, and paste your video and audio files. Video
editing used to require expensive, specialized
equipment and a great deal of knowledge. The artistic
process of video editing consists of deciding which
elements to retain, delete or combine from various
sources so that they come together in an organized,
logical and visually pleasing manner. Today computers
are powerful enough to handle this job, disk space is
cheap, and storing and distributing your finished work
on DVD is very easy. Examples of video editing
software are Adobe Premiere and Adobe After Effects.
Animation and Modeling Tools- An animation shows
still images at a certain rate to give the visual effect of
motion; animation and modeling tools are used to
create such animations.
These tools have features like multiple windows that
allow you to view your model in each dimension, ability
to drag and drop primitive shapes into a scene, color
and texture mapping, ability to add realistic effects
such as transparency, shadowing and fog etc.
Examples of Animations and modeling tools are 3D
studio max and Maya.

1.7 CD ROM
A CD-ROM (Compact Disc Read-Only Memory) is a
type of optical disc that can store digital data. It is made
up of a plastic disc with a thin reflective layer of
aluminium or gold, protected by a clear coating. The
data is encoded in
the form of tiny pits and lands on the disc. The pits are
represented as binary 1s and the lands are represented
as binary 0s.
A CD-ROM drive uses a laser beam to read the data
from the disc. The laser beam is reflected off the disc
and into a photodetector. The photodetector converts
the reflected light into an electrical signal. The
electrical signal is then decoded into the original
digital data.

CD-ROMs can store a lot of data. A standard CD-ROM


can store up to 700 MB of data. This makes them ideal
for storing large files, such as music, video, and
software.
Here are some of the components of a CD-ROM:
• Optical disc: This is the physical disc that stores
the data. It is a plastic disc with a thin reflective
layer of aluminium or gold, protected by a clear
coating.
• Laser beam: This is used to read the data from
the disc. The laser beam is focused on the disc
and reflects off the pits and lands. The reflected
light is then converted into an electrical signal.
• Photodetector: This converts the reflected light
into an electrical signal. The electrical signal is
then decoded into the original digital data.
• Servo system: This controls the speed of the disc
and the focus of the laser beam.
• Control circuit: This controls the operation of the
CD-ROM drive.

Fig.1.39 CD ROM

A CD-ROM is a round, flat disc with a reflective surface.


It is typically made of plastic and has a diameter of 120
millimeters. The data on a CD-ROM is stored in the form
of tiny pits and lands. The pits are represented as
binary 1s and the lands are represented as binary 0s.
CD-ROMs can store a lot of data. A standard CD-ROM
can store up to 700 MB of data. This makes them ideal
for storing large files, such as music, video, and
software.
CD-ROMs are read by a CD-ROM drive. A CD-ROM
drive uses a laser beam to read the data from the disc.
The laser beam is reflected off the disc and into a
photodetector. The photodetector converts the
reflected light into an electrical signal. The electrical
signal is then decoded into the original digital data.
CD-ROMs were first introduced in 1982. They quickly
became popular as a way to distribute music, video,
and software. CD-ROMs are still used today, but they
have been largely replaced by newer storage formats,
such as DVDs and Blu-rays.
Characteristics of CD ROM
• Read-only: CD-ROMs are read-only, meaning
that the data on them cannot be changed or
erased. This is because the data is stored in the
form of tiny pits and lands on the disc, and it is
not possible to change these pits and lands once
they have been created.
• High capacity: CD-ROMs can store a lot of data.
A standard CD-ROM can store up to 700 MB of
data, which is equivalent to about 250,000 pages
of text or 20,000 medium-resolution images.
• Durable: CD-ROMs are relatively durable. They
can withstand scratches and minor physical
damage without losing their data.
• Widely compatible: CD-ROMs are widely
compatible with a variety of devices, including
computers, CD players, and DVD players.
• Size: A standard CD-ROM is 120 mm in diameter
and 1.2 mm thick.
• Speed: CD-ROM drives can read data at speeds
of up to 52x, meaning 52 times the base transfer
rate of an audio CD, which is about 150 KB per
second (a worked example follows this list).
• Format: CD-ROMs use the ISO 9660 format to
store data. This format is a standard format for
storing data on CD-ROMs, and it is supported by
most computers and CD players.
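As a rough illustration of the speed rating above, the short sketch below (an illustrative example, not part of the original text) converts a drive's "x" rating into an approximate transfer rate, assuming the commonly quoted 1x base rate of 150 KB per second.

# Approximate CD-ROM transfer rate from its "x" speed rating.
# Assumes the commonly quoted 1x base rate of 150 KB/s.

CD_BASE_RATE_KBPS = 150  # 1x CD-ROM transfer rate in KB/s

def cd_transfer_rate(x_rating: int) -> float:
    """Return the approximate transfer rate in MB/s for a given speed rating."""
    return (x_rating * CD_BASE_RATE_KBPS) / 1024  # convert KB/s to MB/s

if __name__ == "__main__":
    for rating in (1, 24, 52):
        print(f"{rating}x drive = about {cd_transfer_rate(rating):.1f} MB/s")
    # A 52x drive therefore reads roughly 7.6 MB/s at full speed, so a
    # complete 700 MB disc takes at least 700 / 7.6, or about 92 seconds, to read.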
Advantages:
• High capacity: CD-ROMs can store a lot of data.
A standard CD-ROM can store up to 700 MB of
data, which is equivalent to about 250,000 pages
of text or 20,000 medium-resolution images.
• Durable: CD-ROMs are relatively durable. They
can withstand scratches and minor physical
damage without losing their data.
• Widely compatible: CD-ROMs are widely
compatible with a variety of devices, including
computers, CD players, and DVD players.
• Low cost: CD-ROMs are relatively inexpensive to
produce. This makes them a cost-effective way to
distribute data.

Disadvantages:
• Read-only: CD-ROMs are read-only, meaning
that the data on them cannot be changed or
erased. This can be a disadvantage if you need
to be able to update the data on a CD-ROM.
• Slow: CD-ROM drives can be slow, especially
when compared to newer storage formats, such
as DVDs and Blu-rays.
• Susceptible to damage: CD-ROMs can be
damaged by scratches, fingerprints, and
exposure to heat or chemicals. This can make
them a less reliable storage medium than other
formats.
1.8 DVD
DVD stands for Digital Versatile Disc. It is a digital
optical disc storage format. It was invented and
developed in 1995 and first released on November
1, 1996, in Japan. The medium can store any kind of
digital data and has been widely used for video
programs (watched using DVD players) or formerly
for storing software and other computer files as well.
DVDs offer significantly higher storage capacity
than compact discs (CD) while having the same
dimensions. A standard DVD can store up to 4.7 GB
of data, while variants can store up to a maximum
of 17.08 GB.

Fig.1.40 DVD

There are 4 types of DVDs:


• DVD-ROM: Read-only discs that can only be
played on DVD players.
• DVD-R: Recordable discs that can be written to
once.
• DVD-RW: Rewritable discs that can be written to
and erased multiple times.
• DVD+R: Recordable discs that are similar to
DVD-R but have some compatibility advantages.
DVDs are read by a DVD drive. A DVD drive uses a
laser beam to read the data from the disc. The laser
beam is reflected off the disc and into a photodetector.
The photodetector converts the reflected light into an
electrical signal. The electrical signal is then decoded
into the original digital data.
DVDs are still widely used today, but they have been
largely replaced by newer storage formats, such as
Blu-rays. However, DVDs are still a good option for
storing large amounts of data that does not need to be
changed or updated frequently.
Characteristics of DVDs:
• High capacity: DVDs can store a lot of data. A
standard DVD can store up to 4.7 GB of data,
which is equivalent to about 133 minutes of high-
definition video or 21 hours of standard-
definition video.
• Durable: DVDs are relatively durable. They can
withstand scratches and minor physical damage
without losing their data.
• Widely compatible: DVDs are widely
compatible with a variety of devices, including
computers, DVD players, and Blu-ray players.
• Fast: DVD drives can read data at speeds of up to
16x, which means that they can read data at a
rate of 16 times the original speed of a DVD
drive.
• Interactive: DVDs can be used to store
interactive content, such as games, quizzes, and
menus.
• Size: A standard DVD is 120 mm in diameter and
1.2 mm thick.
• Format: DVDs use the DVD Forum's DVD-Video
format to store video data. This format is a
standard format for storing video on DVDs, and
it is supported by most DVD players.

Types: There are four main types of DVDs: DVD-ROM,


DVD-R, DVD-RW, and DVD+R.
• Encoding: DVDs are encoded using a variety of
codecs, including MPEG-2, H.264, and VC-1.
Advantages:
• High capacity: DVDs can store a lot of data. A
standard DVD can store up to 4.7 GB of data,
which is equivalent to about 133 minutes of high-
definition video or 21 hours of standard-
definition video.
• Durable: DVDs are relatively durable. They can
withstand scratches and minor physical damage
without losing their data.
• Widely compatible: DVDs are widely
compatible with a variety of devices, including
computers, DVD players, and Blu-ray players.
• Fast: DVD drives can read data at speeds of up to
16x, which means that they can read data at a
rate of 16 times the original speed of a DVD
drive.
• Interactive: DVDs can be used to store
interactive content, such as games, quizzes, and
menus.
Disadvantages:
• Not as high-quality as Blu-rays: DVDs have a
lower resolution than Blu-rays, which means that
they do not support as high-quality video or
audio.
• Being replaced by Blu-rays: DVDs are being
replaced by Blu-rays, which have a higher
capacity and higher quality.
• Not supported by all devices: some newer
devices no longer include DVD drives, so DVDs
cannot be played everywhere.
UNIT-II
2.1 Multimedia Audio
Multimedia audio is any audio that is used in
conjunction with other media, such as video, images, or
text. It can be used for a variety of purposes, such as to
provide background music, to narrate a video, or to
create an immersive experience in a video game.
Multimedia audio can be recorded in a variety of
formats, including WAV, MP3, AAC, and FLAC. The
format that is chosen will depend on the specific needs
of the project. For example, if the audio needs to be of
high quality, WAV or FLAC may be a good choice. If the
audio needs to be compressed to save space, MP3 or
AAC may be a better option.
Multimedia audio can be used in a variety of
applications, including:
• Video: Audio is an essential part of most videos.
It can be used to provide background music, to
narrate the video, or to create sound effects.
• Video games: Audio is also an important part of
many video games. It can be used to create an
immersive experience for the player, to provide
feedback on the player's actions, or to convey
the story of the game.
• Presentations: Audio can be used to enhance
presentations by making them more engaging
and informative. For example, music can be used
to set the tone for a presentation, while sound
effects can be used to highlight important points.
• Websites and e-learning courses: Audio can also
be used to enhance websites and e-learning
courses. For example, audio can be used to
provide instructions, to explain complex
concepts, or to create a more interactive
experience for the user.
Here are some examples of multimedia audio:
• The background music in a movie
• The voiceover in a documentary
• The sound effects in a video game
• The music and narration in a presentation
• The audio instructions in an e-learning course

2.2 Digital medium


A digital medium is any medium that is created, stored,
or transmitted in digital form. Digital media can be
anything from text and images to audio and video. It can
be accessed and consumed on a variety of devices,
such as computers, smartphones, tablets, and
televisions.
Digital media has had a profound impact on society,
changing the way we communicate, consume
entertainment, and learn. It has also created new
opportunities for businesses and individuals to reach
their audiences.

Here are some examples of digital media:


• Text: emails, documents, websites, social media
posts, etc.
• Images: digital photos, graphics, illustrations,
etc.
• Audio: music, podcasts, audiobooks, etc.
• Video: movies, TV shows, YouTube videos, etc.
Digital media has a number of advantages over
traditional media, such as print and broadcast media.
Digital media is more interactive, allowing users to
engage with content in new ways. It is also more
scalable, meaning that it can be easily distributed to a
large audience. Additionally, digital media is more
affordable to produce and consume.
Here are some of the benefits of digital media:
• Interactivity: Digital media allows users to
interact with content in new ways. For example,
users can comment on blog posts, share videos
on social media, and play video games.
• Scalability: Digital media is easily scalable,
meaning that it can be distributed to a large
audience with minimal effort.
• Affordability: Digital media is more affordable to
produce and consume than traditional media.
For example, it is relatively inexpensive to
create and distribute a website or a YouTube
video.
• Accessibility: Digital media is more accessible
than traditional media. For example, people can
access digital content from anywhere in the
world with an internet connection.
2.3 Digital audio technology
Digital audio technology is the use of digital signals to
represent, record, edit, and reproduce sound. Digital
audio signals are made up of a series of binary
numbers, which can be stored and processed on
computers and other digital devices.
Digital audio technology has a number of advantages
over traditional analog audio technology. Digital audio
is more accurate and less susceptible to noise and
distortion. It is also easier to edit and reproduce. Digital
audio technology has revolutionized the way we
record, produce, and consume music and other audio
content.
Here are some of the key components of digital audio
technology:
• Analog-to-digital converters (ADCs): ADCs
convert analog audio signals, such as those
from a microphone or record player, into
digital signals.
• Digital-to-analog converters (DACs): DACs
convert digital audio signals back into analog
signals, so that they can be played back on
speakers or headphones.
• Digital signal processors (DSPs): DSPs are
used to process digital audio signals, such as
to apply effects, equalize the sound, and
reduce noise.
• Digital audio workstations (DAWs): DAWs
are software applications that allow users to
record, edit, and produce digital audio.
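As a rough illustration of the kind of processing a DSP performs, the sketch below (a simplified example, not any particular product's API) applies a volume change to a block of 16-bit audio samples and clips the result to the valid range.

# Simplified illustration of a DSP-style operation on digital audio:
# apply a gain (volume change) to 16-bit samples and clip to the valid range.

def apply_gain(samples, gain):
    """Scale each 16-bit sample by 'gain', clipping to the int16 range."""
    out = []
    for s in samples:
        v = int(round(s * gain))
        v = max(-32768, min(32767, v))  # prevent overflow (clipping)
        out.append(v)
    return out

# Example: boost a short block of samples by 50%.
block = [0, 1000, -2000, 15000, -30000]
print(apply_gain(block, 1.5))   # [0, 1500, -3000, 22500, -32768]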
Digital audio technology is used in a wide variety of
applications, including:
• Music production: Digital audio technology is
used to record, edit, and produce music in all
genres.
• Broadcasting: Digital audio technology is
used to broadcast AM and FM radio signals,
as well as digital radio signals such as DAB
and HD Radio.
• Film and television: Digital audio technology
is used to record and produce the
soundtracks for movies and television shows.
• Video games: Digital audio technology is
used to create the sound effects and music for
video games.
• Consumer electronics: Digital audio
technology is used in a variety of consumer
electronics devices, such as smartphones,
tablets, and MP3 players.

2.4 Digitization of Sound


Digitization is a process of converting the analog
signals to a digital signal. There are three steps of
digitization of sound.
Fig.2.1 Digitization of sound
2.4.1 Sampling - Sampling is a process of measuring
air pressure amplitude at equally spaced moments in
time, where each measurement constitutes a sample. A
sampling rate is the number of times the analog sound
is taken per second. A higher sampling rate implies that
more samples are taken during the given time interval
and ultimately, the quality of reconstruction is better.
The sampling rate is measured in terms of Hertz, Hz in
short, which is the term for Cycle per second. A
sampling rate of 5000 Hz (or 5 kHz, which is the more
common usage) implies that 5000 samples are taken
every second. The three sampling rates most often used
in multimedia are 44.1 kHz (CD quality), 22.05 kHz and
11.025 kHz.
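To make the idea of sampling concrete, the short Python sketch below (an illustrative example, using an arbitrary 440 Hz test tone) measures the amplitude of a sine wave at equally spaced moments for several common sampling rates; a higher rate simply yields more samples per second.

import math

def sample_sine(frequency_hz, sampling_rate_hz, duration_s):
    """Measure the amplitude of a sine tone at equally spaced moments in time."""
    n_samples = int(sampling_rate_hz * duration_s)
    return [math.sin(2 * math.pi * frequency_hz * t / sampling_rate_hz)
            for t in range(n_samples)]

# One second of a 440 Hz tone at three common multimedia sampling rates.
for rate in (11025, 22050, 44100):
    samples = sample_sine(440, rate, 1.0)
    print(f"{rate} Hz -> {len(samples)} samples per second")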
2.4.2 Quantization - Quantization is a process of
representing the amplitude of each sample as integers
or numbers. How many numbers are used to represent
the value of each sample known as sample size or bit
depth or resolution. Commonly used sample sizes are
either 8 bits or 16 bits. The larger the sample size, the
more accurately the data will describe the recorded
sound. An 8-bit sample size provides 256 equal
measurement units to describe the level and frequency
of the sound in that slice of time. A 16-bit sample size
provides 65,536 equal units to describe the sound in
that sample slice of time. The value of each sample is
rounded off to the nearest integer (quantization) and if
the amplitude is greater than the intervals available,
clipping of the top and bottom of the wave occurs.
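The sketch below (an illustrative example, assuming samples normalized to the range -1.0 to 1.0) rounds each sample to the nearest of the 256 levels available with an 8-bit sample size and shows the small rounding error that quantization introduces.

def quantize(sample, bits=8):
    """Map a sample in the range [-1.0, 1.0] to the nearest integer level."""
    levels = 2 ** bits                      # 256 levels for 8 bits, 65536 for 16 bits
    max_level = levels // 2 - 1             # 127 for 8 bits
    min_level = -levels // 2                # -128 for 8 bits
    value = int(round(sample * max_level))
    return max(min_level, min(max_level, value))  # clip values outside the range

sample = 0.3037
q8 = quantize(sample, bits=8)
print(q8)                                   # 39
print(q8 / 127 - sample)                    # quantization error for this sample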

2.4.3 Encoding - Encoding converts the integer base-


10 number to a base-2 that is a binary number. The
output is a binary expression in which each bit is either
a 1(pulse) or a 0(no pulse).
Fig.2.2 Encoding
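A minimal sketch of the encoding step, continuing the example above: each quantized integer is written out as a fixed-width binary number, here 8 bits per sample.

def encode_8bit(quantized_samples):
    """Encode signed 8-bit sample values as 8-character binary strings."""
    # '& 0xFF' gives the two's-complement bit pattern for negative values.
    return [format(value & 0xFF, '08b') for value in quantized_samples]

print(encode_8bit([0, 39, -39, 127, -128]))
# ['00000000', '00100111', '11011001', '01111111', '10000000']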

2.4.4 Quantization of Audio


Quantization is a process to assign a discrete value from
a range of possible values to each sample. Number of
samples or ranges of values are dependent on the
number of bits used to represent each sample.
Quantization results in stepped waveform resembling
the source signal.
• Quantization Error/Noise - The difference
between sample and the value assigned to it is
known as quantization error or noise.
• Signal to Noise Ratio (SNR) - The signal to noise
ratio compares signal quality to quantization error.
The higher the signal to noise ratio, the better the
voice quality. Working with very small levels
often introduces more error. So instead of
uniform quantization, non-uniform quantization
is used as companding. Companding is a
process of distorting the analog signal in
controlled way by compressing large values at
the source and then expanding at receiving end
before quantization takes place.

Fig.2.3 Quantization of Audio
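As a rule of thumb (a standard approximation for a full-scale sine wave, not stated in the text), each extra bit of sample size adds about 6 dB of signal to noise ratio. The sketch below computes this and also shows a simple mu-law curve of the kind used for companding; the choice of mu = 255 is the common telephony value and is an assumption here.

import math

def sqnr_db(bits):
    """Approximate signal-to-quantization-noise ratio for a full-scale sine wave."""
    return 6.02 * bits + 1.76

def mu_law_compress(x, mu=255):
    """mu-law companding curve: boosts small values relative to large ones (x in [-1, 1])."""
    return math.copysign(math.log(1 + mu * abs(x)) / math.log(1 + mu), x)

print(f"8-bit audio  ~ {sqnr_db(8):.1f} dB")            # about 49.9 dB
print(f"16-bit audio ~ {sqnr_db(16):.1f} dB")           # about 98.1 dB
print(f"mu-law(0.01) = {mu_law_compress(0.01):.3f}")    # small input mapped to a larger value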

2.4.5 Transmission of Audio


In order to send the sampled digital sound over the
wire, that is, to transmit the digital audio, it must first
be recovered as an analog signal. This process is called
demodulation.
• PCM Demodulation - The PCM demodulator reads
each sampled value, then applies analog filters to
suppress energy outside the expected frequency
range and outputs an analog signal that can be used
to transmit the audio over the network.

Fig. 2.4 Transmission of audio

2.5 Sound card


A sound card is an expansion card that converts digital
audio signals into analog signals that can be played
back through speakers or headphones. It also allows
users to record analog audio signals, such as those from
a microphone, and convert them into digital signals that
can be stored or edited on a computer.

Sound cards are an essential component of any


computer that is used to play or record audio. They are
also used in many other devices, such as smartphones,
tablets, and TVs.
Here are some of the benefits of using a sound card:
• Improved audio quality: Sound cards can
provide better audio quality than the built-in
audio on most computers. This is because they
use dedicated hardware to process digital audio
signals.
• More features: Sound cards often come with a
variety of features that are not available on built-
in audio, such as surround sound support, noise
reduction, and equalization.
• More flexibility: Sound cards can be upgraded
or replaced as needed, giving users more
flexibility in terms of their audio needs.
Sound cards are available in both internal and external
form factors. Internal sound cards are installed inside
the computer case, while external sound cards are
connected to the computer via a USB or Thunderbolt
port.
Internal sound cards are typically more affordable than
external sound cards, but they can be more difficult to
install. External sound cards are easier to install and
often offer better audio quality, but they can be more
expensive.
Here are some examples of popular sound cards:
• Creative Sound Blaster Z: This is a popular
internal sound card that offers good audio
quality and a variety of features, such as
surround sound support and noise reduction.
• AudioQuest DragonFly Cobalt: This is a popular
external sound card that is known for its
excellent audio quality.
• Focusrite Scarlett 2i2: This is a popular external
sound card that is ideal for recording and
producing music.

2.5.1.Sound Card Description


A sound card is a rectangular piece of hardware with
ports on the side for connecting audio devices, such as
speakers, and multiple contacts on the bottom of the
card. Because the motherboard, peripheral cards and
case are designed with compatibility in mind, an
installed sound card's ports line up with the back of the
case, making them easily available for use. A sound
card also lets you plug microphones, headphones and
other audio devices into your computer. USB sound
cards are also available, which plug directly into a USB
port with the help of a small adapter.
Originally, computers could only produce beeps using
a narrow range of frequencies; these beeps were
mainly used as warning alarms.
Over time, the growth of multimedia increased the
need for high-quality sound for both professional and
entertainment purposes. The AdLib sound card was
created to fill this need; it offered a percussion mode
and a 9-voice mode that made programmable audio
possible.
For computers, most of the motherboard manufacturers
provide built-in sound cards. However, advanced
users, instead of generic, built-in cards, commonly use
expansion cards selected to meet their particular
requirements.
2.5.2 History of the sound card
The Gooch Synthetic Woodwind, capable of 4-voice
music synthesis, is considered the first sound card.
Invented by Sherwin Gooch in 1972, it was used in
PLATO terminals.
AdLib was one of the first companies to manufacture
sound cards for IBM PC-compatible computers. In
1987, AdLib developed its Music Synthesizer Card
based on the Yamaha YM3812 sound chip.
Until 1988, however, sound cards were very uncommon
for the IBM PC. For the majority of IBM PC users, the
internal PC speaker was the only way to produce sound
and music. Consequently, the sound was described as
"beeps and boops", which led to the common nickname
"beeper". All other processing had to stop while sounds
were played.
At the 1988 Consumer Electronics Show, a panel of
computer-game CEOs stated that the PC would be
unable to become the leading home computer with only
such limited sound, and that a $49-79 sound card with
better capability than current products was required. A
1989 Computer Gaming World survey found that 18 of
25 game companies planned to support AdLib, six
planned to support Roland and Covox, and seven
planned to support the Creative Music System/Game
Blaster.

2.5.3 Types of Sound Cards


The sound card is an expansion component in the
computer that enables you to hear sound from video
files, MP3 files and other sources. In the late 1980s and
early 1990s, sound cards first started to enter the
mainstream. In modern times, almost all computers
come with one. Sound cards come in three main types,
each with its own advantages.
Motherboard Sound Chips
When sound cards were first introduced, they were
costly add-on cards, costing hundreds of dollars. As
computer sound technology became cheaper,
miniaturization allowed hardware manufacturers to put
sound circuitry into a single chip. In modern times, it is
rare to find a computer without a motherboard sound
chip, even among systems that also contain a separate
sound card. Motherboard sound chips have made
sound affordable for all computer owners. You can
check your system's specifications to see whether it has
a motherboard sound chip.

Standard Sound Cards


A standard sound card connects to one of the
expansion slots inside the computer. Using a sound
card rather than a motherboard sound chip offers a
benefit because the card contains its own processing
chips, whereas a motherboard sound chip relies on the
computer's main processor to produce sound. When
playing games, a standard sound card therefore offers
better performance because it places less load on the
main processor.

External Sound Adapters


An external sound adapter has all the same features as
a standard sound card. It is a small box that connects to
the computer through a USB or FireWire port instead of
an internal expansion slot. It sometimes includes
features not found on a standard sound card, such as
physical volume control knobs and extra inputs and
outputs. Compared with a standard sound card, an
external sound adapter is much easier to move to a new
computer, and for a laptop without expansion slots it is
the only way to upgrade the sound.

2.5.4 Uses of a sound card


The primary use of a sound card is to provide sound that
you hear from playing music with varying formats and
degrees of control. The source of the sound may be in
the form of streamed audio, a file, CD or DVD, etc.
There are many applications of a computer where a
sound card can be used, which areas are as follows:
o Games.
o Voice recognition.
o Watch movies.
o Creating and playing MIDI.
o Educational software.
o Audio and video conferencing.
o Business presentations.
o Record dictations.
o Audio CDs and listening to music.

2.5.5 Sound card connections


Sound card audio ports, or audio jacks, are the
connectors found on the back of your computer. The
common connections are:
o Digital out (white or yellow; labelled "Digital" or
"Digital Out") - used with surround sound or
loudspeakers.
o Sound in or line in (blue; arrow pointing into
waves) - connection for external audio sources,
such as a tape recorder, record player, or CD
player.
o Mic or microphone (pink) - connection for a
microphone.
o Sound out or line out (green; arrow pointing out
of waves) - the primary sound connection for
your speakers or headphones. This sound card
also has second (black) and third (orange)
sound-out connectors.
o FireWire (not pictured) - used by some high-
quality sound cards for digital video cameras and
other devices.
o MIDI or joystick (15-pin yellow connector) - used
with older sound cards to connect a MIDI
keyboard or joystick.

2.5.6 Sound Cards and Audio Quality


Instead of having a separate sound expansion card,
many modern computers have the same technology
integrated directly onto the motherboard. This is
known as on-board sound. This configuration results in
a slightly less powerful audio system but allows for a
less expensive computer, and it is appropriate for
almost all computer users; dedicated sound cards are
usually only necessary for serious audio professionals.
Most desktop computers are set up so that the front-
facing headphone jacks and ports share a common
ground wire, so if you also have USB devices plugged
in, you may hear static in your headphones.
A Computer Has No Sound
The sound card and speakers may no longer be
communicating with each other, for example because a
cable has come loose from its port or the speakers have
lost power. More often, a software issue is preventing
the sound from playing. First, make sure the volume is
turned up for the song, movie or video you are going to
play.
A missing or corrupt device driver can be another
reason the sound is not delivered. Reinstalling the
sound card driver, for example with a free driver
updater tool, is the best way to overcome this problem.

2.6 Sound card recording and editing


To record and edit audio using a sound card, you will
need the following:
• A computer with a sound card installed
• Recording software
• A microphone or other audio input device
Once you have all of the necessary equipment, you can
follow these steps to record and edit audio:
1. Connect the microphone or other audio input
device to the sound card.
2. Open the recording software and select the
sound card as the input device.
3. Click the record button to start recording.
4. Once you have finished recording, click the stop
button.
5. Play back the recording to make sure that it is
what you want.
6. If you are satisfied with the recording, you can
save it to your computer.
To edit the recording, you can use the recording
software to trim it, add effects, or mix it with other audio
sources.
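As a minimal illustration of the recording steps above, the sketch below records a few seconds from the sound card's default input and saves it as a WAV file. It assumes the third-party sounddevice package (and NumPy, which it depends on) is installed; the file name and duration are arbitrary choices for the example.

import wave
import sounddevice as sd  # third-party package; assumed installed (pip install sounddevice)

SAMPLE_RATE = 44100   # CD-quality sampling rate in Hz
DURATION_S = 5        # length of the recording in seconds

# Steps 2-4: select the sound card's default input and record.
recording = sd.rec(int(DURATION_S * SAMPLE_RATE), samplerate=SAMPLE_RATE,
                   channels=1, dtype="int16")
sd.wait()  # block until the recording is finished

# Step 6: save the recording as a 16-bit mono WAV file.
with wave.open("recording.wav", "wb") as wav_file:
    wav_file.setnchannels(1)
    wav_file.setsampwidth(2)          # 2 bytes = 16-bit samples
    wav_file.setframerate(SAMPLE_RATE)
    wav_file.writeframes(recording.tobytes())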
Here are some tips for recording and editing audio
using a sound card:
• Use a high-quality microphone or other audio
input device to get the best possible sound
quality.
• Record in a quiet environment to reduce
background noise.
• Use the recording software to adjust the input
level so that the recording is not too loud or too
soft.
• Experiment with different effects and mixing
techniques to achieve the desired sound.
• Save the recording in a high-quality audio
format, such as WAV or FLAC, to preserve the
sound quality.
There are a variety of recording and editing software
programs available, both free and paid. Some popular
options include:
• Audacity (free)
• Reaper (free and paid versions)
• Adobe Audition (paid)
• Pro Tools (paid)
Once you have recorded and edited your audio, you
can share it with others by uploading it to a file sharing
service or publishing it to a website. You can also use it
to create videos, podcasts, or other multimedia content.
2.7 MP3
MP3, or MPEG-1 Audio Layer III, is a digital audio
format that uses a lossy compression algorithm to
reduce the size of audio files. MP3 files are typically
much smaller than uncompressed audio files, but they
still retain a high level of quality.
MP3 is the most popular audio format in the world, and
it is supported by a wide range of devices, including
computers, smartphones, tablets, and MP3 players.
MP3 files are also commonly used for streaming audio
over the internet.

Here are some of the benefits of using MP3:


• Small file size: MP3 files are much smaller than
uncompressed audio files, making them easier
to store and transfer.
• Wide compatibility: MP3 is supported by a wide
range of devices, making it a versatile format.
• High quality: MP3 files can still retain a high level
of quality, even at low bit rates.
MP3 files are created using a lossy compression
algorithm. This means that some of the original audio
data is lost during the compression process. However,
the loss is typically imperceptible to the human ear,
even at low bit rates.

MP3 files are typically encoded at a bit rate of 128 kbps,
192 kbps, or 256 kbps. Higher bit rates result in higher
quality audio, but they also produce larger file sizes.
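As a rough worked example, a 4-minute (240-second) song encoded at 192 kbps takes about 192,000 bits x 240 s = 46,080,000 bits, or roughly 5.8 MB, whereas the same song stored as uncompressed CD-quality audio (44,100 samples per second x 2 bytes x 2 channels) would take about 42 MB.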

MP3 files can be played on a variety of devices, including
computers, smartphones, tablets, and MP3 players, and they
are also commonly used for streaming audio over the
internet.

To play an MP3 file, you will need a device that supports
MP3 playback. Most computers, smartphones, tablets, and
MP3 players support MP3 playback.

If you are using a computer to play an MP3 file, you can
use a media player such as Windows Media Player, iTunes,
or VLC. If you are using a smartphone or tablet, you can
use a music player app such as Spotify, Apple Music, or
YouTube Music.

To stream an MP3 file over the internet, you can use a
streaming service such as Spotify, Apple Music, or YouTube
Music.
Here are some tips for playing MP3 files:
• Use a high-quality pair of headphones or
speakers to get the best possible sound.
• Adjust the volume to a comfortable level.
• If you are streaming an MP3 file, make sure that
you have a strong internet connection.
MP3 is a versatile and popular audio format that is
supported by a wide range of devices. MP3 files are a
great way to store and play music, podcasts, and other
audio content.

2.8 MIDI
MIDI, or Musical Instrument Digital Interface, is a
standard protocol for connecting and communicating
between electronic musical instruments and
computers. It allows devices to send and receive
messages about notes, tempo, and other musical
parameters. MIDI is a digital format, which means that
the information is transmitted as a series of bits. This
makes MIDI very efficient and reliable, and it is also
very versatile. MIDI can be used to control a wide range
of devices, including synthesizers, samplers, drum
machines, and lighting systems.
Here is a basic diagram of a MIDI system:

Fig. 2.6 MIDI

The diagram shows the following components:


• MIDI controller: This is a device that generates
MIDI messages. It can be a keyboard, a guitar
controller, or any other device that has MIDI
output.
• MIDI interface: This is a device that connects the
MIDI controller to the computer.
• Computer: The computer is used to record, edit,
and generate MIDI data.
• MIDI sound module: This is a device that
converts MIDI messages into audio signals. It can
be a synthesizer, a sampler, or any other device
that has MIDI input.

MIDI messages are made up of three parts:


• Status byte: This byte tells the receiving device
what type of message it is.
• Data byte 1: This byte contains additional
information about the message, such as the note
number or the velocity.
• Data byte 2: This byte contains additional
information about the message, if necessary.

There are many different types of MIDI messages, but
some of the most common include:
• Note on: This message tells the receiving device
to play a note.
• Note off: This message tells the receiving device
to stop playing a note.
• Control change: This message tells the receiving
device to change a parameter, such as the
volume or the filter cutoff.
• Program change: This message tells the
receiving device to change to a new program,
such as a new synthesizer patch.
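As an illustration of the message structure just described, the short Python sketch below uses the third-party mido library (an assumption; any MIDI library would do) to build a note-on and a note-off message and show their raw status and data bytes.

# Building MIDI messages with the mido library (assumed installed: pip install mido)
import mido

# Note on: middle C (note 60) at velocity 64 on channel 0
note_on = mido.Message('note_on', channel=0, note=60, velocity=64)
# Note off for the same note
note_off = mido.Message('note_off', channel=0, note=60, velocity=0)

# bytes() shows the status byte followed by the two data bytes
print(note_on.bytes())    # [144, 60, 64] -> status 0x90 (note on, channel 0), note, velocity
print(note_off.bytes())   # [128, 60, 0]  -> status 0x80 (note off, channel 0), note, velocity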
MIDI is a powerful tool for creating music. It allows
musicians to control a wide range of devices with a
single controller, and it makes it easy to record and edit
music. MIDI is also used in many other applications,
such as video games and film scoring.

Here are some of the benefits of using MIDI:


• Versatility: MIDI can be used to control a wide
range of devices, from synthesizers to lighting
systems.
• Efficiency: MIDI is a very efficient way to transmit
musical information.
• Reliability: MIDI is a very reliable format, and it
is not susceptible to noise or interference.
• Flexibility: MIDI makes it easy to record and edit
music.
• Portability: MIDI files are portable and can be
easily shared between different devices and
software programs.

2.8.1 Working with MIDI


To work with MIDI, you will need the following:
• A MIDI controller
• A MIDI interface
• A computer
• A MIDI sequencer

A MIDI controller is a device that generates MIDI
messages. It can be a keyboard, a guitar controller, or
any other device that has MIDI output.
A MIDI interface is a device that connects the MIDI
controller to the computer.
A MIDI sequencer is a software program that allows you
to record, edit, and generate MIDI data.

Once you have all of the necessary equipment, you can
follow these steps to work with MIDI:
1. Connect the MIDI controller to the MIDI
interface.
2. Connect the MIDI interface to the computer.
3. Open the MIDI sequencer.
4. Create a new track and select the MIDI controller
as the input device.
5. Record your MIDI performance.
6. Edit your MIDI performance, if desired.
7. Play back your MIDI performance.

You can also use MIDI to control other devices, such as
synthesizers, samplers, and drum machines. To do this,
you will need to connect the MIDI sequencer to the
other devices using MIDI cables.

Here are some tips for working with MIDI:


• Use a high-quality MIDI controller to get the best
possible performance.
• Experiment with different MIDI settings to
achieve the desired sound.
• Use the MIDI sequencer to record and edit your
performance.
• Use MIDI to control other devices to create more
complex and interesting music.

MIDI is a powerful tool for creating music. It can be
used to control a wide range of devices, and it makes it
easy to record and edit music. MIDI is also used in many
other applications, such as video games and film scoring.

Here are some examples of how MIDI can be used in
music production:
• Recording a keyboard performance: You can
use a MIDI controller to record a keyboard
performance into a MIDI sequencer. The MIDI
sequencer will record the note numbers,
velocities, and other MIDI data. You can then edit
the performance in the MIDI sequencer and play
it back using a synthesizer or other MIDI device.
• Creating a drum beat: You can use a MIDI
sequencer to create a drum beat by
programming the note numbers and velocities
for each drum sound. You can also use a drum
machine to generate MIDI messages, which you
can then record into a MIDI sequencer.
• Controlling a synthesizer: You can use a MIDI
sequencer to control a synthesizer by sending
MIDI messages to change the patch, volume,
filter cutoff, and other parameters. This allows
you to create complex and evolving sounds.
• Synchronizing multiple devices: You can use
MIDI to synchronize multiple devices, such as a
sequencer, a synthesizer, and a drum machine.
This allows you to create music that is in perfect
time.
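As a minimal sketch of the drum-beat example above, the following Python code (again assuming the third-party mido library) writes a one-bar, four-beat bass-drum pattern to a standard MIDI file; note number 36 (bass drum) and channel 10 (drums, numbered 9 in code) follow the General MIDI convention, and the file name is a placeholder.

# Programming a simple drum beat as a MIDI file (assumes mido is installed)
from mido import Message, MidiFile, MidiTrack

mid = MidiFile()                 # default: 480 ticks per quarter note
track = MidiTrack()
mid.tracks.append(track)

for beat in range(4):            # four quarter-note bass-drum hits
    track.append(Message('note_on', channel=9, note=36, velocity=100, time=0))
    track.append(Message('note_off', channel=9, note=36, velocity=0, time=480))

mid.save('drum_beat.mid')        # the file can be played back by any MIDI sound module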

2.9 Audio file formats


Audio file formats are specific file formats used to store
and encode audio data. These formats differ in terms of
quality, compression, compatibility, and other factors.
Here are some common audio file formats:
WAV (Waveform Audio File Format):
• Description: WAV is an uncompressed audio
format known for its high-quality audio, often
used in professional audio applications.
• Pros: Lossless, high-quality audio; suitable for
editing and professional use.
• Cons: Large file size.
MP3 (MPEG-1 Audio Layer III):
• Description: MP3 is a widely used audio format
known for its high compression while
maintaining reasonable audio quality.
• Pros: Small file size, widely supported, good
audio quality at higher bitrates.
• Cons: Lossy compression (some quality loss at
lower bitrates).
AAC (Advanced Audio Coding):
• Description: AAC is a lossy audio format often
used for high-quality audio compression,
popular with Apple devices and iTunes.
• Pros: Good audio quality, efficient compression,
support for various channels.
• Cons: Lossy compression (similar to MP3).
FLAC (Free Lossless Audio Codec):
• Description: FLAC is a lossless audio format
known for its high-quality compression without
any loss of audio fidelity.
• Pros: Lossless quality, smaller file size compared
to WAV.
• Cons: Less compatibility with some devices and
software.
OGG (Ogg Vorbis):
• Description: OGG is a free and open-source
audio format with lossy compression, often used
for streaming and gaming.
• Pros: Open standard, good audio quality,
smaller file sizes.
• Cons: Less support in some devices and software
compared to MP3.
AIFF (Audio Interchange File Format):
• Description: AIFF is an uncompressed audio
format commonly used on Apple devices and for
professional audio production.
• Pros: Lossless quality, widely supported on
Apple platforms.
• Cons: Larger file sizes compared to compressed
formats.
MIDI (Musical Instrument Digital Interface):
• Description: MIDI is not an audio format but a
protocol that represents musical notes and
commands for synthesizers and MIDI-
compatible devices.
• Pros: Extremely small file sizes, can be edited
and customized.
• Cons: Doesn't store audio but instructions for
playback.
WMA (Windows Media Audio):
• Description: WMA is a proprietary audio format
developed by Microsoft, known for its variable
bitrates and good compression.
• Pros: Good compression, reasonable audio
quality.
• Cons: Proprietary format with limited
compatibility outside Windows environments.
APE (Monkey's Audio):
• Description: APE is a lossless audio format
known for high compression ratios and
maintaining audio quality.
• Pros: Excellent compression, lossless quality.
• Cons: Less common and supported compared to
FLAC or WAV.

AMR (Adaptive Multi-Rate):


• Description: AMR is a format commonly used for
speech encoding and mobile voice recordings.
• Pros: Excellent compression for voice
recordings.
• Cons: Limited use for music or high-quality
audio.
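Because these formats are different containers for the same underlying audio, conversion between them is a common task. The sketch below is one possible approach using the third-party pydub library (which in turn relies on FFmpeg being installed); the file names and bitrate are illustrative assumptions.

# Converting between audio file formats with pydub (assumes pydub + FFmpeg are available)
from pydub import AudioSegment

audio = AudioSegment.from_file("master.wav")              # load a lossless WAV master

audio.export("master.flac", format="flac")                # lossless, smaller than WAV
audio.export("master.mp3", format="mp3", bitrate="192k")  # lossy, much smaller again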

2.10 Adding sound to a multimedia project


Adding sound to a multimedia project is a crucial step
to enhance the overall user experience. Whether you
are working on a video presentation, a website, a
game, or any other multimedia project, here are some
steps and considerations for adding sound effectively:
Plan Your Audio Content:
• Determine the purpose of audio in your project.
Is it for narration, background music, sound
effects, or interactive elements?
• Identify key moments or scenes where audio
should be added to complement the visuals.
Choose the Right Audio Files:
• Select audio files in appropriate formats (e.g.,
MP3, WAV, AAC) that match your project's
needs.
• Ensure that the audio quality is suitable for the
project and that you have the necessary rights or
licenses for the audio content.
Editing and Processing:
• Use audio editing software (e.g., Audacity,
Adobe Audition) to trim, edit, and enhance your
audio files.
• Adjust volume levels, remove background
noise, and apply any necessary effects to
improve audio quality.
Synchronize with Visuals:
• Align audio events with specific moments in your
multimedia project. This synchronization is
crucial for conveying the right message or
mood.
• In video editing software, use a timeline to
precisely synchronize audio with visual
elements.
Narration and Voiceovers:
• If your project includes narration or voiceovers,
ensure that the voice talent delivers the script
clearly and professionally.
• Match the pace and tone of the narration to the
project's overall style and message.
Background Music:
• Choose background music that enhances the
mood or atmosphere of your project.
• Ensure that the music volume is balanced with
other audio elements and doesn't overpower
speech or other important sounds.
Sound Effects:
• Incorporate sound effects where needed to add
realism or emphasize actions in your project.
• Use appropriate sound libraries or create your
own sound effects if necessary.

Interactive Elements:
• For interactive multimedia, consider how users
will interact with audio elements.
• Implement interactive audio triggers or
responses based on user actions or choices.
Testing and Feedback:
• Play your multimedia project with the added
audio to ensure everything works as intended.
• Seek feedback from others to evaluate the
impact of the audio on the overall experience.
Optimization and Compression:
• Compress audio files if necessary to reduce
loading times, especially for web-based
multimedia.
• Balance quality and file size to ensure smooth
playback.
Accessibility:
• Provide options for users to control audio
playback, such as volume control or mute
buttons.
• Consider accessibility features, such as
providing captions or transcripts for audio
content.
Final Review and Export:
• Perform a final review of your multimedia project
with audio to catch any issues or inconsistencies.
• Export the project in the desired format,
ensuring that audio settings are optimized for the
target platform or medium.
Documentation:
• Document the audio elements used in your
project, including sources, formats, and
licensing information, for future reference.
Adding sound to a multimedia project can greatly enhance
its impact and engagement. Careful planning, editing, and
synchronizing of audio elements will give your audience a
richer and more polished experience.
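Several of the steps above (editing, volume adjustment, fades, and compression for delivery) can be prototyped in a few lines of Python. The sketch below again assumes the third-party pydub library with FFmpeg installed; the times, gain value, and file names are illustrative only.

# Trimming, adjusting, and exporting a clip for a multimedia project (assumes pydub + FFmpeg)
from pydub import AudioSegment

music = AudioSegment.from_file("background_music.wav")

clip = music[0:30 * 1000]     # keep the first 30 seconds (times are in milliseconds)
clip = clip - 6               # reduce volume by 6 dB so it does not overpower narration
clip = clip.fade_in(2000).fade_out(2000)   # 2-second fade in and fade out

# Export a compressed version for web delivery to keep loading times short
clip.export("background_music_web.mp3", format="mp3", bitrate="128k")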
UNIT-III

3.1 Multimedia Text


Multimedia text refers to content that combines various
forms of media, such as text, images, audio, video, and
interactive elements, to convey information or tell a
story. This type of content is commonly used in digital
communication, marketing, education, entertainment,
and many other fields. Multimedia text can be found in
various formats, including:

Websites: Most websites today incorporate multimedia
elements, such as text articles, images, videos, and
interactive features like animations or games.

E-books: E-books often include text, images, and
sometimes even multimedia elements like audio or video
clips to enhance the reading experience.

Digital Magazines: Digital magazines combine text
with multimedia elements like videos, animations, and
interactive graphics to engage readers.

Presentations: Presentation software like Microsoft
PowerPoint or Google Slides allows users to create
multimedia-rich presentations that combine text,
images, and videos.

Social Media: Social media platforms enable users to
share multimedia text in the form of posts, which can
include text, images, videos, and links.

Educational Materials: Educational resources often
incorporate multimedia to make learning more engaging
and effective. This includes interactive textbooks,
online courses, and educational videos.

Advertising and Marketing: Multimedia is extensively
used in advertising and marketing campaigns, from
text-based advertisements with images to full-blown
video commercials.

News and Journalism: News articles often include
images, videos, and interactive graphics to provide a
more comprehensive understanding of a story.

Entertainment: Multimedia is a core component of
entertainment, from movies and TV shows to video
games, which combine storytelling with audiovisual
elements.

Interactive Learning: Multimedia text can also be
used for interactive learning experiences, such as
language learning apps that combine text, audio, and
interactive exercises.

Multimedia text can be created using a variety of
different software applications, such as:
• Microsoft PowerPoint
• Google Slides
• Adobe InDesign
• Adobe Illustrator
• Adobe Photoshop

Multimedia text can also be created using online tools,
such as:
• Canva
• Piktochart
• Sway

When creating multimedia text, it is important to
consider the following:
Audience: Who is your target audience? What are their
needs and interests?
Purpose: What is the purpose of your multimedia text?
What message do you want to communicate?
Content: What type of content will you be using in your
multimedia text? How will you combine the different
elements of your multimedia text to create a cohesive
and engaging experience for your audience?
Design: How will you design your multimedia text?
What colors, fonts, and images will you use? How will
you arrange the different elements of your multimedia
text on the page or screen?
3.2 Image
What is an image?
An image is defined as a two-dimensional function,
F(x,y), where x and y are spatial coordinates, and the
amplitude of F at any pair of coordinates (x,y) is called
the intensity of the image at that point. When x, y, and
the amplitude values of F are finite, we call it a digital
image.

In other words, an image can be defined by a
two-dimensional array specifically arranged in rows and
columns.

A digital image is composed of a finite number of
elements, each of which has a particular value at a
particular location. These elements are referred to as
picture elements, image elements, or pixels; "pixel" is
the term most widely used to denote the elements of a
digital image.
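The idea of an image as a two-dimensional function sampled into rows and columns of pixels can be made concrete with a few lines of Python. The sketch below uses NumPy (an assumption, not part of the text) to build a tiny 4 x 4 grayscale image and read the intensity at one coordinate.

# A digital image as a 2-D array of pixel intensities (assumes NumPy is installed)
import numpy as np

# A 4 x 4 grayscale image: each element is the intensity F(x, y) in the range 0-255
image = np.array([[  0,  64, 128, 255],
                  [ 32,  96, 160, 224],
                  [ 16,  80, 144, 208],
                  [  8,  72, 136, 200]], dtype=np.uint8)

print(image.shape)      # (4, 4): rows and columns
print(image[2, 3])      # intensity of the pixel in row 2, column 3 -> 208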
3.2.1 Types of an image
1. BINARY IMAGE - As its name suggests, a binary image
contains only two pixel values, 0 and 1, where 0 represents
black and 1 represents white. This type of image is also
known as monochrome.
2. BLACK AND WHITE IMAGE - An image that consists only of
black and white pixels is called a black and white image.
3. 8-BIT COLOR FORMAT - This is one of the most widely used
image formats. It has 256 different shades of color and is
commonly known as a grayscale image, in which 0 stands for
black, 255 stands for white, and 127 stands for mid-grey.
4. 16-BIT COLOR FORMAT - This is a color image format with
65,536 different colors, also known as the high color
format. In this format the distribution of values is not
the same as in a grayscale image: the bits are divided
between three channels, Red, Green and Blue, the familiar
RGB format.
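The image types listed above correspond directly to image "modes" in common imaging libraries. As a small sketch (assuming the Pillow library and a hypothetical file photo.jpg), the code below converts a full-color photograph to an 8-bit grayscale image and to a 1-bit binary image.

# Converting between image types with Pillow (assumes: pip install pillow)
from PIL import Image

photo = Image.open("photo.jpg")        # 24-bit RGB color image (mode "RGB")

gray = photo.convert("L")              # 8-bit grayscale: 0 = black, 255 = white
binary = photo.convert("1")            # 1-bit binary (monochrome) image

gray.save("photo_gray.png")
binary.save("photo_binary.png")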

3.3 Coloring

Fig. 3.1. Colouring

3.3.1 Color lookup tables


A color look-up table (LUT) is a mechanism used to
transform a range of input colors into another range of
colors. The color look-up table converts the logical
color numbers stored in each pixel of video memory into
physical colors, represented as RGB triplets, which can
be displayed on a computer monitor. Each pixel of the
image stores only an index value or logical color number.
For example, if a pixel stores the value 30, the meaning
is "go to row 30 in the color look-up table". The LUT is
often called a palette.

3.3.2 Characteristic of LUT are following:


• The number of entries in the palette determines
the maximum number of colors which can appear on
screen simultaneously.
• The width of each entry in the palette determines
how many colors the full palette can represent.
A common example is a palette of 256 entries, so each
entry is addressed by an 8-bit pixel value. Each entry is
24 bits wide, 8 bits per channel, so each color can be
chosen from a full palette of 256 levels for each of the
red, green and blue components, giving 256 x 256 x 256 =
16,777,216 (about 16.7 million) colors.
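The look-up step itself is just an index into a table. The following NumPy sketch (an illustration with an arbitrary three-entry palette) shows how logical color numbers stored in pixels are turned into RGB triplets through a palette.

# Applying a color look-up table (palette) with NumPy
import numpy as np

# Palette: each row is an RGB triplet; the row number is the logical color number
palette = np.array([[  0,   0,   0],    # entry 0: black
                    [255,   0,   0],    # entry 1: red
                    [  0,   0, 255]],   # entry 2: blue
                   dtype=np.uint8)

# A tiny 2 x 3 indexed image: each pixel stores only a palette index
indexed = np.array([[0, 1, 2],
                    [2, 1, 0]], dtype=np.uint8)

rgb = palette[indexed]        # look each index up in the palette
print(rgb.shape)              # (2, 3, 3): height, width, RGB channels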

3.3.3 Colouring techniques


Coloring techniques in multimedia can be used to
create a variety of different effects, from simple to
complex. Some common coloring techniques in
multimedia include:
Hatching and cross-hatching: These techniques are
used to create different shades and textures using
parallel lines. They can be used to create realistic
effects, such as shading on a face or fur on an animal.

Stippling: This technique uses dots to create different
shades and textures. It can be used to create soft and
delicate effects, such as flower petals or a baby's skin.

Layering: This technique involves applying multiple
layers of color to create different effects. It can be used
to create depth and richness in an image, or to create
special effects, such as a glowing halo or a neon light.

Blending: This technique involves mixing two or more
colors together to create a new color. It can be used to
create smooth transitions between colors, or to create
unique and interesting effects, such as a rainbow or a
nebula.

Illustrations: Illustrations can be used in books,
magazines, websites, and other types of media.
Coloring techniques can be used to make illustrations
more visually appealing and engaging.

Animations: Animations can be used in movies, TV
shows, video games, and other types of media.
Coloring techniques can be used to create realistic and
dynamic animations.

Visual effects: Visual effects can be used in movies, TV
shows, and video games to create special effects, such
as explosions, magic, and other fantastical elements.
Coloring techniques can be used to make visual effects
more realistic and believable.

3.4 Image file formats


GIF (Graphics Interchange Format): Supports up to
256 colors, widely used for icons and simple diagrams.
Good for images with limited colors and animations.

JPEG (Joint Photographic Experts Group): Ideal for
photographs, uses lossy compression for smaller file
sizes. Standard for web images and digital cameras.

PNG (Portable Network Graphics): Supports 8, 24,
32, and 48-bit color and includes partial transparency.
Offers lossless compression and is superior to GIF.

TIFF (Tagged Image File Format): Flexible format
supporting various image types, from 1-bit to 24-bit
color. Can use lossless or lossy compression.

BMP (Bitmap): A basic, uncompressed format for Windows
applications. Not suitable for the internet but can be
compressed.

EPS (Encapsulated PostScript): A vector-based
format suitable for printed documents. Can be
imported into many applications.

PDF (Portable Document Format): Contains both
vector and pixel graphics with compression options.
Platform-independent and widely used for sharing
documents.

EXIF (Exchangeable Image File): Specifically for digital
cameras; stores camera and picture-taking information.

WMF (Windows MetaFile): Vector format for MS-Windows,
best viewed in Windows software.

PICT: Used in Macintosh software development but not
recommended for desktop publishing due to potential
corruption issues.

Photoshop (PSD): Adobe's native file format for Photoshop.
Commonly used for image editing and can be imported
into various applications.
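Converting between these image file formats is usually a matter of opening an image and saving it under a different extension. A minimal Pillow sketch follows; the file names and quality setting are placeholder assumptions.

# Saving an image in different file formats with Pillow (assumes Pillow is installed)
from PIL import Image

img = Image.open("diagram.png")                        # lossless PNG source

img.convert("RGB").save("diagram.jpg", quality=85)     # lossy JPEG for the web (JPEG has no alpha channel)
img.save("diagram.tif")                                # TIFF for print or archiving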
3.5 Multimedia graphics
Multimedia graphics are images, drawings, and
animations that are used in multimedia presentations.
They can be used to enhance the visual appeal of a
presentation, to explain concepts, to tell stories, or to
engage users.Here are some common types of
multimedia graphics:
1. Images: Static images, such as photographs,
illustrations, diagrams, and infographics, are
widely used in multimedia content to convey
information, set the mood, or provide context.
They can be found in websites, presentations, e-
books, and more.
2. Icons: Icons are simplified graphical
representations of objects, actions, or concepts.
They are commonly used in user interfaces,
websites, and applications to provide visual cues
and navigation elements.
3. Charts and Graphs: Multimedia graphics often
include charts, graphs, and data visualizations to
present complex information in a visually
accessible format. These are commonly used in
reports, presentations, and infographics.
4. Animations: Animations involve the movement
of graphical elements. They can be simple, like
animated GIFs, or more complex, such as 2D or
3D animations in videos, games, or interactive
multimedia presentations.
5. Videos: Videos are a powerful form of
multimedia graphics that combine moving
images, audio, and sometimes text to tell a story
or convey information. They are used in various
contexts, from entertainment to marketing and
education.
6. Interactive Graphics: Interactive graphics
allow users to engage with content by clicking,
dragging, or otherwise interacting with visual
elements. Examples include interactive maps,
product configurators, and data-driven
simulations.
7. Vector Graphics: Vector graphics are created
using mathematical formulas to define shapes
and lines. They can be scaled infinitely without
losing quality and are commonly used for logos,
illustrations, and animations.
8. 3D Graphics: Three-dimensional graphics are
used in video games, virtual reality (VR), and
simulations to create immersive environments
and experiences. They are also used in product
design and architectural visualization.
9. Web Graphics: Multimedia graphics on
websites include banner images, buttons,
backgrounds, and decorative elements that
contribute to the overall look and feel of a web
page.
10. Augmented Reality (AR) and Virtual Reality
(VR): AR and VR applications use multimedia
graphics extensively to create immersive
experiences. These graphics can range from 3D
models and animations to overlaying digital
information on the real world in AR.
11. Typography: While not traditionally thought of
as a graphic, typography plays a crucial role in
multimedia design. The choice of fonts, styles,
and layout can greatly impact the visual appeal
and readability of multimedia content.

Fig. 3.2. Multimedia graphics diagram

3.6 Development and editing images in multimedia
Development and editing of images are critical
components of multimedia content creation. Whether
you're creating a website, a video, a presentation, or
any other multimedia project, images play a significant
role in conveying information and engaging your
audience. Here's a step-by-step guide to the
development and editing of images in multimedia:
1. Image Acquisition:
• Capture: Start by capturing images using digital
cameras, smartphones, or other imaging
devices. Pay attention to composition, lighting,
and focus when taking photographs.
• Selection: Choose the best images that align
with your multimedia project's theme, message,
or story.

2. Image Import:
• Transfer: Transfer the selected images to your
computer or multimedia editing software.
Ensure they are in a suitable file format (e.g.,
JPEG, PNG) and resolution.

3. Image Editing:
• Crop and Resize: Use image editing software
like Adobe Photoshop or GIMP to crop and
resize images to fit the dimensions required for
your multimedia project (a small scripted sketch
of these editing steps appears at the end of this
section).
• Color Correction: Adjust brightness, contrast,
saturation, and color balance to enhance the
visual appeal of your images.
• Retouching: Remove imperfections, blemishes,
or unwanted objects from your images using
retouching tools.
• Filters and Effects: Apply filters and special
effects to create a specific visual style or mood
for your multimedia project.
• Layering: Use layers to add text, graphics, or
other elements on top of your images.
• Transparency: Make use of transparency and
opacity settings to blend images seamlessly into
your project.

4. Image Optimization:
• Compression: Optimize image file sizes for
faster loading on the web by using appropriate
compression techniques. Balancing quality and
file size is essential.
• File Format: Choose the right file format (e.g.,
JPEG, PNG, GIF) based on the type of image and
the intended use in your multimedia project.
5. Image Enhancement:
• Resolution: Ensure that images have the
appropriate resolution for your project. Higher
resolution images are necessary for print, while
web images can be lower in resolution.
• Color Profile: Use the correct color profile (e.g.,
sRGB for web, CMYK for print) to ensure
accurate color representation.

6. Image Integration:
• Layout: Place edited images within your
multimedia project's layout, considering the
overall design and composition.
• Interactivity: If your multimedia project is
interactive, set up any image elements for user
interaction, such as clickable areas or zoom
functionality.

7. Testing:
• Quality Control: Test your multimedia project
to ensure that images display correctly and look
as intended. Check for any visual artifacts,
distortion, or color issues.
8. Export and Delivery:
• Export Settings: When exporting your
multimedia project, use the appropriate settings
to maintain image quality and format
compatibility.
• Delivery Medium: Choose the appropriate
medium for delivering your multimedia project,
whether it's a website, video platform,
presentation software, or print publication.

9. Copyright and Attribution:


• Ensure you have the necessary rights or
permissions for the images you use in your
multimedia project. Properly attribute or license
images if required.

10. Updates and Maintenance:


• Keep your multimedia project up-to-date by
replacing or editing images as needed to reflect
changes or improvements.
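As a small scripted sketch of the editing steps referred to in step 3 (crop, resize, and a basic brightness/contrast adjustment), the Python code below uses the Pillow library; the file names, crop box, and adjustment factors are illustrative assumptions, not fixed requirements.

# Basic image editing for a multimedia project with Pillow (assumes Pillow is installed)
from PIL import Image, ImageEnhance

img = Image.open("source_photo.jpg")

img = img.crop((100, 50, 900, 650))          # crop box: left, top, right, bottom (pixels)
img = img.resize((640, 480))                 # resize to the dimensions the project needs

img = ImageEnhance.Brightness(img).enhance(1.1)   # +10% brightness
img = ImageEnhance.Contrast(img).enhance(1.2)     # +20% contrast

img.save("edited_photo.jpg", quality=90)     # export optimized for the target medium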
3.7 Text File format
Text file formats are used to store and exchange text
data. There are a variety of text file formats available,
each with its own advantages and disadvantages.
Some of the most common text file formats include:
• Plain text (TXT): TXT files are the simplest type of text
file. They contain only text characters, with no
formatting or other special characters. TXT files are
compatible with all major operating systems and
software applications.
• Rich Text Format (RTF): RTF files are more complex
than TXT files. They can contain formatting information,
such as bold, italics, and underlining. RTF files are also
compatible with most major operating systems and
software applications.
• Microsoft Word (DOCX): DOCX files are the native
file format for Microsoft Word. They can contain text,
formatting, images, and other elements. DOCX files are
the most popular text file format for creating and
sharing documents.
• Portable Document Format (PDF): PDF files are
designed to be portable and to be displayed and
printed consistently on any device. PDF files can
contain text, formatting, images, and other elements.
PDF files are also compatible with most major operating
systems and software applications.
Other text file formats include:
• HyperText Markup Language (HTML): HTML files
are used to create web pages. HTML files contain text,
formatting, and hyperlinks.
• Comma-Separated Values (CSV): CSV files are used
to store data in a tabular format. CSV files are often used
to exchange data between different software
applications.
• JavaScript Object Notation (JSON): JSON files are
used to store data in a lightweight and human-readable
format. JSON files are often used to exchange data
between web servers and web applications.
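To make the difference between the data-oriented formats concrete, the short Python sketch below writes the same small record set as CSV and as JSON using only the standard library; the field names and values are made up for illustration.

# Writing the same data as CSV and JSON (Python standard library only)
import csv
import json

records = [{"name": "intro.mp4", "duration": 120},
           {"name": "quiz.html", "duration": 300}]

# CSV: tabular, one row per record
with open("assets.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "duration"])
    writer.writeheader()
    writer.writerows(records)

# JSON: lightweight, human-readable, allows nested structures
with open("assets.json", "w") as f:
    json.dump(records, f, indent=2)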

3.8 Digital Imaging


Digital imaging converts printed text, artwork, and
photographs into digital images using a digital scanner
or another imaging device. Each digital image is
displayed on screen as a particular number of pixels or
dots.
Fig.3.3. Digital imaging

Each pixel is mapped onto a grid and stored precisely on
a computer. The tone value of each pixel determines the
hue or color of the image at that point. The value is
encoded in binary code, comprising "bits" of data.

These "bits" of information are read by the computer and
converted to an analog representation of the image. The
number of pixels per inch determines the image
resolution. The following are different for each digital
image:
• Dynamic range
• Bit depth
• File format
• File size
• Compression
Web pages, multimedia, booklets, graphic
presentations, and more are all created with digital
imagery.

3.8.1 History of Digital Imaging


Before digital imaging, Joseph Nicéphore Niépce of
France created the first photograph, "View from the
Window at Le Gras", in 1826. At the age of 28 he had
discussed with his brother Claude the possibility of
reproducing images with light, and from 1816 he began
working on his inventions, although for a long time the
brothers were more interested in designing a boat engine,
which Claude went to England to promote. Niépce was then
finally able to concentrate on photography, and in 1826
he captured his first photograph, a view from his window,
which required at least eight hours of light exposure.
The Bartlane cable picture transmission system created
the first digital image in 1920. This approach was
created by British inventors Harry G. Bartholomew and
Maynard D. McFarlane. “A succession of negatives on
zinc plates were exposed for varied durations of time,
thus producing varying densities,” according to the
procedure. The Bartlane cable picture transmission
technology produced a punched data card or tape
reconstructed as an image at both the transmitter and
reception ends.

Russell A. Kirsch created a system in 1957 that
combined a drum scanner and photomultiplier tube to
generate digital data that you could store in a
computer. For military and scientific missions,
including the KH-11 program, digital imaging was
developed in the 1960s and 1970s, partly to overcome
the operational shortcomings of film cameras. As
digital technology became more affordable in the
following decades, it supplanted older film-based
systems for various applications.
Frederick G. Weighart and James F. McNulty, a US radio
engineer, at Automation Industries, Inc., then in El
Segundo, California, co-invented the first apparatus to
generate a digital image in real time. The image was a
fluoroscopic digital radiograph, produced from square-wave
signals on a fluorescent screen.

3.8.2 The different aspects of digital imaging are:


1. Resolution: Determines image clarity and
detail. Measured in pixels per inch
(PPI/dpi).
2. Pixel Dimensions: Width and height of an
image in pixels. Calculated by multiplying
PPI/dpi by physical dimensions.
3. Bit Depth: Number of bits per pixel,
affecting color and tone. Black-and-white
uses 1 bit, grayscale uses 2-8 bits, and
color can use 8-24 bits (or more).
4. Dynamic Range: Tonal contrast from light
to dark in an image. Important for image
quality, especially in digital photography.
5. File Size: Calculated by multiplying the image
area by the bit depth and the square of the
PPI/dpi; typically indicated in bytes (see the
worked example after this list).
6. Compression: Methods to reduce file size,
including lossless (no data loss) and lossy
(minimal data loss) compression.
7. File Formats: Define how an image is
stored, including color, resolution,
compression, and metadata. Different
formats suit different needs.
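Worked example (illustrative figures): a 6 x 4 inch photograph scanned at 300 PPI contains (6 x 300) x (4 x 300) = 1800 x 1200 = 2,160,000 pixels. At a 24-bit depth (3 bytes per pixel), the uncompressed file size is about 2,160,000 x 3 = 6,480,000 bytes, roughly 6.5 MB, before any compression is applied.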

3.8.3 Digital Imaging and Digital Cameras:

Digital cameras have undergone significant
changes in recent years, with high-end options
now being widely used in advertising and
catalog photography. While consumer digital
cameras are becoming more affordable, their
capabilities may not be of interest to those
seeking higher-quality images. However, the
landscape is evolving rapidly, driven by Moore's
Law, which predicts that costs will halve and
performance will double every 18 months. This
means that eventually, more people will have
access to high-quality digital cameras.
Fig. 3.4. Digital Camera

It's essential to recognize that most photographs
we see in print have been digitally modified.
Digital processing has become standard in the
industry, with professionals sending their work
to magazines on CD-R discs. This eliminates the
need for physical transparencies, as a single CD-
R can store numerous high-resolution images,
each tailored to the photographer's
specifications.

This shift to digital imaging has implications for
various photography enthusiasts, including
amateurs, fine artists, and traditional darkroom
workers. In this comprehensive survey, we'll
explore the current state of digital imaging and
its impact on these different groups.

3.8.4 Digital Imaging and Computers:


To engage in digital image processing, you'll
need a powerful computer. Fortunately, modern
computers, priced around $3,000, offer
capabilities beyond serving as a digital
darkroom. For efficient image processing, a
Pentium II 266 MHz CPU is recommended, along
with a minimum of 64MB of RAM (128MB or more
is preferable). A rule of thumb is to have at least
four times the RAM as the file size you're working
with.

Fig. 3.5. Digital Imaging and Computers


Considering the large size of digital image files,
a substantial hard drive, preferably at least 2GB,
is essential. Many use ZIP drives, which can store
up to 100MB, to transmit files to labs and service
bureaus. CD-R drives are cost-effective for
archiving and submissions, with CDs offering
inexpensive storage options.

The display card and monitor screen are crucial.
A 17-inch screen is recommended, and the
graphics card should support a resolution of at
least 1280 x 1024 in 24-bit color.

3.8.5 Digital Imaging and Scanners:


Scanners are essential tools for digitizing slides
and negatives efficiently. While some businesses
offer scanning services, owning a scanner allows
you to distribute costs across numerous
photographs. It also provides greater control
over the scanning process.
Fig. 3.6. Digital Imaging and Scanners
Investing in a high-quality scanner is akin to
buying a top-notch enlarger. Look for scanners
with excellent resolution, as higher resolution
allows for larger prints. Nikon and Polaroid offer
desktop scanners with resolutions of up to 2700
PPI (pixels per inch), sufficient for producing
high-quality images comparable to traditional
photographic media.

Scanner selection also involves evaluating
scanning software, as it plays a crucial role in
image quality and processing.
3.8.6 Digital Imaging and Image
Manipulation:
Once an image is scanned, the real work begins
with digital image processing using software on
your computer. The digital realm offers precise
control over traditional darkroom adjustments,
such as brightness, contrast, color balance,
burning, and dodging. Adobe Photoshop V4.0 is
the industry-standard software for such tasks,
offering unparalleled versatility and power.

Adjustments made during image processing are
immediately visible on-screen, providing real-time
feedback. These modifications are saved as part of
the final image, simplifying future prints. Digital
image processing eliminates the limitations of the
traditional darkroom, offering a well-lit,
well-ventilated workspace without chemical fumes or
dermatological concerns.

3.8.7 Applications of Digital Imaging


Digital imaging has managed to make a mark in almost
every major field in the following ways:

Education
Teachers and students benefit from the enhanced
convenience and communication that digital
projectors, displays, and graphics provide to the
classroom, even though theft is a prevalent problem in
schools. Furthermore, obtaining a fundamental digital
imaging education for young workers is becoming
increasingly vital.

Fig. 3.7. Digital Education – Future of Learning


Medicine
A field of digital imaging that aims to help with disease
detection and treatment is rapidly expanding.
According to the American Academy of Pediatrics,
adequate imaging of children with appendicitis
reduces the number of appendectomies required.
Further developments include exact and accurate
imaging of the brain, lungs, tendons, and other bodily
parts—images that health practitioners may use to
serve patients better.

Fig. 3.8. Medical Imaging


Technology
Image sharpening and restoration involves improving
photographs captured by modern cameras, or modifying
images to obtain the desired result. This includes
zooming, blurring, sharpening, grayscale-to-color
conversion, picture recovery, and picture recognition.

Facial Recognition
Face recognition is a computer technology that
determines the positions and sizes of human faces in
digital photographs. It identifies facial features and
ignores everything else, such as architecture, trees,
and bodies.

Fig. 3.9. Facial Recognition


Remote Sensing
Remote sensing is the acquisition of data about an
object or an event, on a small or large scale, using a
recording or real-time sensing device that is not in
direct contact with the object. In practice, remote
sensing means collecting data with a variety of devices
in order to gather information about a specific object
or area.

Fig. 3.10. Remote Sensing

Pattern Detection
The study of pattern detection is based on image
processing. In pattern detection, image processing is
used to recognize elements in images, and machine
learning is then used to train a system on variations of
a pattern. Pattern detection is used in various
applications, including computer-aided analysis,
handwriting (calligraphy) detection, picture recognition,
and more.

Fig. 3.11. Pattern Recognition

3.8.8 Developments In Digital Imaging


Historic Digital Image Introduction: The
introduction of the first digital image in 1920 marked
the beginning of a transformative journey in
photography, setting the stage for revolutionary
changes.
Advancements in Equipment: Digital imaging has led
to the development of cheaper, more powerful, and
sleeker hardware, such as cameras, printers, and
scanners, making photography more accessible to
enthusiasts.
User-Friendly Software: The availability of user-
friendly software has empowered both novice and
advanced users to engage in sophisticated digital
imaging on powerful computers.

Internet's Influence: The emergence of the internet
has reshaped photography through online photo-
sharing platforms like Flickr and Instagram, allowing
billions of people to share diverse snapshots of daily
life and expanding the subjects of photography.

Changing Perception: These advancements have
fundamentally altered how we perceive photography
and photographers, moving beyond traditional
portraits to encompass a wide range of visual
narratives.
3.8.9 Advantages of Digital Imaging:
• Digital imaging simplifies the viewing of
images and documents.
• It enables the digitization of books and
documents, making vast libraries
accessible worldwide.
• Medical professionals benefit from
electronic image transfer to consultants
and insurance providers.
• Digital imaging is environmentally
friendly, eliminating the need for chemical
processing.
• It is widely used to archive historical,
scientific, and personal events.
• Digital imaging reduces the physical
handling of original photos.
• It allows for the reconstruction of partially
damaged images without altering the
originals.
• Photographers have more creative
freedom and time with digital imaging.
• Camera phones make digital photography
accessible and instant.
• Digital imaging aids in self-identification
for the younger generation.

3.8.10 Drawbacks of Digital Imaging:


• There is a risk of image manipulation by
editors, photographers, and journalists.
• Staff photographers may become more like
camera operators than photojournalists.
• Editors have greater control over what is
captured.
• Copyright infringement concerns may
arise as digital copying becomes easier.

3.9 Scanning and digital photography in multimedia
Scanning and digital photography are two essential
methods for capturing visual content that plays a crucial
role in multimedia creation. Each method has its
advantages and use cases within multimedia
production:
3.9.1 Scanning:
1. Use Case: Scanning is typically used when you
want to digitize physical documents, printed
images, artwork, or photographs for multimedia
projects.
2. Equipment: You need a scanner, which is a
device designed to convert physical images or
documents into digital format. Scanners come in
various types, including flatbed scanners for
photos and documents, film scanners for slides
and negatives, and handheld scanners for on-
the-go digitization.
3. Process:
• Place the physical item (such as a photo or
document) face down on the scanner bed.
• Start the scanning software and select the
desired settings (e.g., resolution, color
mode).
• Initiate the scan process, and the scanner
captures the image or document and
converts it into a digital file.
• Save the scanned image in a suitable file
format (e.g., JPEG, TIFF, PDF).
4. Advantages:
• Scanning produces high-resolution, high-
quality digital images suitable for printing
or detailed multimedia projects.
• It is ideal for preserving old photographs,
documents, or artwork in digital form.

3.9.2 Digital Photography:


1. Use Case: Digital photography is used when you
want to capture new images or scenes directly in
a digital format for multimedia projects.
2. Equipment: Digital cameras (including DSLRs,
mirrorless cameras, compact cameras, and
smartphone cameras) are used for capturing
digital photos. Depending on the camera type,
you can have various lens options and settings
for creative control.
3. Process:
• Frame the subject or scene through the
camera's viewfinder or screen.
• Adjust camera settings (e.g., exposure,
ISO, shutter speed, aperture) to achieve
the desired image quality.
• Capture the image by pressing the shutter
button.
• The camera saves the photo as a digital
file (e.g., JPEG, RAW).

4. Advantages:
• Digital photography offers immediate
feedback and allows for real-time
adjustments.
• It is versatile, suitable for capturing a wide
range of subjects, from portraits to
landscapes to action shots.
• Digital cameras often have features like
image stabilization, autofocus, and
various shooting modes for different
situations.
3.9.3 Considerations for Multimedia:
1. Resolution: The resolution of scanned or
digitally captured images should match the
intended use within your multimedia project.
High-resolution images are essential for print,
while web and screen displays may require
lower resolutions.
2. File Format: Choose the appropriate file format
for your images based on the project's
requirements. JPEG is common for web and
multimedia, while TIFF and RAW offer more
editing flexibility.
3. Editing: Regardless of the capture method,
image editing and post-processing (e.g., color
correction, cropping, retouching) are often
necessary to ensure images meet the project's
visual requirements.
4. File Organization: Maintain a well-organized
library of digital images, whether obtained
through scanning or digital photography, to
easily locate and use them in your multimedia
projects.

Both scanning and digital photography are valuable
techniques in multimedia production, and the choice
between them depends on the specific content,
purpose, and desired image quality. Integrating these
methods effectively can enhance the visual appeal and
quality of your multimedia projects.
UNIT IV
4.1 ANIMATION
Animation is the process of creating the illusion of
movement and change in a visual format, typically
using a sequence of images or frames. It is a versatile
art form used in various media, including film,
television, video games, web content, advertising,
and educational materials.

4.1.1 Types of Animation:


• 2D Animation: Traditional hand-drawn
animation involves creating individual frames
on paper or digitally. It is often used in classic
cartoons and some contemporary animation.
• 3D Animation: Computer-generated 3D
animation uses digital models and
environments to create lifelike characters and
scenes. It is common in films, video games, and
architectural visualization.
• Stop Motion Animation: This technique
involves capturing individual frames of physical
objects or puppets, moving them slightly
between frames. It is used in films like "Wallace
and Gromit."
• Claymation: A type of stop motion where
characters and scenes are created from clay or
similar materials.
• Motion Graphics: Combines text, graphics,
and animation to convey information or enhance
visual storytelling. Common in title sequences
and commercials.
• 2.5D Animation: A hybrid technique that
combines elements of 2D and 3D animation to
create the illusion of depth while maintaining a
2D aesthetic.

4.1.2 Principles of Animation:


1. Squash and Stretch: Give objects weight and
flexibility.
2. Anticipation: Prepare the viewer for actions
with visual cues.
3. Staging: Arrange elements for clarity and focus.
4. Straight Ahead and Pose to Pose: Choose
animation methods.
5. Follow Through and Overlapping Action: Add
realism to movement.
6. Slow In and Slow Out (Easing): Create smooth
transitions.
7. Arcs: Use curved paths for natural motion.
8. Secondary Action: Enhance primary actions
with complementary movements.
9. Timing: Control the pace of actions for
believability.
10. Exaggeration: Make movements expressive
and engaging.
11. Solid Drawing: Understand 3D space for
realistic characters.
12. Appeal: Design visually interesting and
memorable characters.

4.1.3 Computer-generated animation


• Animation space.
• Animation techniques.
Animation space
Animation can be rendered in:
• 2-D space - 2-D animations are very simple and
static.
• 2-1/2D space - An illusion of depth is created
through shadowing, highlighting, and forced
perspective, though in reality the image rests in
two dimensions.
• 3-D space - Complicated and realistic animations
are done in 3-D space.

4.1.4 Animation techniques


• Animation process.
• Cel animation.
• Computer animation.

Animation process
The steps to be followed in creating animation are:
• Organize the execution in a series of logical
steps.
• Choose an animation tool best suited for the job.
• Build and tweak the sequences.
• Post-process the completed animation.
Cel animation
• Cel animation is a technique in which a series of
progressively different graphics are used on
each frame of movie film.
• The term "cel" is derived from the clear celluloid
sheets that were used for drawing each frame.
• Cel animation begins with key frames.
• Keyframes refer to the first and the last frame of
an action.
• The frames in between the keyframes are drawn
in the tweening process.
• Tweening depicts the action that takes place
between keyframes.
• Tweening is followed by the pencil test.
Computer animation.
• Computer animation is very similar to cel
animation.
• The primary difference is in how much must be
drawn by the animator and how much is
automatically generated by the software.
• Kinematics is the study of the movement and
motion of structures that have joints.
• Inverse kinematics is the process of linking
objects, and defining their relationship and
limits.
• Morphing is an effect in which a still or moving
image is transformed into another.

4.2 Kinematics
Kinematics is the study of motion without considering
the forces that cause it. It is a subfield of physics that is
used in a wide variety of applications, including
robotics, animation, and video games.
Multimedia can be used to create effective and
engaging learning experiences for kinematics. For
example, interactive simulations can allow students to
explore different types of motion and see how they are
affected by different parameters. Animations can also
be used to illustrate complex concepts in a clear and
concise way.
Here are some examples of how multimedia can be
used to explain kinematics concepts with diagrams:
• Motion diagrams: Motion diagrams show the position
of an object at different time intervals. They can be used
to illustrate the concepts of displacement, velocity, and
acceleration, for example the motion of a ball thrown
into the air.
• Velocity vectors: Velocity vectors show the direction
and magnitude of an object's velocity. They can be used
to illustrate the concepts of relative velocity and
acceleration, for example the velocity vectors of two cars
moving in different directions.
• Free body diagrams: Free body diagrams show all of
the forces acting on an object. They can be used to
analyze the motion of an object and to determine the
forces required to cause it to move in a certain way, for
example the forces acting on a ball thrown into the air.
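As a simple illustration of these ideas (using the standard equations v = u - g t and s = u t - 0.5 g t^2, with an assumed initial speed), the Python sketch below tabulates the height and velocity of a ball thrown straight up; this is exactly the data a motion diagram or interactive simulation would be built from.

# Position and velocity of a ball thrown straight upward (simple kinematics)
g = 9.8          # acceleration due to gravity, m/s^2
u = 9.8          # assumed initial upward speed, m/s

for step in range(5):
    t = step * 0.5                      # time in seconds
    s = u * t - 0.5 * g * t ** 2        # displacement (height) in metres
    v = u - g * t                       # velocity in m/s (negative = moving down)
    print(f"t = {t:.1f} s  height = {s:.2f} m  velocity = {v:.2f} m/s")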

4.3 Morphing
Morphing is a technique that transforms one image or
shape into another. It is commonly used in movies,
cartoons, and video games to create the illusion of
smooth and fluid motion.

To create a morph, two images or shapes are first
aligned. This means that the corresponding features in
each image are matched up. Once the images are
aligned, a series of in-between images are generated.
These in-between images are a blend of the two
original images, with the percentage of each image
decreasing as the morph progresses.
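The in-between images of a simple cross-dissolve morph are just weighted blends of the two aligned source images. A minimal Pillow sketch (assuming two same-sized files, face_a.jpg and face_b.jpg) is shown below; a true feature-based morph also warps the geometry, which this sketch omits.

# Generating in-between frames by blending two aligned images (assumes Pillow)
from PIL import Image

start = Image.open("face_a.jpg").convert("RGB")
end = Image.open("face_b.jpg").convert("RGB")    # must be the same size as the start image

frames = 5
for i in range(frames):
    alpha = i / (frames - 1)                     # 0.0 = all start image, 1.0 = all end image
    blend = Image.blend(start, end, alpha)       # pixel-by-pixel weighted average
    blend.save(f"morph_{i}.jpg")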

Morphing can be used to create a variety of effects,
such as:
• Transforming one person into another
• Changing the expression on a face
• Animating objects
• Creating special effects in movies and video games
Morphing is a powerful technique that can be used to
create visually stunning and engaging content. Here are
some additional examples of morphing:
• A morph between two images of a car, showing it
transforming from one model to another.
• A morph between two images of a landscape, showing
it changing from summer to winter.
• A morph between two images of a logo, showing it
evolving over time.
Morphing can be used to create a wide variety of
effects, and it is a popular technique in many different
industries.

4.4 Animation s/w tools and techniques


There are a variety of animation software tools and
techniques available, depending on the type of
animation you want to create. Some of the most popular
tools include:
• Adobe Animate: A powerful tool for creating 2D
animations, Adobe Animate offers a wide range of
features, including vector drawing, tweening, and lip-
syncing.
• Blender: A free and open-source 3D creation suite,
Blender can be used to create a variety of different
types of animations, including 3D cartoons, motion
graphics, and realistic simulations.
• Autodesk Maya: A professional-grade 3D animation
software, Maya is used by studios all over the world to
create high-quality feature films, TV shows, and video
games.
• Cinema 4D: A versatile 3D animation software,
Cinema 4D is known for its ease of use and its wide
range of features, including procedural modeling,
motion graphics, and character animation.
• Cartoon Animator 4: A specialized 2D animation
software, Cartoon Animator 4 is designed to make it
easy to create realistic-looking 2D animations with
minimal effort.
• Frame-by-frame animation: This traditional
animation technique involves creating each frame of
animation individually. It is a time-consuming process,
but it produces the highest quality animations.
• Tweening: Tweening is a technique that allows you to
create animations by specifying the start and end
positions of an object; the software generates the
in-between frames (a minimal sketch follows this list).
This is a much faster technique than frame-by-frame
animation, but it is not as flexible.
• Motion capture: Motion capture is a technique that
records the movements of a real person or object and
then transfers those movements to a digital character.
This is a very popular technique for creating realistic
animations in movies and video games.
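A minimal sketch of the tweening idea mentioned above: given the start and end positions of an object and a number of frames, linear interpolation generates every in-between position (real tools add easing curves on top of this).

# Linear tweening between a start and an end position
def tween(start, end, frames):
    """Return the interpolated positions for each frame, including both endpoints."""
    return [start + (end - start) * i / (frames - 1) for i in range(frames)]

# Move an object from x = 0 to x = 100 over 6 frames
print(tween(0, 100, 6))   # [0.0, 20.0, 40.0, 60.0, 80.0, 100.0]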

4.5 Multimedia video


Video is the technology of electronically capturing,
recording, processing, storing, transmitting, and
reconstructing a sequence of still images representing
scenes in motion.
• Video is an excellent tool for delivering
multimedia.
• Video places the highest performance demand
on computer and its memory and storage.
• Digital video has replaced analog video as the
method of choice for making and delivering
video for multimedia.
• Digital video device produces excellent finished
products at a fraction of the cost of analog.
• Digital video eliminates the image-degrading
analog-to-digital conversion.
• Many digital video sources exist, but getting the
rights can be difficult, time-consuming, and
expensive.
Fig.4.1 Multimedia video representation

4.5.1 Analogue Video


• Video information that is stored using television
video signals, film, videotape or other non-
computer media
• Each frame is represented by a fluctuating
voltage signal known as an analogue wave form
or composite video.
• Composite analogue video has all the video
components:
o brightness, colour and
synchronization
• Then combined into one signal for delivery
Example: traditional television

Problems: colour blending, low clarity, high
generation loss, difficult to edit.

4.5.2 Digital Video


Digital video combines features of graphics and audio
to create dynamic content for multimedia products.
• Video is simply moving pictures.
• Digitized video can be edited more easily.
• Digitized video files can be extremely large.
• Digital video is often used to capture content
from movies and television to be used in
multimedia.
• A video source (video camera, VCR, TV, or videodisc) is connected to a video capture card in a computer.
• As the video source is played, the analog signal
is sent to the video card and converted into a
digital file (including sound from the video).
• Video clip stored on any mass-storage device
can be played back on a computer’s monitor
without special hardware.
• Setting up a production environment for making digital video requires certain hardware.
• Typical requirements include a computer with a FireWire connection and cables, a fast processor, plenty of RAM, and a fast, large hard disk.

4.5.3 File Size and Formats


File size is an important consideration for digitized video. It is determined by:
1. Frame rate
2. Image size
3. Color depth

1. Frame Rate
• Animation is an illusion caused by the rapid
display of still images.
• Television (NTSC) plays at about 30 fps and film at 24 fps, but acceptable playback can be achieved at 15 fps.

Fig.4.2 Frame rate


2. Image Size
• A standard full-screen resolution is 640x480 pixels, but to save storage space a 320x240 video is still acceptable for a computer display.
• New high-definition televisions (HDTV) are
capable of resolutions up to 1920×1080p60,
• 1920 pixels per scan line by 1080 scan lines,
progressive, at 60 frames per second.
Fig.4.3 Image Size
3. Color Depth
• The quality of video depends on the color depth (the number of colors) of each bitmap in the frame sequence.

Fig.4.4 Color Quality


• A color depth below 256 colors produces a poorer-quality image.
• Reducing the frame rate below 15 fps causes a noticeable and distracting jerkiness that is unacceptable.
• Reducing the image size and compressing the file therefore become the primary ways of reducing file size.

Fig.4.5 Color Depth (24-bit, 16-bit, and 8-bit/256 colors)
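To see how frame rate, image size, and color depth interact, the following back-of-the-envelope Python sketch (with assumed example values) estimates the storage needed for uncompressed video:

def uncompressed_video_size(width, height, bytes_per_pixel, fps, seconds):
    """Approximate size in megabytes of raw, uncompressed video."""
    frame_size = width * height * bytes_per_pixel      # bytes per frame
    total_bytes = frame_size * fps * seconds
    return total_bytes / (1024 * 1024)                 # convert to MB

# One minute of 640x480, 24-bit color (3 bytes/pixel), 30 fps video:
print(round(uncompressed_video_size(640, 480, 3, 30, 60)))   # ~1582 MB
# The same clip at 320x240, 8-bit color (1 byte/pixel), 15 fps:
print(round(uncompressed_video_size(320, 240, 1, 15, 60)))   # ~66 MB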

4.5.4 Video Editing Terminology


Non-linear
Refers to the editing of disk-based digital video. The software provides an on-screen map of what the final video sequence should look like, incorporating the edits, splices, special effects, transitions, and sound tracks.
Transitions
Transitions such as fades, wipes, splatters, scrolls, stipples, and many more are applied by simply dragging and dropping the chosen transition between two video clips.
Superimposing
• The ability to superimpose one clip over another is a valuable technique.
• The technique of green screening is identical to blue screening, except that the color green is used for the screen and later digitally removed.
• Blue screen and green screen are just two of the superimposing techniques available.

4.5.5 Video Compression


• The video compression/decompression
programs are used so that video can fit on a
single CD and the speed of transferring video
from a CD to the computer can be increased.
• For example, one second of uncompressed video at 25 fps can be roughly 25 MB.
• CD-ROM transfer rate is calculated as follows:
1X= 150KB per second
10X=1.5 MB per second
100X= 15 MB per second
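Putting these figures together, a short Python sketch (using the rough 25 MB-per-second figure above) shows how much compression a codec must achieve for video to stream from a CD-ROM drive:

# Uncompressed data rate from the example above: ~25 MB per second of video
video_rate_kb = 25 * 1024            # ~25 MB/s expressed in KB/s

# CD-ROM transfer rates (1X = 150 KB per second)
for speed in (1, 10, 100):
    cd_rate_kb = speed * 150
    ratio = video_rate_kb / cd_rate_kb
    print(f"{speed}X drive: need roughly {ratio:.0f}:1 compression")

# 1X drive: need roughly 171:1 compression
# 10X drive: need roughly 17:1 compression
# 100X drive: need roughly 2:1 compression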

To overcome large video size, CODECS were


developed.

Fig.4.6 Video Compression


• A digital video compression scheme, or codec, is the algorithm used to compress (encode) a video for delivery.
• The codec then decodes the compressed video
in real-time for fast playback.
• Streaming audio and video starts playback as
soon as enough data has transferred to the user’s
computer to sustain this playback.
Two Types of Compression
Lossless compression
Preserves the exact image throughout the compression and decompression process.
E.g.: for text or images, repeating patterns are identified and assigned a short code.

Lossy compression
Eliminates some of the data in the image and therefore provides greater compression ratios than lossless compression. It is applied to video because a slight drop in quality is not noticeable in moving images.
Two types of CODEC (lossy):
Spatial compression
• a digital compression of video data that
compresses the size of the video file by
compressing the image data of each frame
• Compression is done by removing redundancy
from data in the same frame.
Temporal compression
• a digital compression of video data that uses
similarities of sequential frames over time to
determine and store only the image data that
differs from frame to frame.
• Compression is done by removing similarity
between successive video frames
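The following minimal Python sketch illustrates the idea behind temporal compression; it is not any real codec, but it shows how storing only the pixels that change between successive frames can shrink the data:

def frame_delta(prev_frame, next_frame):
    """Return only the pixels that differ between two frames.

    Frames are flat lists of pixel values; the delta is a list of
    (index, new_value) pairs, usually far smaller than a full frame.
    """
    return [(i, new) for i, (old, new) in enumerate(zip(prev_frame, next_frame))
            if old != new]

def apply_delta(prev_frame, delta):
    """Reconstruct the next frame from the previous frame plus the delta."""
    frame = list(prev_frame)
    for i, value in delta:
        frame[i] = value
    return frame

frame1 = [10, 10, 10, 10, 10, 10]
frame2 = [10, 10, 99, 10, 10, 12]        # only two pixels changed
delta = frame_delta(frame1, frame2)       # [(2, 99), (5, 12)]
assert apply_delta(frame1, delta) == frame2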

4.6 Broadcast Video Standards


Analog Display Standards
1. National Television Standards Committee
(NTSC):
• These standards define a method for
encoding information into electronic
signal that creates a television picture.
• It has screen resolution of 525 horizontal
scan lines and a scan rate of 30 frames per
second
2. Phase Alternate Line (PAL) and Sequential
Color and Memory (SECAM):
• PAL has a screen resolution of 625
horizontal lines and a scan
rate of 25 frames per second.
• SECAM has a screen resolution of 625
horizontal lines and is a 50 Hz system.
• SECAM differs from NTSC and PAL color
systems in its basic technology and
broadcast method.
3. Advanced Television Systems Committee
(ATSC) Digital Television (DTV):
• This digital standard provides TV stations
with sufficient bandwidth to present four
or five Standard Television (STV) signals
or one High Definition TV (HDTV) signal.
• This standard allows for transmission of
data to computers and for new Advanced
TV (ATV) interactive services.
Fig.4.7 ATSC & DTV
Disadvantages of using Video
• Is expensive to produce
• Requires extensive memory and storage
• Requires special equipment
• Does not effectively illustrate abstract concepts
and static situations

4.7 Digital video production and editing techniques
Digital video production and editing techniques are
essential skills for creating high-quality video content.
Whether you're producing videos for entertainment,
marketing, education, or any other purpose, mastering
these techniques can significantly improve the final
product. Here are some key steps and techniques
involved in digital video production and editing:

4.7.1 Pre-Production: Planning and Preparation


1. Concept and Scripting:
• Start by developing a clear concept or
idea for your video.
• Write a script that outlines the dialogue,
scenes, and actions.
2. Storyboarding:
• Create a visual storyboard to plan the
sequence of shots and transitions.
3. Casting and Location Scouting:
• Select the talent (actors or presenters) and
scout suitable filming locations.
4. Equipment Setup:
• Choose the right cameras, microphones,
lighting equipment, and accessories for
your project.
4.7.2 Production: Capturing Footage
5. Shooting Techniques:
• Use proper camera techniques, including
framing, composition, and camera
movement (panning, tilting, tracking,
etc.).
6. Lighting:
• Ensure proper lighting to achieve the
desired mood and visibility.
• Use three-point lighting (key light, fill
light, and backlight) for interviews and
staged scenes.
7. Audio:
• Capture clean and high-quality audio.
• Use external microphones, windshields,
and audio recorders if necessary.
• Monitor audio levels and avoid
background noise.
8. Multiple Takes:
• Shoot multiple takes of each scene to
ensure you have options during the
editing phase.
4.7.3 Post-Production: Editing and Enhancement
9. Video Editing Software:
• Choose a video editing software (e.g.,
Adobe Premiere Pro, Final Cut Pro,
DaVinci Resolve) that suits your needs.
10. Import Footage:
• Transfer and organize your video clips in
the editing software's timeline.

11. Editing Techniques:


• Cut, trim, and arrange clips to create a coherent narrative.
• Use transitions (cuts, fades, dissolves, etc.) to smooth the changes between shots.
12. Color Correction and Grading:
• Adjust the color balance, contrast, and
saturation to achieve the desired look and
mood.
13. Audio Editing:
• Enhance and clean up audio using
equalization, noise reduction, and audio
effects.
• Add background music and sound effects.
14. Titles and Graphics:
• Incorporate titles, lower thirds, and
graphics to provide context and
information.

15. Special Effects and Visual Effects (VFX):


• Add visual effects, such as motion
graphics, CGI, or compositing, if
required.
16. Transitions and Effects:
• Apply transitions (e.g., crossfades, wipes)
and visual effects to enhance storytelling.

4.7.4 Export and Distribution


17. Export Settings:
• Select the appropriate export settings,
including resolution, frame rate, and file
format.

18. Rendering:
• Render the final video, which may take
some time depending on the project's
complexity.
19. Review and Feedback:
• Review the video with colleagues or
clients for feedback and revisions.
20. Distribution:
• Share the video on platforms such as
YouTube, Vimeo, social media, or your
website.
• Consider optimizing the video for
different platforms and devices.
21. Analytics and Promotion:
• Track the video's performance using
analytics tools and promote it to reach a
wider audience.
22. Archiving and Backup:
• Archive your project files and backup all
assets to prevent data loss.
Digital video production and editing require a
combination of technical skills and creative
storytelling. Continuous learning and practice are
essential for improving your proficiency in these
techniques and producing compelling video content.

4.8 Video File Formats


Video file formats are standardized formats that
determine how video data is compressed, encoded,
and stored in a digital file. Different video file formats
are designed for various purposes, including playback
on different devices, streaming, editing, or archival
purposes. Here are some common video file formats:
MP4 (H.264):
• MP4 (MPEG-4 Part 14) is one of the most widely
used video formats.
• It uses the H.264 video codec, which provides
excellent compression while maintaining high-
quality video.
• MP4 files are compatible with most devices and
platforms, making it ideal for online streaming
and sharing.

AVI (Audio Video Interleave):


• AVI is an older video format developed by
Microsoft.
• It supports various video and audio codecs,
making it versatile but sometimes less efficient
in terms of file size.
• AVI files can be edited easily and are compatible
with many media players.
MKV (Matroska):
• MKV is an open-source multimedia container
format.
• It supports high-quality video and audio codecs,
including H.264, H.265 (HEVC), and various
subtitle formats.
• MKV files are known for their flexibility and are
often used for high-definition content.
MOV (QuickTime):
• MOV is a video format developed by Apple and
is commonly associated with QuickTime.
• It supports various video and audio codecs and
is widely used on Mac devices.
• MOV files are suitable for editing in software like
Final Cut Pro.
WMV (Windows Media Video):
• WMV is a video format developed by Microsoft
for Windows platforms.
• It offers good compression and is suitable for
streaming over the internet.
• WMV files are compatible with Windows Media
Player.
FLV (Flash Video):
• FLV is a video format primarily associated with
Adobe Flash.
• It is commonly used for web-based video
content, including online streaming and video
sharing platforms.
• FLV files can be played with Adobe Flash Player.
WebM:
• WebM is an open and royalty-free video format
developed for web use.
• It uses VP9 or VP8 video codecs and is often used
for HTML5 video playback in web browsers.
• WebM files are known for their high-quality
compression.
MPEG-2:
• MPEG-2 is a standard video format commonly
used for DVDs and some broadcast television.
• It offers good video quality but can result in
larger file sizes compared to more modern
codecs.
MPEG-4:
• MPEG-4 is a versatile video format used for
various purposes, including video streaming,
video conferencing, and mobile devices.
• It supports a wide range of codecs and features.
DivX and Xvid:
• DivX and Xvid are popular video codecs known
for their high compression efficiency.
• These formats are often used for sharing video
content over the internet and can be played with
compatible media players.
UNIT V

5. Multimedia Project
5.1 Stages of Project
Here are the four basic stages in a multimedia project:
1. Planning and costing
2. Designing and producing
3. Testing
4. Delivering

Fig.5.1 Stages of multimedia project


Planning and Costing:
● A project always begins with an idea or a need, which you refine by outlining its messages and objectives.
● Identify how you will make each message and objective work within your authoring system.
● Before you begin developing, plan what writing skills, graphic art, music, video, and other multimedia expertise will be required.
● Develop a creative graphic look and feel, as well as a structure and navigation system that will let the viewer visit the messages and content.
● Estimate the time needed to complete all elements and prepare a budget.
● Work up a short prototype or proof-of-concept.
Designing and producing:
Perform each of the planned tasks to create a finished product. During this stage, there may be many feedback cycles with a client until the client is happy.
Testing:
Test your programs to make sure that they meet
the objectives of your project, work properly on the
intended delivery platforms, and meet the needs of
your client or end user.

Delivering:
Package and deliver the project to the end user.
Be prepared to follow up over time with tweaks,
repairs, and upgrades.

5.2 Multimedia Skills


Leonardo da Vinci was the archetypal Renaissance man: scientist, architect, builder, creative designer, craftsman, and poet folded into one. To produce good
multimedia, you will need a similar diverse range of
skills—detailed knowledge of computers, text, graphic
arts, sound, and video. These skills, the multimedia
skill set, may be available in a single individual or,
more likely, in a composite of individuals working as a
team. Complex multimedia projects are, indeed, often
assembled by teams of artists and computer
craftspeople, where tasks can be delegated to those
most skilled in a particular discipline or craft. Many job
titles and collaborative team roles for multimedia
development are being adapted to pull from a mix of
motion picture industry, radio and television
broadcasting, and computer software industry
experiences.

5.2.1 Multimedia Team


Often, individual members of multimedia production
teams wear several hats: graphic designers may also
do interface design, scanning, and image processing.
A project manager or producer may also be the video
producer or scriptwriter. Depending upon the scope
and content of your project and the mix of people
required, according to Wes Baker, a professor at
Cedarville University in Cedarville, Ohio, a multimedia
production team may require as many as 8 discrete
roles, including:

Fig-5.2 Multimedia Team


The highly skilled multimedia project professionals can be classified as:

1. Project Manager
2. Multimedia designer
3. Interface designer
4. Writer
5. Video specialist
6. Audio specialist
7. Multimedia programmer
8. Website Producer.

1. Project Manager
● The project manager is the leader of the project and is responsible for the overall development and implementation of the project, as well as its day-to-day operations.
● Project managers are also called program managers; they are responsible for the following areas:
● Design the product.
● Managing the team
1.1 Designing the product
● Design consists of devising a vision for the product and working out its complete functionality with the design team.
● Complete functional specifications are then devised, which are necessary to improve the performance of the product design.
1.2 Managing the team
The project manager has the responsibility of:
● Scheduling tasks
● Assigning tasks
● Controlling activities
● Allocating resources
● Conducting meetings
● Managing milestones

1.3 Other responsibilities

Good project managers must completely understand the strengths and limitations of hardware and software, so that they can make good decisions about what to do and what not to do.
2. Multimedia Designer
● The look and feel of a multimedia project should be pleasing and engaging; the screen should present an attractive mix of color, shape, and type.
● The multimedia designer should maintain visual
consistency throughout the multimedia project.
● The multimedia designer is more responsible of
overall content of the project.
● He must be able to create a structure for the
content and determines a design elements
required to support the multimedia project.
● He also decides which media is more
appropriate for presentation. The multimedia
designer prepares a blueprint for the entire
project content media and interactions.
● The multimedia designer needs a variety of
skills. He must know how to analyze content
structure and match up with effective
presentable methods.
● He must be an expert on different media types and capable of integrating media to create the overall design. He should also have strong interpersonal skills.
● He must be able to “TALK THE TALK” with all
other team members and clients.
● The designer must understand the capabilities of the available resources, both technological and human, and also know when to use such resources and when to stop.
3. Interface Designer
The interface designer devises the navigation pathways for the multimedia project. The role of an interface designer is to create a software device that organizes the multimedia content for presentation on the screen.
A good interface designer creates a product that rewards exploration and encourages the user.
4. Writer
● Creation of characters, actions, and concepts.
● Creation of interactive dialogue between the characters.
● Creation of proposals and text screens to deliver messages.
● Writing dialogue, narration, and other types of voice-based content.
5. Video Specialist
● Responsible for an entire team of videographers, sound technicians, lighting designers, script supervisors, production assistants, and actors.
● He must be a skilled person managing all phases of production, from content to the final stage.
● He must understand the potential and limitations of the medium and how these will affect the final product.
● He must decide, for example, whether to build a set or to shoot on location without one.
6. Audio Specialist
● Audio elements can make or break a multimedia program; the audio specialist makes it come alive.
● He is responsible to designing and producing
music, voice over narrations and sound effects.
● He is responsible for locating and selecting
suitable music for the action.
● He makes schedules for audio recording.
● He is responsible for digitizing the audio into
computer file.
7. Multimedia Programmer
● A multimedia programmer integrates all the multimedia elements of a project into a seamless whole, using authoring tools and programming languages such as HTML and scripting languages.
● Multimedia programmers must be familiar with
Macintosh and windows platform.
● He must be able to learn and understand the
multimedia project.

8. Website Producer
● Website producers are the programmers who design multimedia-based websites. They must be able to link and implement a complex website with many areas of content and user messaging.
● He must interact with all levels of management.
● He should have knowledge of HTML tables, frames, and forms.
● He must also have knowledge of scripting languages and tools such as Photoshop.
5.3 Multimedia Authoring
Multimedia authoring is the process of assembling different types of media content, such as text, audio, images, animation, and video, into a single stream of information with the help of various software tools. It provides the framework for organizing and editing the components of a multimedia project and enables the developer to create interactive presentations by combining text, audio, video, graphics, and animation.

5. 3.1 Features of Authoring Tools


Editing Features - Most authoring environments and packages provide the ability to create, edit, and transform the different kinds of media that they support.
For example, Macromedia Flash comes bundled with
its own sound editor. This eliminates the need for
buying dedicated software to edit sound data. So
authoring systems include editing tools to create, edit
and convert multimedia components such as animation
and video clips.
Organizing Features- The process of organization,
design and production of multimedia involve
navigation diagrams or storyboarding and
flowcharting. Some of the authoring tools provide a
system of visual flowcharting or overview facility to
showcase your project's structure at a macro level.
Navigation diagrams help to organize a project. Many
web-authoring programs like Dreamweaver include
tools that create helpful diagrams and links among the
pages of a website.
Visual programming with icons or objects - This is the simplest and easiest authoring approach. For example, to play a sound you simply click on its icon.
Programming with a scripting language - Authoring software offers the ability to write scripts to build features that are not supported by the software itself. With a script you can perform computational tasks, sense user input and respond to it, create characters, drive animation, launch other applications, and control external multimedia devices (a minimal sketch follows below).
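The sketch below gives the flavour of such scripting. It is written in Python purely for illustration; real authoring tools use their own scripting languages (for example, Lingo in Director or ActionScript in Flash), and every function here is a hypothetical stand-in for an authoring tool's built-in commands:

# Hypothetical, stubbed-out helpers standing in for an authoring tool's API.
def play_media(filename):
    print(f"Playing {filename}")

def go_to_page(page):
    print(f"Jumping to {page}")

def on_button_click(button_name):
    """Sense user input (a button click) and respond."""
    if button_name == "play_sound":
        play_media("welcome.wav")
    elif button_name == "next_page":
        go_to_page("page 2")
    else:
        print(f"No action defined for '{button_name}'")

on_button_click("play_sound")   # Playing welcome.wav
on_button_click("next_page")    # Jumping to page 2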
Document Development tools - Some authoring tools offer direct importing of pre-formatted text, indexing facilities, complex text search mechanisms, and hypertext linking tools.
Interactivity Features- Interactivity empowers the
end users to control the content and flow of information
of the project. Authoring tools may provide one or more
levels of interactivity.
Simple branching- Offers the ability to go to another
section of the multimedia production.
Conditional branching- Supports a go to base on the
result of IF-THEN decision or events.
Playback Features - When you are developing a multimedia project, you will continuously be assembling elements and testing to see how the assembly looks and performs. An authoring system should therefore have a playback facility.
Supporting CD-ROM or Laser Disc Sources - This software allows overall control of CD drives and laser discs in order to integrate audio, video, and computer files. CD-ROM drives, video, and laserdisc sources are directly controlled by authoring programs.
Supporting Video for Windows - When video stored on the hard disk is the right medium for your project, the authoring software should support digital video elements such as Video for Windows.
Hypertext - Hypertext capabilities can be used to link graphics, some animation, and other text. The Windows help system is an example of hypertext. Such systems are very useful when a large amount of textual information is to be represented or referenced.
Cross-Platform Capability- Some authoring
programs are available on several platforms and
provide tools for transforming and converting files and
programs from one to the other.
Run-time Player for Distribution - Run-time software is often included in authoring software to ease the distribution of your final product by packaging playback software with the content. Some advanced authoring programs provide special packaging and run-time distribution for use with devices such as CD-ROM.
Internet Playability - Because the Web has become a significant delivery medium for multimedia, authoring systems typically provide a means to convert their output so that it can be delivered within the context of HTML or DHTML.
5.3.2 Authoring Tools Classification

Fig-5.3 Authoring Tools

5.3.2.1 Card or Page based authoring tools


In these authoring systems, elements are organized as
pages of a book or a stack of cards. In the book or stack
there are thousands of pages or cards available. These
tools are best used when the bulk of your content
consists of elements that can be viewed individually, for
example the pages of a book or file cards in a card file. You can jump from page to page because all pages can be interrelated. In the authoring system you can organize pages or cards in a sequential manner. Every page of the book may contain many media elements such as sounds, videos, and animations.
One page may have a hyperlink to another page that
comes at a much later stage and by clicking on the same
you might have effectively skipped several pages in
between. Some examples of card or page tools are:
● Hypercard (Mac)
● Tool book (Windows)
● PowerPoint (Windows)
● Supercard (Mac)

5.3.2.2 Icon based or Event driven authoring tools
Icon-based tools give a visual programming approach
to organizing and presenting multimedia. First you
build a structure or flowchart of events, tasks and
decisions by dragging appropriate icons from a
library. Each icon does a specific task, for example, playing a sound or opening an image. The flowchart
graphically displays the project's logic. When the
structure is built you can add your content text,
graphics, animation, video movies and sounds. A
nontechnical multimedia author can also build
sophisticated applications without scripting using icon
based authoring tools. Some examples of icon based
tools are:
● Authorware Professional (Mac/Windows)
● Icon Author (Windows)

5.3.2.3 Time based authoring tools


Time based authoring tools allow the designer to
arrange various elements and events of the multimedia
project along a well defined time line. By time line, we
simply mean the passage of time. As the time advances
from starting point of the project, the events begin to
occur, one after another. The events may include media
files playback as well as transition from one portion of
the project to another. The speed at which these
transitions occur can also be accurately controlled.
These tools are best for projects in which the information flows from beginning to end, much like a movie. Some examples of time-based tools are:
● Macromedia's Director
● Macromedia Flash

5.3.2.4 Object-Oriented authoring tools


Object-oriented authoring tools support an environment based on objects. Each object has the following two characteristics:
5.3.2.4.1 State or Attributes - The state or attributes
refers to the built in characteristics of an object. For
example, a color T.V has the following attributes:
● Color receiver
● Volume control
● Picture control
● 128 channels
● Remote control unit
5.3.2.4.2 Behaviour or Operations - The behaviour or
operations of an object refers to its action. For example,
a T.V can behave in any of the following manner at a
given point of time:
● Switched on
● Switched off
● Displays picture and sound from
● A TV cable connection
● A TV transmitter
● A DVD
● A VCR
In these systems, multimedia elements and events are often treated as objects that live in a hierarchical order of parent and child relationships. These objects
use messages passed among them to do things
according to the properties assigned to them. For
example, a video object will likely have a duration
property i.e how long the video plays and a source
property that is the location of the video file. This video
object will likely accept commands from the system
such as play and stop. Some examples of the object
oriented tools are:
● mTropolis (Mac/Windows)
● Apple Media Tool (Mac/Windows)
● Media Forge (Windows)
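A minimal sketch of such a video object, written in Python for illustration only (real object-oriented authoring tools have their own object models and message-passing syntax):

class VideoObject:
    """Illustrative multimedia object with attributes (state) and operations (behaviour)."""

    def __init__(self, source, duration):
        self.source = source        # location of the video file (attribute)
        self.duration = duration    # how long the video plays, in seconds (attribute)
        self.playing = False

    def play(self):                 # behaviour: respond to a 'play' message
        self.playing = True
        print(f"Playing {self.source} for {self.duration} seconds")

    def stop(self):                 # behaviour: respond to a 'stop' message
        self.playing = False
        print(f"Stopped {self.source}")

intro = VideoObject("intro.mov", duration=12)   # hypothetical clip
intro.play()
intro.stop()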
5.4 PLANNING AND COSTING
5.4.1 The process of making multimedia

Fig. Multimedia Planning


Idea Analysis
• Before beginning a multimedia project, it is
necessary to determine its scope and content.
• Balance is the key principle in idea analysis.
• The aim is to generate a plan of action that will
become the road map for production.
• It is necessary to continually weigh the purpose
or goal against the feasibility and the cost of
production and delivery.
• This can be done dynamically by adding
elements to or subtracting elements from a
project.
• An additive process involves starting with minimal capabilities and gradually adding elements.
• A subtractive process involves discarding unnecessary elements from a fully developed project.
Idea analysis involves finding answers to questions
like:
• Who is the intended audience? What are their
needs?
• What multimedia elements will best deliver the
message?
• What hardware, software, and storage capacity
would be required?
• How much time, effort, and money would be
needed?
• How will the final product be distributed?
Project management software includes:
• Microsoft Project.
• Designer's Edge.
• Screenplay System's Screenwriter and Story
View.
• Outlining programs.
• Spreadsheets.
CPM - Project management software typically provides Critical Path Method (CPM) scheduling functions to calculate the total duration of a project based upon each identified task and its prerequisites.
PERT - Program Evaluation Review Technique (PERT) charts provide graphic representations of task relationships.
Gantt charts - depict all the tasks along a timeline.
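As a rough illustration of the CPM calculation that such software performs, the Python sketch below (with invented tasks and durations) computes each task's earliest finish time from its prerequisites; the project duration is the largest of these:

from functools import lru_cache

# Hypothetical tasks: name -> (duration in days, list of prerequisite tasks)
tasks = {
    "script":     (5,  []),
    "storyboard": (3,  ["script"]),
    "graphics":   (10, ["storyboard"]),
    "audio":      (6,  ["script"]),
    "authoring":  (8,  ["graphics", "audio"]),
    "testing":    (4,  ["authoring"]),
}

@lru_cache(maxsize=None)
def earliest_finish(task):
    """Earliest finish of a task = its duration + the latest finish among its prerequisites."""
    duration, prereqs = tasks[task]
    return duration + max((earliest_finish(p) for p in prereqs), default=0)

# The project duration is the latest finish over all tasks (the length of the critical path)
print(max(earliest_finish(t) for t in tasks))   # 30 days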
Pre-testing
• Involves defining project goals in fine detail and
spelling out what it will take in terms of skills,
content, and functions to meet these goals.
• Work up a prototype of the project on paper to
help you relate your ideas to the real world.

Task planning
Task planning involves:
• Designing the instructional framework.
• Holding creative idea sessions.
• Determining the delivery platform and authoring
platform.
• Assembling the team.
• Building a prototype, producing audio and
video, testing the functionality, and delivering
the final product.
Development
Prototype development:
• Also known as a proof-of-concept or feasibility
study.
• Involves testing of the initial implementation of
ideas, building mock-up interfaces, and
exercising the hardware platform.
• Trial calculations are possible after prototyping.
• A written report and an analysis of budgets allow
the client some flexibility and also provide a
reality check for developers.
• Alpha development – At this stage, the
investment of effort increases and becomes
more focused. More people get involved.
• Beta development – At this stage, most of the features of a project are functional. Testing is done by a wider group of outside testers.
Delivery
• In the delivery stage, the project is said to be
"going gold.”
• The concerns shift towards the scalability of the
project in the marketplace.
Scheduling
• Milestones are decided at this stage.
• The time required for each deliverable, that is
the work products delivered to the client, is
estimated and allocated.
• Scheduling is difficult for multimedia projects
because multimedia creation is basically artistic
trial and error.
• Scheduling is also difficult because computer
hardware and software technology are in
constant flux.
• At this stage, clients need to approve or sign off
on the work created.
• Any revisions of previously approved material
would require a change order.
• A change order stipulates that the additional cost
of revising previously approved material should
be borne by the client.
• When negotiating with a client, limit the number
of revisions allowed.
Estimating
• Cost estimation is done by analyzing the tasks
involved in a project and the people who build
it.
• The hidden costs of administration and
management are also included in the cost
estimates.
• A contingency rate of 10 to 15 percent of the total
cost should be added to the estimated costs.
• Contractors and consultants can be hired, but
they should be billed at a lower rate.
• Ensure that contractors perform the majority of
their work off-site and use their own equipment
to avoid classifying them as employees.
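A simple worked example of this estimating arithmetic, as a Python sketch with invented figures:

# Hypothetical line-item estimates, in dollars
estimates = {
    "project development": 12000,
    "production":          30000,
    "testing":              5000,
    "distribution":         8000,
}

subtotal = sum(estimates.values())          # 55000
contingency = round(subtotal * 0.15)        # 15% contingency (10 to 15% is typical) -> 8250
total = subtotal + contingency
print(f"Subtotal: {subtotal}, Contingency: {contingency}, Total bid: {total}")
# Subtotal: 55000, Contingency: 8250, Total bid: 63250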
Hardware:
• Hardware is the most common limiting factor for
realizing a multimedia idea.
• List the hardware capabilities of the end-user's
platform.
• Examine the cost of enhancing the delivery
platform.
• The most common delivery platforms require a
monitor resolution of 800X600 pixels and at least
16- bit color depth.
The categories of expenses incurred for producing
multimedia are:
• Project development costs
• Production costs
• Testing costs
• Distribution costs
Project development costs
These include:
• Salaries
• Client meetings
• Acquisition of content
• Communication
• Travel
• Research
• Proposal and contract prep
Production costs
Production costs can further be classified as:
• Management costs
• Content acquisition costs
• Content creation costs
• Graphics production costs
• Audio production costs
• Video production costs
• Authoring costs.
Testing Costs
These include:
• Salaries
• Facility rental
• Printing costs
• Food and incentives
• Coop fees (payment for participation)
• Editing
• Beta program

Distribution costs
These include:
• Salaries
• Documentation
• Packaging
• Manufacturing
• Marketing
• Advertising
• Shipping
RFPs and bid proposals.
• Request for Proposals (RFPs):
• These are formal documents from large corporations that are "outsourcing" their multimedia development work.
• They provide information about the scope of work and the bidding process.
• They are generally not very detailed and specific.
Bid proposals:
• Should contain an executive summary or an
overview.
• The backbone of the proposal is the estimate and
project plan, which describes the scope of the
work.
• The cost estimates for each phase or deliverable
milestone and the payment schedules should
also be included.
• Should contain the graphic and interactive goals
of the project.
• Prepare a brief synopsis if a project is
complicated.
• Lists the terms and conditions of the contract.
• The terms of a contract should include a
description of the billing rates, Invoicing policy,
third-party licensing fees, and a disclaimer for
liability and damages.
• Design the proposal according to a client's
expectations.
• A proposal should appear plain and simple, yet
businesslike.
• A table of contents or an index is a
straightforward way to present the elements of a
proposal in condensed overview.
• Need analysis and description describes the
reasons the project is being put forward.
• It is necessary to describe the target audience
and the target platform.
• Creative strategy – This section describes the
look and feel of a project. This is useful if the
reviewing executives were not present for the
preliminary discussions.
• Project implementation – This section contains a
detailed calendar, charts, and lists of specific
tasks with associated completion dates,
deliverables, and work hours.
Testing and Delivering
• It is important to test and review a project to
ensure that:
• It is bug-free, accurate, and operationally and
visually on target.
• The client's requirements have been met.
• The reputation of the developer/company is not
damaged by a premature or erroneous release.
• Cross-platform issues are addressed by
comprehensive testing on different hardware
and software platforms.
Difficulties in testing:
• The performance of a multimedia project
depends on hardware and software
configurations, and the end-user's connection
speed.
• Few computer configurations are identical.
• The Macintosh environment is sensitive to
certain extensions that conflict with some
software applications
Types of Testing:
• Alpha testing.
• Beta testing.
• Final release.
Alpha Testing
• An alpha release is the first working draft of a project.
• An alpha release of a project is only for internal
circulation.
• Alpha testing is usually done “in-house” by team
members.
• Alpha releases are expected to have problems or to
be incomplete.

Beta Testing
• Beta testing is done with a wider array of testers.
• Beta testers should be representative of real
users.
• These testers should be people who were not
involved with the actual production.
• Beta level bugs are typically less virulent than
alpha bugs.
• Managing beta test feedback is critical.

Final Release
• The terms such as “bronze” or “release
candidate” are used to identify products that are
near completion.
• The final release version is usually called the
“gold master.”
Prerequisites for Delivering a Product
• After a multimedia project is complete, modify
the files so that they can be transferred from the
media to the user’s platform.
• A setup program is required to install a project
on a user’s computer.
• Programs like MindVision's Installer VISE and Aladdin's InstallerMaker help create installers.
• It is important to provide well-written
documentation about the installation process.
• The documentation must also list potential
problems, constraints, and appropriate warning
messages.
• It is useful to include a file, README.TXT or
ReadMeFirst, on the distribution disc of a
project.
• The file should contain a detailed description of
the installation process.
• The README.TXT document includes a
description of changes or bugs reported since
the documentation was printed.
• Set up a product-related Web site with pages for
registering software, reporting bugs, providing
technical support, and program upgrades.
• Using compression programs.
• Creating file archives.
• Creating self-extracting archives.
Using Compression Programs
• Use a shareware or commercial compression
utility for compressing and decompressing files.
• WinZip, DiskDoubler, and StuffIt Deluxe are commonly used compression utilities.
• Most compression utilities also provide an
encryption or security feature, which helps hide
classified data.
Creating File Archives
• One or more files of a project can be compressed
into a single file, known as an archive.
• Compressed files take less time to transmit than
uncompressed files.
• When an archive is decompressed, each
individual file in the archive is reconstituted.
• Archives are recognized by their file name
extensions.
• Self-extracting archives are used to deliver
projects on discs in a compressed form.
• Self-extracting files allow a user to run the
executable archive.
• The compressed files are automatically
decompressed and placed on the hard disk.
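For illustration, a simple archive can be created with Python's standard zipfile module (the file names below are hypothetical); in practice the dedicated utilities mentioned above are normally used:

import zipfile

# Compress several project files into a single archive
files_to_ship = ["setup.exe", "readme.txt", "media/intro.mov"]   # hypothetical files

with zipfile.ZipFile("project_archive.zip", "w", zipfile.ZIP_DEFLATED) as archive:
    for name in files_to_ship:
        archive.write(name)

# Later, decompression reconstitutes each individual file
with zipfile.ZipFile("project_archive.zip") as archive:
    archive.extractall("installed_copy")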

5.5 Multimedia-Looking Towards Future


The future of multimedia promises exciting
developments and transformations as technology
continues to advance and the ways in which we
consume and interact with media evolve. Here are
some key trends and directions to look for in the future
of multimedia:
5.5.1 Digital Communication and New Media
Digital communication and new media have already
had a profound impact on the way we live, work, and
communicate. In the future, this impact is only likely to
grow.
Some of the key trends in digital communication and
new media that we can expect to see in the coming
years include:
The rise of artificial intelligence (AI): AI is already
being used in a variety of ways in digital
communication and new media, such as in chatbots,
personalized recommendations, and content
generation. In the future, AI is expected to play an even
greater role, enabling more immersive and interactive
experiences.
The growth of immersive technologies such as
virtual reality (VR), augmented reality (AR), and mixed
reality (MR). These technologies are already being
used in a variety of ways, such as in gaming,
entertainment, and education. In the future, they are
expected to become more mainstream and be used in
a wider range of applications.
The rise of social media as a primary source of news
and information for many people. Social media
platforms are also increasingly being used for
businesses and organizations to communicate with
their customers and stakeholders. In the future, social
media is expected to become even more integrated
into our everyday lives.

5.5.2 Interactive television (iTV)


Interactive television is a type of television that allows
viewers to interact with the content they are watching.
This can be done through a variety of methods, such as
remote control, voice commands, and gesture
recognition.
iTV is still in its early stages of development, but it has
the potential to revolutionize the way we watch
television. For example, iTV could allow viewers to:
● Personalize their viewing experience by
choosing which channels and content they want
to see.
● Participate in live events, such as sports games
and concerts.
● Play games and interact with other viewers.

5.5.3 Digital Broadcasting


Digital broadcasting is the transmission of television
and radio signals in a digital format. This offers a
number of advantages over traditional analog
broadcasting, such as improved picture and sound
quality, more channels, and interactive features.

Digital broadcasting is now the standard in most


countries, and it is expected to continue to grow in the
future. As digital broadcasting becomes more
widespread, we can expect to see a wider range of
content and services available to viewers and listeners.

5.5.4 Digital Radio


Digital radio is a type of radio that transmits signals in a
digital format. This offers a number of advantages over
traditional analog radio, such as better sound quality,
more channels, and interactive features.

Digital radio is still in its early stages of development in some countries, but it is growing rapidly. As digital
radio becomes more widespread, we can expect to see
a wider range of content and services available to
listeners.
5.5.5 Multimedia Conferencing
Multimedia conferencing is a type of conferencing that
allows participants to communicate with each other
using a variety of media, such as audio, video, and text.
This can be done through a variety of platforms, such as
video conferencing software, web conferencing
services, and telepresence systems.

Multimedia conferencing is becoming increasingly popular in the workplace, as it allows businesses to
reduce travel costs and improve communication and
collaboration between employees in different
locations. In the future, multimedia conferencing is
expected to become even more sophisticated and
widespread.

5.6. Multimedia Design Concept


• Designing a multimedia project requires
knowledge and skill with computers, talent in
graphics, arts, video, and music, and the ability
to conceptualize logical pathways.
• Designing involves thinking, choosing, making,
and doing.
5.6.1 Designing the Structure
• The manner in which project material is
organized has just as great an impact on the
viewer as the content itself.
• Mapping the structure of a project should be
done early in the planning phase.
• Navigation maps are also known as site maps.
• They help organize the content and messages.
• Navigation maps provide a hierarchical table of
contents and a chart of the logical flow of the
interactive interface.
• Navigation maps are essentially non-linear.

There are four fundamental organizing structures:


Linear - Users navigate sequentially, from one frame of
information to another.
Hierarchical - Users navigate along the branches of a
tree structure that is shaped by the natural logic of the
content. It is also called linear with branching.
Non-linear - Users navigate freely through the content,
unbound by predetermined routes.
Composite - Users may navigate non-linearly, but are
occasionally constrained to linear presentations.
• The navigation system should be designed in
such a manner that viewers are given free
choice.
• The architectural drawings for a multimedia
project are storyboards and navigation maps.
• Storyboards are linked to navigation maps
during the design process, and help to visualize
the information architecture.

A user can design their product using two types of structures:
Depth structure - Represents the complete navigation
map and describes all the links between all the
components of the project.
Surface structure - Represents the structures actually
realized by a user while navigating the depth structure.
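A navigation map can be thought of as a small graph. The Python sketch below (with invented page names) represents a depth structure as a dictionary of links and checks that a surface structure, one route a viewer actually took, only follows links that exist:

# Depth structure: every page and all the links leaving it (hypothetical site)
depth_structure = {
    "home":    ["gallery", "video", "about"],
    "gallery": ["home", "video"],
    "video":   ["home"],
    "about":   ["home"],
}

def is_valid_path(path, links):
    """Check that a user's surface structure only follows links present in the depth structure."""
    return all(nxt in links[page] for page, nxt in zip(path, path[1:]))

# Surface structure: the route one viewer actually took
surface_structure = ["home", "gallery", "video", "home", "about"]
print(is_valid_path(surface_structure, depth_structure))   # True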

Hotspots:
• Add interactivity to a multimedia project.
• The three categories of hotspots are text,
graphic, and icon.
• The simplest hot spots on the Web are the text
anchors that link a document to other documents.
Hyperlinks - A hotspot that connects a viewer to
another part of the same document, a different
document, or another Web site is called a hyperlink.
Image maps - Larger images that are sectioned into
hot areas with associated links are called image maps.
Icons - Icons are fundamental graphic objects
symbolic of an activity or concept.
• Buttons - A graphic image that is a hotspot is
called a button.
• Plug-ins such as Flash, Shockwave, or JavaScript enable users to create plain or animated buttons.
• Small JPEG or GIF images that are themselves
anchor links can also serve as buttons on the
Web.
• Highlighting a button is the most common
method of distinguishing it.
• It is essential to follow accepted conventions for
button design and grouping, visual and audio
feedback, and navigation structure.
• Avoid hidden commands and unusual
keystroke/mouse click combinations.

5.6.2 Designing the User Interface


• The user interface of a project is a blend of its
graphic elements and its navigation system.
• The simplest solution for handling varied levels
of user expertise is to provide a modal interface.
• In a modal interface, the viewer can simply click
a Novice/Expert button and change the
approach of the whole interface.
• Modal interfaces are not suitable for multimedia
projects.
• The solution is to build a project that can contain
plenty of navigational power, which provides
access to content and tasks for users at all levels.
• The interface should be simple and user-
friendly.

5.6.3 Graphical user interface (GUI)


• The GUIs of Macintosh and Windows are
successful due to their simplicity, consistency,
and ease of use.
• GUIs offer built-in help systems, and provide
standard patterns of activity that produce the
standard expected results.
Graphical approaches that work:
• Plenty of "non-information areas," or white space
in the screens.
• Neatly executed contrasts.
• Gradients.
• Shadows.
• Eye-grabbers.

5.6.4 Graphical approaches to avoid:


• Clashes of color.
• Busy screens.
• Requiring more than two button clicks to quit.
• Too many numbers and words.
• Too many substantive elements presented too
quickly.

5.6.5 Audio interface:


• A multimedia user interface can include sound
elements.
• Sounds can be background music, special
effects for button clicks, voice-overs, effects
synced to animation. Always provide a toggle
switch to disable sound.
