Video (Unit - 4)

VIDEO

 Since the first silent film flickered to life, people have been fascinated with
“motion” pictures.
 To this day, motion video is the element of multimedia that can draw gasps from a
crowd at a trade show or firmly hold a student’s interest in a computer-based learning
project.
 Digital video is the most engaging of multimedia venues, and it is a powerful tool for
bringing computer users closer to the real world. It is also an excellent method for
delivering multimedia to an audience raised on television.
 But take care! Video that is not thought out or well produced can degrade your
presentation.

USING VIDEO :

 Carefully planned, well-executed video clips can make a dramatic difference in a
multimedia project.
 Before deciding whether to add video to your project, however, it is essential to have
an understanding of the medium, its limitations, and its costs. Of all the multimedia
elements, video places the highest performance demand on your computer or
device—and its memory and storage.
 Consider that a high-quality color still image on a computer screen could require as
much as a megabyte or more of storage memory.
 Multiply this by 30—the number of times per second that the picture is replaced to
provide the appearance of motion—and you would need at least 30 megabytes of
storage to play your video for one second, more than 1.8 gigabytes of storage for a
minute, and 108 gigabytes or more for an hour.
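The arithmetic above can be checked in a few lines, using the text's round figures of 1 MB per frame and 30 frames per second:

```python
# Uncompressed-video storage estimate, using the round figures from the
# text: about 1 MB per full-color frame, replaced 30 times per second.
MB_PER_FRAME = 1
FPS = 30

per_second_mb = MB_PER_FRAME * FPS       # 30 MB for one second
per_minute_mb = per_second_mb * 60       # 1800 MB, i.e. 1.8 GB, per minute
per_hour_gb = per_minute_mb * 60 / 1000  # 108 GB for one hour

print(per_second_mb, per_minute_mb, per_hour_gb)
```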
 Some of the hottest and most arcane multimedia technologies and research efforts
have dealt with compressing digital video image data into manageable streams of
information.
 Compression (and decompression), using special software called a codec, allows a
massive amount of imagery to be squeezed into a comparatively small data file, which
can still deliver a good viewing experience on the intended viewing platform during
playback.
 You can design a project to meet a specific compression standard, such as MPEG-2 for
Digital Versatile Disc (DVD) playback or MPEG-4 for home video.
 You can install a superfast RAID (Redundant Array of Independent Disks) system
that will support high-speed data transfer rates.
 You can include instructions in your authoring system that will spool video clips into
RAM, ready for high-speed playback before they need to play.
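The squeeze a codec achieves can be put in rough numbers. The sketch below compares the raw 30 MB/s rate from the earlier calculation with an assumed 6 Mbit/s compressed stream; that target is a typical DVD-class bit rate used here for illustration, not a standardized value:

```python
# Illustrative codec compression-ratio calculation. The raw rate follows
# the text's 30 MB/s figure; the 6 Mbit/s target is an assumed, typical
# DVD-class bit rate, not a value fixed by any standard.
raw_rate_bits = 30 * 1_000_000 * 8  # 30 MB/s expressed as bits per second
target_rate_bits = 6 * 1_000_000    # ~6 Mbit/s compressed stream

ratio = raw_rate_bits / target_rate_bits
print(f"compression ratio ~{ratio:.0f}:1")  # about 40:1
```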
HOW IT WORKS :

 When light reflected from an object passes through a video camera lens, that light is
converted into an electronic signal by a special sensor called a charge-coupled device
(CCD).
 Top-quality broadcast cameras and even camcorders may have as many as three
CCDs (one for each color of red, green, and blue) to enhance the resolution of the
camera and the quality of the image.
 It’s important to understand the difference between analog and digital video.
 Analog video has a resolution measured in the number of horizontal scan lines (due
to the nature of early cathode-tube cameras), but each of those lines represents
continuous measurements of the color and brightness along the horizontal axis, in a
linear signal that is analogous to an audio signal.
 Digital video signals consist of a discrete color and brightness (RGB) value for each
pixel. Digitizing analog video involves reading the analog signal and breaking it into
separate data packets. This process is similar to digitizing audio, except that with
video the vertical resolution is limited to the number of horizontal scan lines.
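A digital frame is just a grid of discrete RGB values. As a rough illustration (assuming 8 bits per channel, so 3 bytes per pixel), even a modest 640 × 480 frame approaches the megabyte figure quoted earlier:

```python
# One digital frame as discrete RGB values: an (R, G, B) byte triple per
# pixel, assuming 8 bits per channel. Frame size here is illustrative.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 640, 480, 3

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
print(frame_bytes)                     # 921600 bytes per frame
print(round(frame_bytes / 2**20, 2))   # roughly 0.88 MB uncompressed
```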

Analog Video :

 In an analog system, the output of the CCD is processed by the camera into three
channels of color information and synchronization pulses (sync) and the signals
are recorded onto magnetic tape.
 There are several video standards for managing analog CCD output, each dealing
with the amount of separation between the components—the more separation of
the color information, the higher the quality of the image.
 If each channel of color information is transmitted as a separate signal on its own
conductor, the signal output is called component (separate red, green, and blue
channels), which is the preferred method for higher-quality and professional video
work.
 Lower in quality is the signal that makes up Separate Video (S-Video), using two
channels that carry luminance and chrominance information.
 The least separation (and thus the lowest quality for a video signal) is composite,
when all the signals are mixed together and carried on a single cable as a
composite of the three color channels and the sync signal.
 The analog video and audio signals are written to tape by a spinning recording
head that changes the local magnetic properties of the tape’s surface in a series of
long diagonal stripes. Because the head is canted or tilted at a slight angle
compared with the path of the tape, it follows a helical (spiral) path, which is
called helical scan recording.
 Tracking is the fine adjustment of the tape during playback so that the tracks are
properly aligned as the tape moves across the playback head.
 The signal is modulated on either Channel 3 or Channel 4, and the resulting signal is
demodulated by the TV receiver and displayed on the selected channel. Many
television sets today also provide a composite signal connector, an S-Video connector,
and a High-Definition Multimedia Interface (HDMI) connector for purely digital
input.
 Video displays for computers typically provide analog component (red, green, blue)
input through a 15-pin VGA connector and also a purely digital Digital Visual
Interface (DVI) and/or an HDMI connection.

BROADCAST VIDEO STANDARDS :

 Three analog broadcast video standards are commonly in use around the world:
NTSC, PAL, and SECAM.
 In the United States, the NTSC standard has been phased out, replaced by the ATSC
Digital Television Standard.
 Because these standards and formats are not easily interchangeable, it is important to
know where your multimedia project will be used.
 A video cassette recorded in the United States (which uses NTSC) will not play on a
television set in any European country (which uses either PAL or SECAM).

NTSC :

 The United States, Canada, Mexico, Japan, and many other countries used a
system for broadcasting and displaying video that is based upon the specifications
set forth by the 1952 National Television Standards Committee (NTSC).
 These standards define a method for encoding information into an
electronic signal that creates a television picture.
 It has a screen resolution of 525 horizontal scan lines and a scan rate of 30
frames per second.
PAL
 The Phase Alternate Line (PAL) system was used in the United Kingdom,
Western Europe, Australia, South Africa, China, and South America.
 PAL has a screen resolution of 625 horizontal lines and a scan rate of 25
frames per second.
SECAM
 The Sequential Color and Memory system was used in France, Eastern
Europe, the former USSR, and a few other countries.

 SECAM has a screen resolution of 625 horizontal lines and is a 50 Hz system.


 SECAM differs from NTSC and PAL color systems in its basic technology and
broadcast method.
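The three analog standards can be summarized side by side. The field rate shown is twice the frame rate, since each interlaced frame is delivered as two fields:

```python
# The analog broadcast standards above, summarized as data. Each
# interlaced frame is two fields, so field rate = 2 x frame rate.
standards = {
    "NTSC":  {"lines": 525, "fps": 30},
    "PAL":   {"lines": 625, "fps": 25},
    "SECAM": {"lines": 625, "fps": 25},  # the "50 Hz" refers to field rate
}

for name, s in standards.items():
    print(f"{name}: {s['lines']} lines, {s['fps'] * 2} fields/s")
```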
DIGITAL VIDEO :
 In digital systems, the output of the CCD is digitized by the camera into a sequence of
single frames, and the video and audio data are compressed before being written to
tape or digitally stored to disc or flash memory in one of several proprietary and
competing formats.
 Digital video data formats, especially the codec used for compressing and
decompressing video (and audio) data, are important.

HDTV :
 What started as the High Definition Television (HDTV) initiative of the Federal
Communications Commission in the 1980s changed first to the Advanced
Television (ATV) initiative and then finished as the Digital Television (DTV)
initiative by the time the FCC announced the change in 1996.
 This standard was adapted from both the Digital Television
Standard (ATSC Doc. A/53) and the Digital Audio Compression Standard (ATSC
Doc. A/52).
 It also provided TV stations with sufficient bandwidth to present four or five
Standard Television signals or one HDTV signal.
 HDTV provides high resolution in a 16:9 aspect ratio. This aspect ratio allows
the viewing of Cinemascope and Panavision movies.
 The broadcast industry promulgated an ultra-high-resolution, 1920 × 1080
interlaced format (1080i) to become the cornerstone of the new generation of
high-end entertainment centers, but the computer industry wanted a 1280 × 720
progressive-scan system (720p) for HDTV.
 While the 1920 × 1080 format provides more pixels than the 1280 × 720
standard, both formats have been included in the HDTV standard by the
Advanced Television Systems Committee (ATSC).
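The pixel difference between the two ATSC formats is easy to quantify:

```python
# Pixel counts for the two HDTV formats adopted by the ATSC.
pixels_1080i = 1920 * 1080  # interlaced format
pixels_720p = 1280 * 720    # progressive-scan format

ratio = pixels_1080i / pixels_720p
print(pixels_1080i, pixels_720p, ratio)  # 1080i carries 2.25x the pixels
```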

SHOOTING AND EDITING VIDEO :


 Before you head out to the field with your camcorder in hand, it is important to
understand at least the basics of video recording and editing, as well as the constraints
of using video in a multimedia project.
 Setting up a production environment for making digital video requires hardware that
meets minimum specifications for processing speed, data transfer, and storage.
 There are many considerations to keep in mind when setting up your production
environment, depending on the capabilities of your camcorder:
■ Fast processor(s)
■ Plenty of RAM
■ Computer with FireWire (IEEE 1394 or i.Link) or USB connection
and cables
■ Fast and big hard disk(s)
■ A second display to allow for more real estate for your editing
software
■ External speakers
■ Nonlinear editing (NLE) software
 You can do a great deal of satisfactory work with consumer-grade video cameras and
recording equipment if you understand the limitations of the technology.

INTEGRATING COMPUTER AND TELEVISION :


 Colored phosphors on a cathode ray tube (CRT) screen glow red, green, or blue when
they are energized by an electron beam. Because the intensity of the beam varies as it
moves across the screen, some colors glow brighter than others.
 If a computer displays a still image or words onto a CRT for a long time without
changing, the phosphors will permanently change, and the image or words can
become visible, even when the CRT is powered down. Screen savers were invented to
prevent this from happening.
 Flat screen displays are all-digital, using either liquid crystal display (LCD) or plasma
technologies.
 Full integration of digital video in cameras and on computers eliminates the analog
television form of video, from both the multimedia production and the delivery
platform.
 If your video camera generates a digital output signal, you can record your video
direct-to-disk, where it is ready for editing.
 If a video clip is stored as data on a hard disk, CD-ROM, DVD, or other mass-storage
device, that clip can be played back on a computer’s monitor without special
hardware.
Interlacing and Progressive Scan :
 The process of building a single frame from two fields is called interlacing, a
technique that helps to prevent flicker on CRT screens. Computer monitors use a
different progressive-scan technology, and draw the lines of an entire frame in a
single pass, without interlacing them and without flicker.
 In television, the electron beam actually makes two passes on the screen as it draws a
single video frame, first laying down all the odd-numbered lines, then all the even-
numbered lines, as they are interlaced.
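The two-pass weave can be sketched in a few lines. The strings below are placeholders standing in for scan-line data, not a real video API:

```python
# A minimal sketch of interlacing: a full frame is rebuilt by weaving
# the odd-numbered scan lines from one field with the even-numbered
# lines from the other. Strings stand in for actual scan-line data.
def weave(odd_field, even_field):
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)   # lines 1, 3, 5, ...
        frame.append(even_line)  # lines 2, 4, 6, ...
    return frame

print(weave(["line1", "line3"], ["line2", "line4"]))
```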
 Most computers today provide video outputs to CRT, LCD, or plasma monitors at
greater than 1024 × 768 resolution. The VGA’s once ubiquitous 640 × 480 screen
resolution is again becoming common for handheld and mobile device displays.
Overscan and the Safe Title Area :
 It is common practice in the television industry to broadcast an image larger than will
fit on a standard TV screen so that the “edge” of the image seen by a viewer is always
bounded by the TV’s physical frame, or bezel. This is called overscan.
 In contrast, computer monitors display a smaller image on the monitor’s picture tube
(underscan), leaving a black border inside the bezel.
 Video editing software often will show you the safe areas while you are editing.
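As an illustration, a safe area can be computed as a centered rectangle. The 80% figure used below for the safe title area is a common broadcast convention, not a fixed rule; the exact percentage varies by editing tool:

```python
# Compute a centered "safe" rectangle inside a frame. The 0.80 fraction
# for the safe title area is an assumed, conventional value.
def safe_rect(width, height, fraction):
    """Return (x, y, w, h) of a centered rectangle covering `fraction`
    of each dimension of the frame."""
    w, h = int(width * fraction), int(height * fraction)
    x, y = (width - w) // 2, (height - h) // 2
    return x, y, w, h

print(safe_rect(720, 480, 0.80))  # safe title area on a 720x480 frame
```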

RECORDING FORMATS :
 Be prepared to produce more than one version of your video (codecs in a container) to
ensure that the video will play on all the devices and in all the browsers necessary for
your project’s distribution.
 DVD video uses MPEG-2 compression. Blu-ray video uses MPEG-4 AVC
compression. These are known standards and few choices are necessary: simply click
“Save for DVD” or “Save for Blu-ray.”
 But if you need to prepare a video file that will run on an iPod, a Droid, and an Atom-
based netbook, as well as in all web browsers, you will need to convert your material
into multiple formats.
 There are many free, shareware, and inexpensive file format converters available for
multiple platforms.
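One practical approach is to script a converter such as ffmpeg once per target format. The sketch below only assembles the command lines rather than running them; the `-i` and `-c:v` options are standard ffmpeg flags, but the codec and container choices here are assumptions to adapt to your actual target devices:

```python
# Build one ffmpeg command line per target format (commands are
# assembled, not executed). Codec/container pairs are illustrative.
targets = {
    "web_mp4":  ("libx264", "mp4"),   # H.264 in MP4 for broad playback
    "web_webm": ("libvpx", "webm"),   # VP8 in WebM for open-web browsers
}

def build_commands(source, targets):
    cmds = []
    for name, (codec, ext) in targets.items():
        cmds.append(["ffmpeg", "-i", source, "-c:v", codec, f"{name}.{ext}"])
    return cmds

for cmd in build_commands("master.mov", targets):
    print(" ".join(cmd))
```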
