Unit-5: Audio/Video Streaming


AUDIO/VIDEO STREAMING

INTRODUCTION
We can divide audio and video services into three broad categories: streaming stored
audio/video, streaming live audio/video, and interactive audio/video. Streaming means a user can
listen to (or watch) the file after the downloading has started.

In the first category, streaming stored audio/video, the files are compressed and stored on a
server. A client downloads the files through the Internet. This is sometimes referred to as on-demand
audio/video. In the second category, streaming live audio/video refers to the broadcasting of radio
and TV programs through the Internet. In the third category, interactive audio/video refers to the use
of the Internet for interactive audio/video applications. Good examples of this category are Internet
telephony and Internet teleconferencing.

1. STREAMING STORED AUDIO/VIDEO


Downloading these types of files from a server can be different from downloading other types of
files.
1.1. First Approach: Using a Web Server
A compressed audio/video file can be downloaded as a text file. The client (browser) can use the
services of HTTP and send a GET message to download the file. The Web server can send the
compressed file to the browser. The browser can then use a helper application, normally called a media
player, to play the file. The file must be downloaded completely before it can be played.
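This first approach can be sketched with Python's standard library. The tiny in-process Web server, the file name song.dat, and the fake "compressed" contents are hypothetical stand-ins for a real server and media file; the point is that the client must read the entire body before any playback could begin.

```python
import http.client
import http.server
import threading

# A stand-in Web server holding one "compressed" media file (hypothetical).
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"FAKE-COMPRESSED-AUDIO" * 10
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The browser issues a GET and must read the ENTIRE body before the
# media player can open it -- no playback happens during the transfer.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/song.dat")
data = conn.getresponse().read()
server.shutdown()
```

Only after `data` is complete could a media player be handed the file, which is exactly the weakness the later approaches address.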

1.2. Second Approach: Using a Web Server with Metafile


In another approach, the media player is directly connected to the Web server for downloading the
audio/video file. The Web server stores two files: the actual audio/video file and a metafile that holds
information about the audio/video file.
1. The HTTP client accesses the Web server using the GET message.
2. The information about the metafile comes in the response.
3. The metafile is passed to the media player.
4. The media player uses the URL in the metafile to access the audio/video file.
5. The Web server responds.
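The metafile step (steps 3 and 4 above) can be illustrated with a short sketch. The one-URL-per-file, `#`-comment metafile format and the example URL are assumptions for illustration, loosely modeled on the old .ram metafiles.

```python
def parse_metafile(text):
    # Hypothetical .ram-style metafile: the first non-comment,
    # non-blank line is the URL of the actual audio/video file.
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            return line
    return None

metafile = "# returned by the Web server in step 2\nrtsp://media.example.com/song.rm\n"
url = parse_metafile(metafile)  # step 4: the player extracts the URL
```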
1.3. Third Approach: Using a Media Server
The problem with the second approach is that the browser and the media player both use the
services of HTTP. HTTP is designed to run over TCP. This is appropriate for retrieving the metafile, but
not for retrieving the audio/video file. The reason is that TCP retransmits a lost or damaged segment,
which is counter to the philosophy of streaming. We need to dismiss TCP and its error control; we
need to use UDP. However, HTTP, which accesses the Web server, and the Web server itself are
designed for TCP; we need another server, a media server.

1. The HTTP client accesses the Web server using a GET message.
2. The information about the metafile comes in the response.
3. The metafile is passed to the media player.
4. The media player uses the URL in the metafile to access the media server to download the file.
Downloading can take place by any protocol that uses UDP.
5. The media server responds.
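The UDP download can be sketched over the loopback interface. The request text, file name, and chunk contents are made up; a real media server would run a streaming protocol on top of UDP rather than raw datagrams.

```python
import socket

# Stand-in media server socket on the loopback interface.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
server.settimeout(5)

# The media player asks for the file named in the metafile (name invented).
player = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
player.settimeout(5)
player.sendto(b"GET song.rm", server.getsockname())

request, player_addr = server.recvfrom(1024)
# The server streams chunks; a lost chunk is NOT retransmitted,
# which is the whole reason UDP is preferred over TCP here.
for i in range(3):
    server.sendto(b"chunk-%d" % i, player_addr)

chunks = [player.recvfrom(1024)[0] for _ in range(3)]
server.close()
player.close()
```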

1.4. Fourth Approach: Using a Media Server and RTSP


The Real-Time Streaming Protocol (RTSP) is a control protocol designed to add more
functionalities to the streaming process. Using RTSP, we can control the playing of audio/video. Figure
5 shows a media server and RTSP.
1. The HTTP client accesses the Web server using a GET message.
2. The information about the metafile comes in the response.
3. The metafile is passed to the media player.
4. The media player sends a SETUP message to create a connection with the media server.
5. The media server responds.
6. The media player sends a PLAY message to start playing (downloading).
7. The audio/video file is downloaded using another protocol that runs over UDP.
8. The connection is broken using the TEARDOWN message.
9. The media server responds.
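The SETUP/PLAY/TEARDOWN exchange can be sketched by building simplified request messages in the RFC 2326 style. The URL, client ports, and header set are illustrative; this is not a complete RTSP client.

```python
def rtsp_request(method, url, cseq, extra_headers=()):
    # An RTSP request looks like an HTTP request: a request line,
    # headers (CSeq numbers each request), and a blank line.
    lines = ["%s %s RTSP/1.0" % (method, url), "CSeq: %d" % cseq]
    lines.extend(extra_headers)
    return "\r\n".join(lines) + "\r\n\r\n"

url = "rtsp://media.example.com/song.rm"  # taken from the metafile
setup = rtsp_request("SETUP", url, 1,
                     ["Transport: RTP/AVP;unicast;client_port=8000-8001"])
play = rtsp_request("PLAY", url, 2)          # start playing (downloading)
teardown = rtsp_request("TEARDOWN", url, 3)  # break the connection
```

Note that RTSP only controls the session; the audio/video data itself still travels over a separate UDP-based protocol, as in step 7 above.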

2. STREAMING LIVE AUDIO/VIDEO


Streaming live audio/video is similar to the broadcasting of audio and video by radio and TV
stations. Instead of broadcasting to the air, the stations broadcast through the Internet. There are
several similarities between streaming stored audio/video and streaming live audio/video. They are
both sensitive to delay; neither can accept retransmission. However, there is a difference. In the first
application, the communication is unicast and on-demand. In the second, the communication is
multicast and live. Live streaming is better suited to the multicast services of IP and the use of
protocols such as UDP and RTP.
Examples: Internet Radio, Internet Television (ITV), Internet protocol television (IPTV)

3. REAL-TIME INTERACTIVE AUDIO/VIDEO


In real-time interactive audio/video, people communicate with one another in real time. The
Internet phone or voice over IP is an example of this type of application. Video conferencing is another
example that allows people to communicate visually and orally.
Before discussing the protocols used in this class of applications, we discuss some characteristics
of real-time audio/video communication.
 Time Relationship
Real-time data on a packet-switched network require the preservation of the time relationship
between packets of a session.

But what happens if the packets arrive with different delays? For example, the first packet arrives
at 00:00:01 (1-s delay), the second arrives at 00:00:15 (5-s delay), and the third arrives at 00:00:27 (7-s
delay). If the receiver starts playing the first packet at 00:00:01, it will finish at 00:00:11. However, the
next packet has not yet arrived; it arrives 4 s later. There is a gap between the first and second packets
and between the second and the third as the video is viewed at the remote site. This phenomenon is
called jitter. Jitter is introduced in real-time data by the delay between packets.
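The gap arithmetic in this example can be checked with a small sketch that plays each packet as soon as it has both arrived and the previous packet has finished playing (each packet is assumed to hold 10 s of media, matching the example's times).

```python
def playback_gaps(arrivals, duration):
    # Play each packet once it has arrived AND the previous one has
    # finished; any waiting time shows up as a gap (silence/frozen video).
    gaps, ready_at = [], None
    for arrival in arrivals:
        if ready_at is not None:
            gaps.append(max(0, arrival - ready_at))
        start = arrival if ready_at is None else max(ready_at, arrival)
        ready_at = start + duration
    return gaps

# Arrival times (seconds) from the example: 00:00:01, 00:00:15, 00:00:27.
gaps = playback_gaps([1, 15, 27], 10)  # a 4-s gap, then a 2-s gap
```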

 Timestamp
One solution to jitter is the use of a timestamp. If each packet has a timestamp that shows the
time it was produced relative to the first (or previous) packet, then the receiver can add this time
to the time at which it starts the playback. Imagine the first packet in the previous example has a
timestamp of 0, the second has a timestamp of 10, and the third a timestamp of 20. If the receiver starts
playing back the first packet at 00:00:08, the second will be played at 00:00:18, and the third at
00:00:28. There are no gaps between the packets.
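The schedule can be computed directly from the timestamps (all values taken from the example above):

```python
playback_start = 8        # 00:00:08, when the first packet is played
timestamps = [0, 10, 20]  # relative to the first packet
arrivals = [1, 15, 27]    # arrival times from the jitter example

# Receiver adds each timestamp to the time it started playback.
schedule = [playback_start + ts for ts in timestamps]
# Every packet arrives before its scheduled play time, so no gaps occur.
no_gaps = all(a <= s for a, s in zip(arrivals, schedule))
```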
 Playback Buffer
To be able to separate the arrival time from the playback time, we need a buffer to store the data
until they are played back. The buffer is referred to as a playback buffer. In the previous example, the
first bit of the first packet arrives at 00:00:01; the threshold is 7 s, and the playback time is 00:00:08.
The threshold is measured in time units of data. The replay does not start until the time units of data are
equal to the threshold value.
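Under the simplifying assumption that each packet becomes available in the buffer at its arrival time, the earliest playback start that never empties the buffer can be computed from the example's numbers:

```python
def earliest_safe_start(arrivals, packet_duration):
    # Packet i is needed at start + i * packet_duration, so the start
    # must be at least arrival_i - i * packet_duration for every packet.
    return max(a - i * packet_duration for i, a in enumerate(arrivals))

start = earliest_safe_start([1, 15, 27], 10)
# The example's choice -- a 7-s threshold after the first bit at
# 00:00:01, i.e. playback at 00:00:08 -- comfortably meets this bound.
```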

 Ordering
We need a sequence number for each packet. The timestamp alone cannot inform the receiver if a
packet is lost.
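Loss detection from sequence numbers can be sketched as follows (the received sequence is invented for illustration):

```python
def missing_packets(received):
    # Sequence numbers that never arrived. A timestamp alone could not
    # distinguish "packet 3 was lost" from "packet 3 is still in transit".
    expected = set(range(min(received), max(received) + 1))
    return sorted(expected - set(received))

lost = missing_packets([1, 2, 4, 5])  # packet 3 never arrived
```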

 Multicasting
Multimedia plays a primary role in audio and video conferencing. The traffic can be heavy, and the
data are distributed using multicasting methods. Conferencing requires two-way communication
between receivers and senders.

 Mixing
If there is more than one source that can send data at the same time (as in a video or audio
conference), the traffic is made of multiple streams. Mixing means combining several streams of
traffic into one stream.
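Mixing can be sketched as summing time-aligned samples from each stream and clipping the result to a 16-bit sample range; the sample values are invented for illustration.

```python
def mix(streams):
    # Sum time-aligned 16-bit PCM samples from each stream into one
    # stream, clipping the total to the valid 16-bit range.
    mixed = []
    for samples in zip(*streams):
        total = sum(samples)
        mixed.append(max(-32768, min(32767, total)))
    return mixed

combined = mix([[100, 200, 30000], [50, -300, 10000]])
```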
