SYNCHRONIZATION

TEMPORAL RELATIONSHIPS

SYNCHRONIZATION CONCEPTS: The temporal, or time, relationship is important in an MM presentation. A speaker's lip movements and words MUST match in a video clip, i.e. they must be SYNCHRONIZED; otherwise there is a loss of sync.

Classification of temporal relationships:
1. Object-stream interaction: how object streams interact with each other.
(i) Intra-media relationship: the interaction between objects of a single stream. Ex: animation without sound. If an animation clip is stored at 14 frames/sec, then the clip must be played back at 14 f.p.s. This is an intra-media temporal relationship constraint, also called serial synchronization.
(ii) Inter-media relationship: between objects of two or more streams. Ex: a video clip with sound; also called parallel synchronization.
User-interaction-based temporal relationship: in an interactive MM system, this is the interaction between the user and the system. The time taken by the system to respond to the user is called the response time, which should be consistent and as small as possible for either a standalone or a networked interactive MM system. This is also called event synchronization.

Transportation of a Multimedia Stream on a Networked System

[Fig: Transportation of an MM stream on a networked MM system. Node N1 sends object streams A (Am, Am+1, Am+2, ...) and B (Bn, Bn+1, Bn+2, ...) over communication link L(N1, N2) to node N2; each node has buffers, a processing (transformation) stage, storage, and live input/output.]
Node N1 transmits MM information through the communication link L(N1, N2) to node N2. Each node has buffers for temporary storage of data, i.e. MM objects are retrieved from storage or received from a live source (say, a video camera). The MM information is formatted into packets and transmitted over the communication link in the form of object streams. Two object streams are shown. Stream A has m objects:
Stream (A) := (A1, A2, ..., Am, Am+1, ...) := Ψ(Ap)
Stream B has n objects:
Stream (B) := (B1, B2, ..., Bn, Bn+1, ...) := Ψ(Bq)
where Ap is any object in stream (A), Bq is any object in stream (B), and Ψ(Ap) is the sequence of type-A objects.
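To make this notation concrete, here is a minimal Python sketch (the names MediaObject and stream are illustrative, not from the notes) that models Ψ(Ap) as an ordered sequence of numbered objects:

from dataclasses import dataclass

@dataclass
class MediaObject:
    stream_id: str   # e.g. "A" or "B"
    seq: int         # position p within the stream: A1, A2, ...
    payload: bytes   # the encoded media data

def stream(stream_id, payloads):
    """Yield the ordered object sequence, e.g. Psi(Ap) for stream A."""
    for p, payload in enumerate(payloads, start=1):
        yield MediaObject(stream_id, p, payload)

# Two object streams, as in the figure above
stream_A = stream("A", [b"frame-1", b"frame-2", b"frame-3"])
stream_B = stream("B", [b"sample-1", b"sample-2"])
print(next(stream_A))   # MediaObject(stream_id='A', seq=1, payload=b'frame-1')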

2. Based on Media Liveness

i) Live presentation: demands minimum end-to-end delay. An MM presentation requires composition of MM streams, which has two aspects: spatial (placement of MM objects in space at any point of time) and temporal (synchronization between object streams, which is of two types: point and stream synchronization).
▪ Point synchronization is used when a single object in one stream must be synchronized with an object in another stream at one point in time.
▪ Stream synchronization is done continuously over a period of time; it is also called continuous synchronization.
ii) Stored presentation: the streams are retrieved from secondary storage. Ex: VOD (Video-on-Demand).
iii) Mixed: a combination of live and stored presentation, like a collaborative conference.

3. Media Synchronization
i) Asynchronous: no well-defined timing relationships exist between objects of a single stream or between objects of different streams.
ii) Synchronous: well-defined temporal relationships between objects of different streams. Ex: video with sound.
iii) Isochronous: well-defined time relations between objects of the same stream. Ex: sound.

Object Stream Interactions:

The first important aspect of temporal relationships is based on how MM object streams interact with one another. This leads to three types of relationships: intramedia, intermedia and user-interaction-based temporal relationships (already discussed).
Media Synchronization:
Any MM presentation requires composition of object streams. This has two aspects: temporal (relates to synchronization between object streams) and spatial (relates to placement of MM objects in space at any point of time) composition. Two types of synchronization may be required: point and stream.
Point synchronization is used when a single object in one stream must be synchronized with an object in another stream at one point in time.
Stream synchronization is also called continuous synchronization, as it is carried out continuously over a period of time. A sketch of both checks follows.
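A minimal Python sketch of the two checks, assuming presentation timestamps in seconds; the tolerance values are illustrative assumptions (on the order of common lip-sync budgets), not figures from the notes:

def point_sync_ok(t_a, t_b, tolerance=0.01):
    """Point sync: one object in each stream must coincide
    at a single instant, within a tolerance (seconds)."""
    return abs(t_a - t_b) <= tolerance

def stream_sync_ok(times_a, times_b, tolerance=0.08):
    """Stream (continuous) sync: corresponding objects of two
    streams must stay aligned over the whole presentation."""
    return all(abs(a - b) <= tolerance for a, b in zip(times_a, times_b))

# e.g. lip sync: video and audio presentation timestamps in seconds
video_t = [0.00, 0.04, 0.08, 0.12]
audio_t = [0.00, 0.04, 0.09, 0.12]
print(stream_sync_ok(video_t, audio_t))   # True: stays within 80 ms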

Asynchronous Media:
There is NO well-defined TIMING RELATIONSHIP between objects of one or more object stream(s). The asynchrony may be intrastream or interstream. An example of the former is a user typing text on a keyboard.

Q: What is point synchronization? What is stream or continuous synchronization? (3)
Q: What is intramedia (serial) synchronization and intermedia (parallel) synchronization? Give examples.
Q: What is synchronous and isochronous media synchronization?
Lip synchronization is an example of intermedia or parallel synchronization.
An animation clip without sound is an example of serial or intramedia synchronization.
SPECIFICATIONS OF TEMPORAL RELATIONSHIPS

The temporal relationships between the various objects in an MM presentation MUST BE CLEARLY SPECIFIED, using either relative or absolute temporal specifications.
Relative time specifications: Many MM presentations do NOT need EXACT specifications. For example, in an educational courseware titled "Introduction to MM", a welcome screen appears and then an introductory voice message plays. Relative specifications are therefore more important than absolute specifications here. The voice message must play isochronously regardless of computer speed.
There are seven fundamental Relative Temporal Relationships (RTRs) between two objects, and six inverse relationships, totaling 13 RTRs.
Q: Draw the 7 fundamental RTRs (relative temporal specifications) and derive their inverses. (5)
Q: How many RTRs are there? 13 (7 fundamental RTRs + 6 inverse RTRs)

Q: What does the inverse transform signify in real-time play-out? Rewind.
Q: What does the SCALING transform signify in real-time play-out? Fast forward.
Q: What does the SHIFTING transform signify in real-time play-out? Slow motion.

Q: What are the absolute time specifications? Explain the three types. (3)
[Fig: The seven fundamental RTRs between objects A and B, with their inverses]
1. A before B (inverse: B after A)
2. A meets B (inverse: B is met by A)
3. A overlaps B (inverse: B is overlapped by A)
4. A starts B (inverse: B is started by A)
5. A during B (inverse: B encloses A)
6. A ends B (inverse: B is ended by A)
7. A equals B (B equals A; equals is its own inverse, hence only six distinct inverses)
The inverse relations are:
(A before B)^(-1) = B before A = A before^(-1) B = A after B
(A meets B)^(-1) = B meets A = A meets^(-1) B = A is met by B
and so on.
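These RTRs correspond to Allen's interval relations. A small Python classifier, assuming each object's play-out is modelled as an interval on a common timeline (the function name and interval encoding are illustrative):

def relate(a_start, a_end, b_start, b_end):
    """Classify the RTR of interval A against interval B: the seven
    fundamental relations; anything else is an inverse, obtained by
    swapping A and B."""
    if a_end < b_start:                         return "A before B"
    if a_end == b_start:                        return "A meets B"
    if a_start == b_start and a_end == b_end:   return "A equals B"
    if a_start == b_start and a_end < b_end:    return "A starts B"
    if a_start > b_start and a_end == b_end:    return "A ends B"
    if a_start > b_start and a_end < b_end:     return "A during B"
    if a_start < b_start < a_end < b_end:       return "A overlaps B"
    return "inverse relation: classify with A and B swapped"

print(relate(0, 5, 5, 9))    # A meets B
print(relate(2, 4, 0, 9))    # A during B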
The inverse transformation is only one type of transformation; the others are scaling (used for fast forward) and shifting (used for slow motion). Inverting is used for rewind. Temporal transformations are used to map the stored temporal relationships to a real-time play-out, as sketched below.
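A hedged Python sketch of how the three transforms might map stored presentation instants to play-out instants; the exact forms (dividing instants by a factor for scaling, a per-object delay for shifting) are illustrative interpretations, not definitions from the notes:

def invert(times):
    """Inverse transform (rewind): object i plays at t_end - t_i,
    so the last stored object is presented first."""
    t_end = max(times)
    return [t_end - t for t in times]

def scale(times, factor):
    """Scaling: factor > 1 compresses the schedule (fast forward)."""
    return [t / factor for t in times]

def shift(times, delta):
    """Shifting: an extra delay before each successive object,
    stretching the play-out (slow-motion effect); this per-object
    reading of 'shift' is an assumption."""
    return [t + i * delta for i, t in enumerate(times)]

stored = [0.0, 1.0, 2.0, 3.0]     # stored presentation instants
print(scale(stored, 2.0))          # [0.0, 0.5, 1.0, 1.5]: fast forward
print(shift(stored, 0.5))          # [0.0, 1.5, 3.0, 4.5]: slow motion
print(invert(stored))              # [3.0, 2.0, 1.0, 0.0]: rewind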
Absolute Time Specifications
There are three types.
1. Instant-based specification: appropriate when there is a need to state that play-out begins exactly at time t1 and finishes at t3. The instants may be specified either w.r.t. a reference time, i.e. t0 = 0:00, t1 = 10 secs, t2 = 20 secs, etc., or in terms of local time, i.e. t1 = 05:45 hrs.
2. Interval-based specification: gives the interval for play-out. The starting time t1 and the play-out interval Tplay are specified. It is easy to map a fixed interval specification to an instant-based specification with a starting and a finishing instant. It becomes more important when an inexact specification is used.
3. Inexact timing specification: may be used in some applications to specify the play-out period. Ex: it may be appropriate to specify "the music should start soon after the still image appears, and as soon as the music finishes, the image must fade out." A lower and an upper bound on the play-out period are provided.
[Fig: Instant-based specification (play-out from t1 to t3 on the timeline t0..t4), interval-based specification (start t1 with duration Tplay), and inexact specification (Tplay bounded by a lower and an upper bound)]
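A small Python sketch of mapping an interval-based specification to an instant-based one, and of checking an inexact specification against its bounds (function names are illustrative):

def interval_to_instants(t_start, t_play):
    """Map an interval-based spec (start + play-out duration)
    to an instant-based spec (start instant, finish instant)."""
    return t_start, t_start + t_play

def inexact_ok(t_start, t_finish, lower, upper):
    """Inexact spec: accept any play-out period within the bounds."""
    return lower <= (t_finish - t_start) <= upper

start, finish = interval_to_instants(10.0, 10.0)   # t1 = 10 s, Tplay = 10 s
print(start, finish)                                # 10.0 20.0
print(inexact_ok(start, finish, lower=8.0, upper=12.0))   # True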

Spatial Transformation: transformations taking place in space, i.e. on screen and in audio. It includes transitions in lighting such as fading of images or audio, channel mixing, color changes, etc.

SYNCHRONIZATION ACCURACY SPECIFICATIONS (SAS)
The factors used to specify SAS are:
• Delay
• Jitter
• Skew
• Error rate
Management of these SAS factors is key to the delivery of a good-quality MM presentation. The final aim of a networked MM system is to deliver the required quality of service (QoS).

Delay Factors for Stored and Captured MM Objects

Some delay factors occur only when retrieving stored objects, and some occur only for objects captured in real time.

Delay factor                                         | Stored on disk | Captured in real time
Processing delays: transmitting end
 i) Retrieval delays for stored objects
    a) Query processing delay                        | Tquery         | -
    b) Seek delay for locating data on disk          | Tseek          | -
    c) Access delay for retrieving data from disk    | Taccess        | -
 ii) Capturing delays for real-time objects
    a) Sampling delay for real-time data             | -              | Tsample
    b) Encoding delay, e.g. for compression          | -              | Tencode
Networking delays
 i) Packetization delay for the protocol             | Tpacketize     | Tpacketize
 ii) Transmission delay over the link                | Ttransmit      | Ttransmit
 iii) Propagation delay over the link                | Tpropagate     | Tpropagate
Processing delays: receiving end
 i) Buffering delay at the end nodes                 | Tbuffer        | Tbuffer
 ii) Depacketization delay                           | Tdepacketize   | Tdepacketize
 iii) Decoding delay, e.g. for decompression         | Tdecode        | Tdecode
 iv) Presentation delay, e.g. DAC, raster scanning   | Tpresent       | Tpresent
Details:

1. Processing delays, transmitting end: These differ for stored objects and for objects captured in real time. For a stored object, Tquery is the time taken to process and transform a query into the location of the object(s) on the storage medium. Tseek is the time taken to locate the object(s) on the storage medium. Taccess is the time taken to read and transfer the object(s) from the medium. These delays do not apply to objects captured in real time.
2. Capturing delays for real-time objects: These delays apply to audio/video captured in real time. For digital processing, the signals MUST be PASSED through an ADC, which samples them at discrete points in time and quantizes them as digital values. Tsample is the time taken to sample a real-time object and perform the ADC. The raw digital signals are not suitable for transmission; they have to be encoded (e.g. compressed) before transmission, and Tencode is the corresponding delay. Stored objects are usually ready for transmission, having been captured offline and encoded before storage; therefore Tsample and Tencode do not apply to stored objects.

3. Networking delays: These apply to both stored and captured objects. Most communication protocols use packetized data for transmission. Tpacketize is the time taken to break the object data into packets suitable for communication over the network. Tpropagate is the time delay from the instant the object is injected into the network at the transmitting end to the instant at which it comes out of the network at the receiving end. The following figure shows four snapshots of an N-bit packet. In the first, the leading edge of the packet is just touching the network interface. The time taken by the packet to cross the network interface, shown in the second snapshot, is called the transmission delay: Ttransmit = N bits / R bps.

Propagation delay = time difference between the first and third snapshots = time difference between the second and fourth snapshots. Propagation delay depends on the type of switching: circuit, packet or cell switching.
[Fig: Transmission and propagation delays. Four snapshots (t1..t4) of an N-bit packet: crossing the network interface at the transmitter, traversing the communication channel of distance D, and arriving at the receiver.]


Ttransmit = N (bits) / R (bps)
For a circuit-switched connection, the propagation delay is Tpropagate = D (km) / S (km/s), where S is the signal propagation speed over the link.
Ttransmit = t2 - t1
Tpropagate = t3 - t1 = t4 - t2
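A direct transcription of the two formulas into Python; the packet size, link rate, distance and signal speed in the example are illustrative values (2 x 10^5 km/s is a commonly quoted propagation speed for guided media):

def t_transmit(n_bits, rate_bps):
    """Transmission delay: time for all N bits to cross the
    network interface at link rate R (Ttransmit = N / R)."""
    return n_bits / rate_bps

def t_propagate(distance_km, speed_km_per_s):
    """Propagation delay: time for a bit to travel the link
    (Tpropagate = D / S)."""
    return distance_km / speed_km_per_s

# a 12,000-bit packet on a 10 Mbps link spanning 1,000 km
print(t_transmit(12_000, 10e6))   # 0.0012 s = 1.2 ms
print(t_propagate(1_000, 2e5))    # 0.005 s  = 5 ms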

Processing Delays: Receiving End

The received data is buffered; Tbuffer is the delay added due to buffering of objects at this end. The received data must be depacketized and decoded before it can be used for presentation; the corresponding delays are Tdepacketize and Tdecode. Finally, the received data must be converted into objects that can be presented on the workstation; Tpresent is the time taken to convert the data received from the network into a form suitable for presentation.

Latency or Total Delay

LT(stored object) = Tquery + Tseek + Taccess + Tpacketize + Ttransmit + Tpropagate + Tbuffer + Tdepacketize + Tdecode + Tpresent
LT(real-time object) = Tsample + Tencode + Tpacketize + Ttransmit + Tpropagate + Tbuffer + Tdepacketize + Tdecode + Tpresent
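A Python sketch that sums the applicable components for each case; all the delay values below are made-up placeholders for illustration, not measurements from the notes:

def latency(delays):
    """Total end-to-end latency: the sum of the applicable delay
    components from the delay-factor table above (seconds)."""
    return sum(delays.values())

stored = dict(Tquery=0.005, Tseek=0.010, Taccess=0.008,
              Tpacketize=0.001, Ttransmit=0.0012, Tpropagate=0.005,
              Tbuffer=0.004, Tdepacketize=0.001, Tdecode=0.006,
              Tpresent=0.002)
realtime = dict(Tsample=0.002, Tencode=0.015,
                Tpacketize=0.001, Ttransmit=0.0012, Tpropagate=0.005,
                Tbuffer=0.004, Tdepacketize=0.001, Tdecode=0.006,
                Tpresent=0.002)
print(f"LT(stored)    = {latency(stored):.4f} s")
print(f"LT(real-time) = {latency(realtime):.4f} s")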
JITTER

[Fig: Intra-stream jitter. Objects A1..A8 of stream A are displaced instantaneously w.r.t. reference clock ticks C1..C8.]

[Fig: Inter-stream jitter. Objects A1..A8 of stream A and B1..B8 of stream B are displaced against each other w.r.t. reference clock ticks C1..C8.]


Due to variations in the delay factors on networked MM systems, the temporal relationships vary from their desired values. In packet-switched networks, different packets may take different paths through the network, leading to variations in their end-to-end delay. This variation is called JITTER. It is the instantaneous difference between the desired presentation times and the actual presentation times of streamed MM objects. It can be:
1. Intramedia: occurs within a single object stream. The expected presentation times are determined by the reference clock; the actual presentation times of consecutive objects are displaced w.r.t. the reference clock. The deviations in presentation times are instantaneous and NOT cumulative: an instantaneous asynchrony in an isochronous object stream. Intramedia jitter in audio leads to a quivering voice; in video, the picture jumps.
2. Intermedia: occurs between two object streams. A consequence of intermedia jitter between video and audio is loss of lip sync along with a shaky picture.
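A minimal Python sketch: jitter per object is the instantaneous offset of its actual presentation time from the reference-clock instant (all timestamps are illustrative):

def jitter(actual, expected):
    """Per-object jitter: instantaneous difference between actual
    and desired presentation times; the offsets do not accumulate."""
    return [round(a - e, 3) for a, e in zip(actual, expected)]

clock  = [0.00, 0.04, 0.08, 0.12, 0.16]   # reference clock ticks C1..C5
actual = [0.00, 0.05, 0.08, 0.13, 0.16]   # actual presentation times A1..A5
print(jitter(actual, clock))   # [0.0, 0.01, 0.0, 0.01, 0.0]: not cumulative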
SKEW

[Fig: Intra-media skew. Objects A1..A8 of stream A drift cumulatively w.r.t. reference clock ticks C1..C8.]

[Fig: Inter-media skew. Objects A1..A8 of stream A and B1..B8 of stream B drift cumulatively w.r.t. reference clock ticks C1..C8.]
Skew is also caused by variations in the end-to-end delay of object streams. In a skewed presentation, the average rate of object delivery deviates from the desired average rate. Skew is the AVERAGE DIFFERENCE between the desired presentation rate and the actual presentation rate of streamed MM objects. It can be intramedia or intermedia. The actual presentation times of consecutive objects are displaced w.r.t. the system reference clock, and the displacements keep increasing, i.e. they are cumulative. The effect of intramedia skew is either a slow- or fast-moving clip for an animation video, or a higher- or lower-pitched voice for a soundtrack. The effect of intermedia skew may be a complete loss of sync for an audio-video clip.
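A Python sketch measuring skew as the difference between the desired and actual average presentation rates, following the rate-based definition above (timestamps are illustrative):

def presentation_rate(times):
    """Average object presentation rate, in objects per second."""
    return (len(times) - 1) / (times[-1] - times[0])

def skew(actual, desired):
    """Skew: average difference between the desired and actual
    presentation rates; positive means playback runs slow."""
    return presentation_rate(desired) - presentation_rate(actual)

clock  = [0.00, 0.04, 0.08, 0.12, 0.16]   # desired: 25 objects/s
actual = [0.00, 0.05, 0.10, 0.15, 0.20]   # actual: 20 objects/s, drifting
print(skew(actual, clock))                # 5.0: a cumulative slow-down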

Q: Give the difference between jitter and skew. (4)
Q: Give an example of intramedia jitter. Ans: a quivering voice or a jumpy picture.
Q: Give an example of intermedia jitter. Ans: an audio-video clip will have a shaky picture and loss of lip sync.
Q: Give an example of intramedia skew. Ans: slow or fast animation speed; high- or low-pitched sound.
Q: Give an example of intermedia skew. Ans: A/V with complete loss of lip sync.
Q: What are the delay factors for objects (i) stored on disk and (ii) captured in real time? What is the total delay or latency time for each? Give the answer in table form as in the notes; it will be easier. (5)
