
AVL Trees
(Algorithms and Data Structures)
AVL Trees
• a balanced data structure invented back in 1962 by Adelson-Velsky and Landis (AVL)
• this data structure has a guaranteed O(logN) running time
• the running time of a binary search tree depends on the height h of the tree
• in an AVL tree the heights of the two child subtrees of any node differ by at most one
• AVL trees are faster for searches than red-black trees because they are more rigidly balanced, but they need more work to maintain
• operating systems rely heavily on these data structures
AVL Trees

AVL TREES                                      RED-BLACK TREES
AVL trees are rigidly balanced, which is why   red-black trees are faster to construct because
the O(logN) running time is guaranteed         they are not as rigidly balanced as AVL trees
(as fast as a binary search tree can be)       (but searches are not as fast as in AVL trees)
AVL Trees
• all the operations are the same as we have seen with binary search trees (insertion and removal)
• after every insertion and removal operation we have to check whether the tree has become imbalanced
• if the tree is imbalanced then we have to make rotations
AVL Trees
                       AVERAGE-CASE   WORST-CASE
space complexity       O(N)           O(N)
insertion              O(logN)        O(logN)
deletion (removal)     O(logN)        O(logN)
search                 O(logN)        O(logN)
AVL Trees
height = max( left child's height , right child's height ) + 1

the height of a node is the longest path from the actual node to a leaf node

[figure: tree with root 12 (height 2), children 4 and 20 (height 1 each); 4 has children 1 and 5, 20 has right child 23 – the leaf nodes 1, 5 and 23 have height 0]

the height of a NULL node is -1 to be consistent
(this is why leaf nodes have height 0)
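The height rule above can be sketched in a few lines of Python; the Node class and field names (left, right, height) are illustrative, not the lecturer's code:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None
        self.height = 0  # a new leaf has height 0

def height(node):
    # the height of a NULL node is -1 to keep the formula consistent
    return node.height if node else -1

def update_height(node):
    # height = max(left child's height, right child's height) + 1
    node.height = max(height(node.left), height(node.right)) + 1

# the tree from the figure: 12 is the root, 4 and 20 are its children
root = Node(12)
root.left = Node(4)
root.right = Node(20)
root.left.left = Node(1)
root.left.right = Node(5)
root.right.right = Node(23)

# update heights bottom-up: the leaves stay 0, then 4 and 20, then 12
update_height(root.left)   # height of node 4 becomes 1
update_height(root.right)  # height of node 20 becomes 1
update_height(root)        # height of node 12 becomes 2
print(root.height)  # 2
```

Updating bottom-up matters: a node's height is only correct once both children already carry correct heights.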
Running time complexity
AVL Trees

[figure: on the left, an imbalanced tree – 12 (height 2) with only a left child 4 (height 1), which has children 1 and 5 (height 0); on the right, a balanced tree – 12 (height 2) with children 4 and 20 (height 1), where 4 has children 1 and 5 and 20 has right child 23 (height 0)]

IMBALANCED TREE: the running time of operations can degrade to O(N) linear
BALANCED TREE: the running time of operations is always O(logN)
AVL Trees
• AVL trees are exactly the same as binary search trees
• the only difference is that we track the height parameter h of every node in the tree

  | hleft – hright | > 1    the height parameters of a node's subtrees cannot differ by more than 1
                            (otherwise the tree is imbalanced)

• we have to update the binary search tree and make rotations if it gets imbalanced
• this is why we have the height parameters – we just have to check the differences in the height parameters after every operation
AVL Trees
• this is a left-heavy case: the left subtree contains more nodes

[figure: root 12 (height 2) with left child 4 (height 1) and NULL right child (height -1); 4 has children 1 and 5 (height 0)]

• in this case the balance factor of the root node is 1 – (–1) = 2, a positive value
• the difference in the height parameters is usually called the balance factor

  hleft – hright
  positive balance factors mean left-heavy cases
AVL Trees
• this is a right-heavy case: the right subtree contains more nodes

[figure: root 12 with NULL left child (height -1) and right child 20 (height 1); 20 has right child 23 (height 0)]

• in this case the balance factor of the root node is (–1) – 1 = –2, a negative value
• the difference in the height parameters is usually called the balance factor

  hleft – hright
  negative balance factors mean right-heavy cases
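The balance factor check can be sketched as follows; the node layout (left, right, height attributes) and the SimpleNamespace demo nodes are assumptions for illustration:

```python
from types import SimpleNamespace as N

def height(node):
    # the height of a NULL node is -1 to be consistent
    return node.height if node else -1

def balance_factor(node):
    # hleft - hright: positive -> left-heavy, negative -> right-heavy
    return height(node.left) - height(node.right)

def is_imbalanced(node):
    # the tree is imbalanced when the subtree heights differ by more than 1
    return abs(balance_factor(node)) > 1

# a left-heavy chain like the slide: a root whose right child is NULL
leaf = N(left=None, right=None, height=0)
mid = N(left=leaf, right=None, height=1)
root = N(left=mid, right=None, height=2)

print(balance_factor(root))  # 1 - (-1) = 2 -> left-heavy
print(is_imbalanced(root))   # True
```

Because a NULL child counts as height -1, a node with a single child of height 1 already has balance factor 2 and triggers a rotation.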
AVL Trees
(Algorithms and Data Structures)
AVL Trees Rotations
• we have to track the height parameters of all the nodes in the binary search tree
• we can calculate the balance factors for the nodes
• we have to make rotations if necessary to rebalance the tree

1.) LEFT ROTATION:
A negative balance factor means a right-heavy situation, so we have to make a left rotation to rebalance the tree

2.) RIGHT ROTATION:
A positive balance factor means a left-heavy situation, so we have to make a right rotation to rebalance the tree
AVL Trees Rotations

        D        right rotation -->        B
       / \                                / \
      B   E                              A   D
     / \                                    / \
    A   C        <-- left rotation         C   E
AVL Trees Rotations
• this is when we apply a right rotation on the root node (with value 5)
• we have to update the references for several nodes in O(1) running time
• the binary search tree properties remain the same (parent-child relationships)
• the in-order traversal is the same after the rotation (extremely crucial)
AVL Trees Rotations
• this is when we apply a left rotation on the root node (with value 3)
• we have to update the references for several nodes in O(1) running time
• the binary search tree properties remain the same (parent-child relationships)
• the in-order traversal is the same after the rotation (extremely crucial)
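The two rotations in the diagram above can be sketched like this; the node helper and SimpleNamespace layout are illustrative assumptions, not the lecturer's code:

```python
from types import SimpleNamespace as N

def height(n):
    return n.height if n else -1  # NULL node has height -1

def update_height(n):
    n.height = max(height(n.left), height(n.right)) + 1

def rotate_right(node):
    # the left child becomes the new subtree root; its right subtree
    # (C in the diagram) is re-attached as the old root's left child
    new_root = node.left
    node.left = new_root.right
    new_root.right = node
    update_height(node)      # update the lower node first
    update_height(new_root)
    return new_root

def rotate_left(node):
    # mirror image of rotate_right
    new_root = node.right
    node.right = new_root.left
    new_root.left = node
    update_height(node)
    update_height(new_root)
    return new_root

def node(data, left=None, right=None):
    n = N(data=data, left=left, right=right, height=0)
    update_height(n)
    return n

# the tree from the diagram: D with children B (A, C) and E
tree = node('D', node('B', node('A'), node('C')), node('E'))
tree = rotate_right(tree)
print(tree.data)                              # B
print(tree.right.data, tree.right.left.data)  # D C
```

Only a handful of references change, which is why a rotation is O(1); the in-order traversal (A B C D E) is the same before and after.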
AVL Trees Rotations
ROTATION CASE 1 (left-left)

[figure: a chain leaning left – A with left child B, B with left child C; the balance factor of A is 2 (positive balance factors mean left-heavy cases); a single right rotation on A makes B the new root with children C and A]
AVL Trees Rotations
ROTATION CASE 2 (right-right)

[figure: a chain leaning right – B with right child D, D with right child E; the balance factor of B is -2 (negative balance factors mean right-heavy cases); a single left rotation on B makes D the new root with children B and E]
AVL Trees Rotations
ROTATION CASE 3 (left-right)

[figure: D with left child B, B with right child C – the balance factor of D is 2 but the left subtree leans right; first a left rotation on B turns this into the left-left case, then a right rotation on D makes C the new root with children B and D]
AVL Trees Rotations
ROTATION CASE 4 (right-left)

[figure: B with right child F, F with left child E – the balance factor of B is -2 but the right subtree leans left; first a right rotation on F turns this into the right-right case, then a left rotation on B makes E the new root with children B and F]
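The four cases can be dispatched from the balance factors alone; this is a self-contained sketch (helper names and the SimpleNamespace nodes are illustrative, not the lecturer's code):

```python
from types import SimpleNamespace as N

def height(n):
    return n.height if n else -1

def update_height(n):
    n.height = max(height(n.left), height(n.right)) + 1

def balance_factor(n):
    return height(n.left) - height(n.right)

def rotate_right(n):
    r = n.left
    n.left, r.right = r.right, n
    update_height(n); update_height(r)
    return r

def rotate_left(n):
    r = n.right
    n.right, r.left = r.left, n
    update_height(n); update_height(r)
    return r

def settle_violation(n):
    # pick one of the four rotation cases from the balance factors
    bf = balance_factor(n)
    if bf > 1:                            # left-heavy
        if balance_factor(n.left) < 0:    # case 3: left-right
            n.left = rotate_left(n.left)
        return rotate_right(n)            # case 1: left-left
    if bf < -1:                           # right-heavy
        if balance_factor(n.right) > 0:   # case 4: right-left
            n.right = rotate_right(n.right)
        return rotate_left(n)             # case 2: right-right
    return n                              # already balanced

def node(data, left=None, right=None):
    n = N(data=data, left=left, right=right, height=0)
    update_height(n)
    return n

# case 1: a chain leaning left (A -> B -> C)
case1 = settle_violation(node('A', node('B', node('C'))))
print(case1.data)  # B

# case 4: B with right child F, F with left child E
case4 = settle_violation(node('B', None, node('F', node('E'))))
print(case4.data)  # E
```

The double-rotation cases (3 and 4) are detected by the child's balance factor having the opposite sign to the parent's.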
AVL Trees Rotations
• rotations are extremely fast – we just have to update a few references in O(1) constant running time
• this operation does not change the properties of the tree
• the in-order traversal remains the same, as do the parent-child relationships in the tree
• THERE MAY BE OTHER ISSUES HIGHER UP BECAUSE OF ROTATIONS
• we have to check up to the root node whether to make further rotations – this takes O(logN) running time
AVL Trees
(Algorithms and Data Structures)
AVL Trees

        32
       /  \
     10    55
    /  \   /
   1   19 41
      /
    16
AVL Trees
INSERT(12)

        32
       /  \
     10    55
    /  \   /
   1   19 41
      /
    16

(we traverse from the root as in a binary search tree: 12 < 32 go left, 12 > 10 go right, 12 < 19 go left, 12 < 16 go left – node 16 has no left child, so 12 is inserted there)
AVL Trees

        32
       /  \
     10    55
    /  \   /
   1   19 41
      /
    16
    /
  12
AVL Trees

we update the height parameters bottom-up from the inserted node:
the new node 12 has height 0, so node 16 has height 1 and node 19 has height 2

node 19 has balance factor hleft – hright = 1 – (–1) = 2
(its right child is NULL with height -1)

positive balance factors mean left-heavy cases – so we make a right rotation on node 19
AVL Trees

        32
       /  \
     10    55
    /  \   /
   1   16 41
      /  \
    12    19
AVL Trees

             32 (3)
            /     \
      (2) 10       55 (1)
         /  \      /
    (0) 1    16   41 (0)
            (1)
            /  \
      (0) 12    19 (0)

(heights in parentheses – every node's subtree heights now differ by at most 1, so the tree is balanced again)
AVL Trees
(Algorithms and Data Structures)
Balanced Binary Trees
Applications
1.) GAME ENGINES

Almost all game engines use binary space partitioning (BSP) trees to determine which objects should be rendered in the game world

(this technique was famously used in Doom back in 1993)
Balanced Binary Trees
Applications
2.) COMPILERS

It is crucial in programming languages to be able to parse syntax expressions (loops or statements).

This is why compilers use a special tree structure, the so-called (abstract) syntax tree.
Balanced Binary Trees
Applications
3.) AVL SORT

We can use balanced tree data structures to sort items efficiently in O(NlogN) linearithmic running time

• we have to insert the N items into an AVL tree – O(NlogN)
• then we just have to make a simple in-order traversal in O(N) linear running time
• the final running time is O(NlogN) + O(N) = O(NlogN)
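The steps above can be sketched end to end; this is a minimal AVL sort (class and function names are illustrative, not the lecture's code):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None
        self.height = 0

def height(n):
    return n.height if n else -1  # NULL node has height -1

def update(n):
    n.height = max(height(n.left), height(n.right)) + 1

def rotate_right(n):
    r = n.left
    n.left, r.right = r.right, n
    update(n); update(r)
    return r

def rotate_left(n):
    r = n.right
    n.right, r.left = r.left, n
    update(n); update(r)
    return r

def insert(n, data):
    if n is None:
        return Node(data)
    if data < n.data:
        n.left = insert(n.left, data)
    else:
        n.right = insert(n.right, data)
    update(n)
    bf = height(n.left) - height(n.right)
    if bf > 1:                                    # left-heavy
        if height(n.left.left) < height(n.left.right):
            n.left = rotate_left(n.left)          # left-right case
        return rotate_right(n)                    # left-left case
    if bf < -1:                                   # right-heavy
        if height(n.right.right) < height(n.right.left):
            n.right = rotate_right(n.right)       # right-left case
        return rotate_left(n)                     # right-right case
    return n

def avl_sort(items):
    root = None
    for x in items:           # N insertions, O(logN) each
        root = insert(root, x)
    out = []
    def in_order(n):          # O(N) traversal yields sorted order
        if n:
            in_order(n.left); out.append(n.data); in_order(n.right)
    in_order(root)
    return out

print(avl_sort([32, 10, 55, 1, 19, 41, 16, 12]))
# [1, 10, 12, 16, 19, 32, 41, 55]
```

Because every insertion rebalances on the way back up, the tree stays O(logN) tall and the total work is O(NlogN) regardless of the input order.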
Balanced Binary Trees
Applications
4.) DATABASES

We can construct databases with an AVL tree – usually when we have to make a lot of look-ups (search operations)

• if deletions and insertions are frequent then an AVL tree is not the best option because of the large number of rotations
• trees support quite a lot of operations (a huge advantage)
• red-black trees and B-trees are more popular in practice
