2 Complexity Analysis
It is not useful to measure how fast an algorithm runs in absolute terms, because the result depends on which computer, operating system, programming language, compiler, and kind of inputs are used in testing.
Instead:
- We count the number of basic operations the algorithm performs. A basic operation is an operation that takes a constant amount of time to execute.
- We calculate how this number depends on the size of the input.
The efficiency of an algorithm is the number of basic operations it performs. This number is a function of the input size n.
EXAMPLES OF BASIC OPERATIONS
- Arithmetic operations: *, /, %, +, -
- Assignment statements of simple data types
- Reading of a primitive type
- Writing of a primitive type
- Simple conditional tests: if (x < 12) ...
- A method call (though the actual execution time of the method may not be constant)
- A method's return statement
We consider an operation such as ++, +=, and *= as consisting of two basic operations.
Note: To simplify complexity analysis, we will not consider memory access (fetch or store) operations. A small counting example follows.
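To make this concrete, here is a minimal sketch (our own illustration, not from the slides) that tallies the basic operations of a simple summation loop under the conventions above:

public class OperationCount {
    // Sums the elements of a; the comments tally basic operations.
    static int sum(int[] a) {
        int total = 0;                           // 1 assignment
        for (int i = 0; i < a.length; i++) {     // 1 assignment, (n + 1) comparisons, n increments (2 ops each)
            total += a[i];                       // n times: += counts as 2 basic operations
        }
        return total;                            // 1 return
    }
    // Total: f(n) = 1 + 1 + (n + 1) + 2n + 2n + 1 = 5n + 4 basic operations under these assumptions.

    public static void main(String[] args) {
        System.out.println(sum(new int[]{1, 2, 3, 4}));  // prints 10
    }
}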
GROWTH FUNCTIONS
We must also decide what we are trying to optimize:
- Time complexity: CPU time
- Space complexity: memory
The rates of growth are expressed as functions, which are generally in terms of the number of inputs n.
A growth function shows the relationship between the size of the problem (n) and the value we hope to optimize (e.g., time).
This function represents the time complexity or space complexity of the algorithm.
ALGORITHM COMPLEXITY
Worst Case Complexity: the function defined by the maximum number of steps taken on any instance of size n.
Best Case Complexity: the function defined by the minimum number of steps taken on any instance of size n.
Average Case Complexity: the function defined by the average number of steps taken over all instances of size n.
ALGORITHM COMPLEXITY (CONT.)
We are usually interested in determining the largest number of operations that might be performed for a given problem size: the worst case complexity.
The best case depends on the input, and so does the average case.
The average case is difficult to compute.
So, we usually focus on worst case analysis:
- Easier to compute
- Usually close to the actual running time
ALGORITHM COMPLEXITY (CONT.)
Example: Linear Search Complexity
- Best case: item found at the beginning: one comparison.
- Worst case: item found at the end: n comparisons.
- Average case: the item may be found at index 0, or 1, or 2, ..., or n - 1, so the average number of comparisons is (1 + 2 + ... + n) / n = (n + 1) / 2.
A code sketch of linear search follows below.
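As a minimal sketch (the class and method names are our own, not from the slides), this linear search makes the comparison counts above concrete:

public class LinearSearch {
    // Returns the index of key in a, or -1 if it is absent.
    // Comparisons: 1 in the best case (key at index 0),
    // n in the worst case (key at the end),
    // about (n + 1) / 2 on average when the key is equally likely at any index.
    static int linearSearch(int[] a, int key) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == key) {   // one comparison per element examined
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] data = {7, 3, 9, 4};
        System.out.println(linearSearch(data, 9));   // prints 2
        System.out.println(linearSearch(data, 12));  // prints -1
    }
}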
Worst and average case complexities of common sorting algorithms (an insertion sort sketch follows below):

Method          | Worst Case | Average Case
Selection sort  | n²         | n²
Insertion sort  | n²         | n²
Merge sort      | n log n    | n log n
Quick sort      | n²         | n log n
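As one concrete instance of the table, here is a minimal insertion sort sketch (our own illustration, not from the slides); its nested loops account for the n² worst-case entry, which occurs on reverse-sorted input:

public class InsertionSortDemo {
    // Insertion sort: roughly n^2 / 2 comparisons in the worst case (reverse-sorted input),
    // matching the n^2 entry in the table above.
    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > key) {  // shift larger elements one position to the right
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;                 // place key into its sorted position
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 2, 4, 1, 3};
        insertionSort(data);
        System.out.println(java.util.Arrays.toString(data));  // prints [1, 2, 3, 4, 5]
    }
}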
ALGORITHM COMPLEXITY (CONT.)
Different instances of input result in different worst-case complexities.
Thus, it is useful to bound the worst-case complexity of a given algorithm.
Strategy: try to find upper and lower bounds of the worst-case function.
RUN-TIME ANALYSIS
To compare two algorithms that solve the same problem, we check their behavior when they run on the same instances of input.
An algorithm outperforms another if, for the same instance of input, it consumes a smaller amount of resources, i.e., it grows slower than the other one.
[Figure: number of operations versus problem size]
A SIMPLE COMPARISON
Let's assume that you have 3 algorithms to sort a list:
- f(n) = n log n
- g(n) = n²
- h(n) = n³
Let's also assume that each step takes 1 microsecond (10⁻⁶ sec). Which one is better?
For n = 100,000: f(n) takes about 1.7 seconds, g(n) about 2.8 hours, and h(n) about 31.7 years (a computation sketch follows below).
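The back-of-the-envelope arithmetic above can be checked with a short program (class and variable names are our own; it assumes n = 100,000 and one microsecond per step):

public class GrowthComparison {
    public static void main(String[] args) {
        double n = 100_000;          // assumed problem size
        double stepSeconds = 1e-6;   // assumed cost: 1 microsecond per step

        double fSteps = n * (Math.log(n) / Math.log(2));  // f(n) = n log2 n
        double gSteps = n * n;                            // g(n) = n^2
        double hSteps = n * n * n;                        // h(n) = n^3

        System.out.printf("f(n): %.1f seconds%n", fSteps * stepSeconds);         // about 1.7 seconds
        System.out.printf("g(n): %.1f hours%n", gSteps * stepSeconds / 3600);    // about 2.8 hours
        System.out.printf("h(n): %.1f years%n", hSteps * stepSeconds / 3.15e7);  // about 31.7 years
    }
}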
ASYMPTOTIC ANALYSIS
" itis not typically necessary to know the exact growth function for an algorithm
Finding the exact complexity, f(n) = number of basic operations, of an
algorithm is difficult a
We are mainly interested in the asymptotic complexity of an algorithm — the
general nature of the algorithm as n increases
ee
Asymptotic analysis of an algorithm describes the relative efficiency of an
algorithm as n gets very large.
When you're dealing with small input size, most algorithms will do
When the input size is Very large) things change
(JE.g. : For very large n, algorithm 1 grows faster than algorithm 2._ ASYMPTOTIC ANALYSIS-
CONT.
We focus on describing the growth rate of a given function, say f.
We compare the growth rate of f to the growth rate of some other function, say g.
We use asymptotic notations to describe the growth rate of f in terms of g.

ASYMPTOTIC NOTATIONS
Commonly used asymptotic notations to calculate the running time complexity of an algorithm (a worked example follows below):
- O Notation. O(expression) gives an upper bound on the growth rate of a function. A function f ∈ O(g) means f grows at most as fast as g, asymptotically and up to a constant factor.
- Ω Notation. Ω(expression) gives a lower bound on the growth rate of a function. A function f ∈ Ω(g) means f grows, asymptotically, at least as fast as g, up to a constant factor.
- Θ Notation. Θ(expression) consists of all the functions that lie in both O(expression) and Ω(expression). A function f ∈ Θ(g) means f grows, asymptotically, at the same rate as g, up to constant factors.
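As a small worked example (ours, not from the slides), the definitions can be applied to a concrete operation count, written here in LaTeX notation:

f(n) = 3n^2 + 5n + 4
\text{For all } n \ge 1:\quad 3n^2 \;\le\; f(n) \;\le\; 3n^2 + 5n^2 + 4n^2 = 12n^2
\text{Hence } f(n) \in O(n^2) \text{ and } f(n) \in \Omega(n^2), \text{ so } f(n) \in \Theta(n^2).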