Design and Analysis of Algorithms: Logistics, Introduction, and Multiplication!

The document provides an overview of the first lecture of CT109H Design and Analysis of Algorithms. It introduces the instructor and TA, and discusses the goals of the course: learning to design and analyze algorithms. It outlines the course structure, including lectures, the textbook, homework, and exams, and offers strategies for succeeding in the course, such as attending lectures, doing the readings and homework, and getting help from the instructor as needed. Finally, it works through integer multiplication as an example and introduces the divide-and-conquer technique, which improves the runtime of integer multiplication from quadratic to roughly n^1.6.

CT109H

Design and Analysis of Algorithms


Lecture 1:
Logistics, introduction, and multiplication!

Adapted from CS161 – Design and Analysis of Algorithms, Mary Wootters, Stanford University
Welcome to CT109H!
Who are we? Who are you?
• Instructor: Pham Nguyen Khang
• TA: none
• You: IT students (high-quality program)
Today
• Why are you here?
• Course overview, logistics, and how to succeed in this course
• Some actual computer science

Why are you here?
You are better equipped to answer this question than I am, but I'll give it a go anyway…

• Algorithms are fundamental.


• Algorithms are useful.
• Algorithms are fun!
• CT109H is a required course.

Why is CT109H required?


• Algorithms are fundamental.
• Algorithms are useful.
• Algorithms are fun!
Algorithms are fundamental
The Computational Lens:
Operating Systems (CS 140), Compilers (CS 143), Networking (CS 144), Machine Learning (CS 229), Cryptography (CS 255), Computational Biology (CS 262)
Algorithms are useful
• As we get more and more data and problem sizes get bigger and bigger, algorithms become more and more important.
• They will help you get a job.
Algorithms are fun!
• Algorithm design is both an art and a science.
• Many surprises!
• A young field, lots of exciting research questions!
Today
• Why are you here?
• Course overview, logistics, and how to succeed in
this course.
• Some actual computer science.
Course goals
• The design and analysis of algorithms
• These go hand-in-hand

• In this course you will:


• Learn to think analytically about algorithms
• Flesh out an “algorithmic toolkit”
• Learn to communicate clearly about algorithms
The algorithm designer's question: Can I do better?

The algorithm designer's internal monologue…
• Plucky the Pedantic Penguin (detail-oriented, precise, rigorous): "What exactly do we mean by better? And what about that corner case? Shouldn't we be zero-indexing?"
• Lucky the Lackadaisical Lemur (big-picture, intuitive, hand-wavey): "Dude, this is just like that other time. If you do the thing and the stuff like you did then, it'll totally work real fast!"
Both sides are necessary!
Roadmap
More detailed schedule on the website!
• Asymptotic Analysis
• Greedy Algorithms
• Dynamic Programming
• MIDTERM
• Graphs!
• The Future!
Course elements and resources
• Course website:
• https://elcit.ctu.edu.vn/course/view.php?id=2858

• Lectures
• Textbook
• Homework
• Exams
How to get the most out of lectures
• During lecture:
• Show up, ask questions, put your phone away.
• May be helpful: take notes on printouts of the slides.
• Before lecture:
• Do the pre-lecture exercises listed on the website.
• After lecture:
• Go through the exercises on the slides.
These guys will pop up on the slides and ask questions – those questions are for you!
• Siggi the Studious Stork (recommended exercises)
• Ollie the Over-achieving Ostrich (challenge questions)

Do the reading
• either before or after lecture, whichever works best for you.
• do not wait to "catch up" the week before the exam.
Textbook
• CLRS:
• Introduction to Algorithms,
by Cormen, Leiserson,
Rivest, and Stein.

We will also
sometimes refer to
Kleinberg and Tardos
Homework!
Weekly assignments in two parts:

1. Exercises:
• Check-your-understanding and computations
• Should be pretty straightforward
• Do these on your own

2. Problems:
• Proofs and algorithm design
• Not straightforward
• You may collaborate with your classmates…
How to get the most out of homework
• Do the exercises on your own.

• Try the problems on your own before talking to a


classmate.
• You must write up your solutions on your own.

• If you get help from me (via email or at my office):


• Try the problem first. And then try a few more times.
• Ask: “I was trying this approach and I got stuck here.”
• After you’ve figured it out, write up your solution from scratch.
Exams
• There will be a midterm and a final.
• Assignments: 30%
• Midterm: 20%
• Final: 50%
Everyone can succeed in this class!

1. Work hard
2. Ask for help
3. Work hard
Today
• Why are you here?
• Course overview, logistics, and how to succeed in
this course.
• Some actual computer science.
Course goals
• Think analytically about algorithms
• Flesh out an “algorithmic toolkit”
• Learn to communicate clearly about algorithms

Today’s goals
• Karatsuba Integer Multiplication
• Technique: Divide and conquer
• Meta points:
• How do we measure the speed of an algorithm?
Let’s start at the beginning
Etymology of “Algorithm”
• Al-Khwarizmi (Persian mathematician, lived
around 800 AD) wrote a book about how to
multiply with Arabic numerals.

• His ideas came to Europe in the 12th century.

Dixit algorizmi
(so says Al-Khwarizmi)

• Originally, “Algorisme” [old French] referred to


just the Arabic number system, but eventually it
came to mean “Algorithm” as we know today.
This was kind of a big deal: XLIV × XCVII = ?  vs.  44 × 97

Integer Multiplication
  44
× 97

Integer Multiplication
  1234567895931413
× 4563823520395533

Integer Multiplication (n-digit numbers)
  1233925720752752384623764283568364918374523856298
× 4562323582342395285623467235019130750135350013753
???
How long would this take you?

About n² one-digit operations:
• at most n² multiplications,
• and then at most n² additions (for carries),
• and then I have to add n different 2n-digit numbers…
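As a rough illustration (a sketch, not the course notebook's code), here is grade-school multiplication written out in Python, counting the one-digit multiplications to see the n² behaviour:

def grade_school_multiply(x, y):
    """Multiply non-negative integers x and y digit by digit (grade-school style)."""
    xs = [int(d) for d in str(x)][::-1]   # least-significant digit first
    ys = [int(d) for d in str(y)][::-1]
    result = 0
    digit_mults = 0                       # count one-digit multiplications
    for i, dy in enumerate(ys):
        partial = 0
        for j, dx in enumerate(xs):
            partial += dx * dy * 10**j    # one one-digit multiply (carries folded in)
            digit_mults += 1
        result += partial * 10**i
    return result, digit_mults

prod, count = grade_school_multiply(1234567895931413, 4563823520395533)
assert prod == 1234567895931413 * 4563823520395533
print(count)   # 16 * 16 = 256 one-digit multiplications, i.e. n^2 of them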
Is that a useful answer?
• How do we measure the runtime of an algorithm?
  [Many different computers, all running the same algorithm…]
• We measure how the runtime scales with the size of the input.
For grade school multiplication, with Python, on my laptop (highly non-optimized)…
Looks like it's roughly
T_laptop(n) = 0.0063·n² − 0.5·n + 12.7 ms
I am a bit slower than my laptop, but the runtime scales like n² either way.
T_me(n) = n²/10 + 100
(I made this up)
T_laptop(n) = 0.0063·n² − 0.5·n + 12.7 ms
Asymptotic analysis
• How does the runtime scale with the size of the input?
• Runtime of grade school multiplication scales like n²
• We'll see a more formal definition next week

Is this a useful answer?
Hypothetically… a magic algorithm that scales like n^1.6:
T_magic(n) = n^1.6/10 + 100
vs.
T_laptop(n) = 0.0063·n² − 0.5·n + 12.7 ms
Let n get bigger…
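To make "let n get bigger" concrete, here is a quick sketch comparing the two hypothetical running times (the constants are the made-up ones from these slides, with the n^1.6 formula as reconstructed above):

def t_laptop(n):
    return 0.0063 * n**2 - 0.5 * n + 12.7   # ms, rough fit for grade-school multiply

def t_magic(n):
    return n**1.6 / 10 + 100                # ms, the hypothetical n^1.6 algorithm

for n in [100, 1_000, 10_000, 100_000]:
    print(f"n = {n:>7}:  laptop ~ {t_laptop(n):>12.1f} ms   magic ~ {t_magic(n):>12.1f} ms")
# For small n the quadratic formula is smaller; once n gets big, the n^1.6 one wins.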
Asymptotic analysis is a useful notion…
• How does the runtime scale with the size of the input?
• This is our measure of how "fast" an algorithm is.
• We'll see a more formal definition next week.
• So the question is…
Can we do better (than n²)?
[Plot: n² vs. n]
Let’s dig in to our algorithmic toolkit…
Divide and conquer
Break the problem up into smaller (easier) sub-problems, often recursively:
Big problem
→ Smaller problem + Smaller problem
→ Yet smaller problem + Yet smaller problem + Yet smaller problem + Yet smaller problem
Divide and conquer for multiplication
Break up an integer:
1234 = 12×100 + 34

1234 × 5678
= (12×100 + 34)(56×100 + 78)
= (12 × 56)·10000 + (34 × 56 + 12 × 78)·100 + (34 × 78)

One 4-digit multiply → four 2-digit multiplies
(Suppose n is even.)
More generally
Break up an n-digit integer:
One n-digit multiply → four (n/2)-digit multiplies
Divide and conquer algorithm (not very precisely…)
x, y are n-digit numbers; say n is even; a, b, c, d are (n/2)-digit numbers.

Multiply(x, y):
• If n = 1: return x·y   (base case: I've memorized my 1-digit multiplication tables…)
• Write x = a·10^(n/2) + b
• Write y = c·10^(n/2) + d
• Recursively compute ac, ad, bc, bd: ac = Multiply(a, c), etc.
• Add them up to get xy: xy = ac·10^n + (ad + bc)·10^(n/2) + bd

Siggi the Studious Stork: Make this pseudocode more detailed! How should we handle odd n? How should we implement "multiplication by 10^n"?
See the Lecture 1 Python notebook for actual code!
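The course's own code is in that notebook; as a stand-alone sketch (one of several reasonable ways to handle the details Siggi asks about), the four-recursive-call algorithm might look like this:

def multiply(x, y):
    """Multiply non-negative integers x and y with four recursive calls."""
    if x < 10 or y < 10:                        # base case: a one-digit factor
        return x * y
    half = max(len(str(x)), len(str(y))) // 2   # split point (rounds down for odd n)
    a, b = divmod(x, 10**half)                  # x = a * 10^half + b
    c, d = divmod(y, 10**half)                  # y = c * 10^half + d
    ac = multiply(a, c)
    ad = multiply(a, d)
    bc = multiply(b, c)
    bd = multiply(b, d)
    return ac * 10**(2 * half) + (ad + bc) * 10**half + bd

assert multiply(1234, 5678) == 1234 * 5678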
How long does this take?
• Better or worse than the grade school algorithm?
• That is, does the number of operations grow like n2 ?
• More or less than that?

• How do we answer this question?


1. Try it.
2. Try to understand it analytically.
Check out the Lecture 1 IPython Notebook
1. Try it.
Conjectures about running time? Doesn't look too good, but hard to tell…
Concerns with the conclusiveness of this approach?
• Maybe one implementation is slicker than the other?
• Maybe if we were to run it to n = 10000, things would look different.
• Something funny is happening at powers of 2…


2. Try to understand the running
time analytically

• Proof by meta-reasoning:
It must be faster than the grade school
algorithm, because we are learning it in
an algorithms class.
Not sound logic!

Plucky the Pedantic Penguin


2. Try to understand the running time analytically
• Claim: the running time of this algorithm is AT LEAST n² operations.
How many one-digit multiplies?
12345678 × 87654321
→ 1234 × 8765, 5678 × 8765, 1234 × 4321, 5678 × 4321
→ 12 × 87, 56 × 87, 12 × 43, 56 × 43, 34 × 87, 78 × 87, 34 × 43, 78 × 43, 12 × 65, 56 × 65, 12 × 21, 56 × 21, 34 × 65, 78 × 65, 34 × 21, 78 × 21
→ 1×8, 1×7, 2×8, 2×7, …, 3×4, …
Claim: there are n² one-digit problems.
Every pair of digits still gets multiplied together separately, so the running time is still at least n².
Another way to see this*
(*we will come back to this sort of analysis later, and still more rigorously)
• 1 problem of size n
• 4 problems of size n/2
• …
• 4^t problems of size n/2^t
• …
• 4^(log₂ n) = n² problems of size 1
If you cut n in half log₂(n) times, you get down to 1. So we do this log₂(n) times and get n² problems of size 1.
This is just a lower bound – we're just counting the number of size-1 problems!
Another way to see this (this slide skipped in class, for reference only)
• Let T(n) be the time to multiply two n-digit numbers.
• Recurrence relation:
  T(n) = 4·T(n/2) + (about n, to add stuff up)
  (ignore the additive term for now…)
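For reference, unrolling this recurrence (and dropping the additive term, as above) gives the same n² count of base-case problems:

T(n) \;\ge\; 4\,T\!\left(\tfrac{n}{2}\right) \;\ge\; 4^{2}\,T\!\left(\tfrac{n}{4}\right) \;\ge\;\cdots\;\ge\; 4^{t}\,T\!\left(\tfrac{n}{2^{t}}\right) \;\ge\; 4^{\log_2 n}\,T(1) \;=\; n^{\log_2 4}\,T(1) \;=\; n^{2}\,T(1).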
That's a bit disappointing
All that work and still (at least) n²…
[Plot: n² vs. n]
But wait!!
Divide and conquer can actually make progress
• Karatsuba figured out how to do this better!
• To assemble xy = ac·10^n + (ad + bc)·10^(n/2) + bd we only need three things: ac, (ad + bc), and bd.
• If only we recurse three times instead of four…
Karatsuba integer multiplication
• Recursively compute these THREE things:
  • ac
  • bd
  • (a+b)(c+d)
• Since (a+b)(c+d) = ac + ad + bc + bd, subtracting off ac and bd gives (ad + bc).
• Assemble the product: xy = ac·10^n + (ad + bc)·10^(n/2) + bd
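Again as a sketch (the course's own implementation is in the Lecture 1 notebook), the same splitting with Karatsuba's three recursive calls:

def karatsuba(x, y):
    """Multiply non-negative integers x and y with three recursive calls."""
    if x < 10 or y < 10:                        # base case: a one-digit factor
        return x * y
    half = max(len(str(x)), len(str(y))) // 2
    a, b = divmod(x, 10**half)                  # x = a * 10^half + b
    c, d = divmod(y, 10**half)                  # y = c * 10^half + d
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    cross = karatsuba(a + b, c + d) - ac - bd   # equals ad + bc
    return ac * 10**(2 * half) + cross * 10**half + bd

assert karatsuba(12345678, 87654321) == 12345678 * 87654321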
What's the running time?
• 1 problem of size n
• 3 problems of size n/2
• …
• 3^t problems of size n/2^t
• …
• 3^(log₂ n) = n^(log₂ 3) ≈ n^1.6 problems of size 1
If you cut n in half log₂(n) times, you get down to 1. So we do this log₂(n) times and get n^(log₂ 3) ≈ n^1.6 problems of size 1.
We still aren't accounting for the work at the higher levels! But we'll see later that this turns out to be okay.
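For reference, the exponent arithmetic behind that last step:

3^{\log_2 n} \;=\; \left(2^{\log_2 3}\right)^{\log_2 n} \;=\; \left(2^{\log_2 n}\right)^{\log_2 3} \;=\; n^{\log_2 3} \;\approx\; n^{1.585} \;\approx\; n^{1.6}.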
This is much better!
[Plot: n² vs. n^1.6 vs. n]
We can even see it in real life!
Can we do better?
• Toom-Cook (1963): instead of breaking into three (n/2)-sized problems, break into five (n/3)-sized problems.
  • This scales like n^1.465
• Ollie the Over-achieving Ostrich: Try to figure out how to break up an n-sized problem into five (n/3)-sized problems! (Hint: start with nine (n/3)-sized problems.)
• Siggi the Studious Stork: Given that you can break an n-sized problem into five (n/3)-sized problems, where does the 1.465 come from?
• Schönhage–Strassen (1971): scales like n·log(n)·log log(n)
• Furer (2007): scales like n·log(n)·2^(log* n)
[This is just for fun, you don't need to know these algorithms!]
Course goals
• Think analytically about algorithms
• Flesh out an “algorithmic toolkit”
• Learn to communicate clearly about algorithms

Today’s goals
• Karatsuba Integer Multiplication
• Technique: Divide and conquer
• Meta points:
• How do we measure the speed of an algorithm?

Wrap up
Wrap up
• https://elcit.ctu.edu.vn/course/view.php?id=2858

• Algorithms are:
• Fundamental, useful, and fun!
• In this course, we will develop both algorithmic
intuition and algorithmic technical chops
• It might not be easy but it will be worth it!

• Karatsuba Integer Multiplication:


• You can do better than grade school multiplication!
• Example of divide-and-conquer in action
• Informal demonstration of asymptotic analysis
Next time
• Sorting!
• Divide and Conquer some more
• Begin asymptotic analysis and Big-Oh notation

BEFORE Next time


• Pre-lecture exercise! On the course website!
