Algo 09

This document discusses approximation algorithms for NP-complete problems. It begins by defining approximation algorithms as algorithms that find solutions close to optimal when an exact optimal solution requires exponential time. It then discusses different types of approximation ratios for optimization problems. Several approximation algorithms are presented for problems like the traveling salesperson problem, bottleneck traveling salesperson problem, and bin packing problem. For each problem, the algorithm is described and its approximation ratio or guarantee is proven. The document concludes by discussing polynomial-time approximation schemes.



Chapter 9
Approximation Algorithms
NP-Complete Problem
Enumeration
Branch and Bound
Greedy
Approximation
PTAS
K-Approximation
No Approximation
Approximation algorithm
Up to now, the best algorithms for solving NP-complete problems require exponential time in the worst case, which is too time-consuming. To reduce the time required for solving a problem, we can relax the problem and obtain a feasible solution that is close to an optimal solution.
Approximation Ratios
Optimization Problems
We have some problem instance x that has many feasible
solutions.
We are trying to minimize (or maximize) some cost
function c(S) for a solution S to x.
For example,
Finding a minimum spanning tree of a graph
Finding a smallest vertex cover of a graph
Finding a shortest traveling salesperson tour in a graph
Approximation Ratios
An approximation algorithm produces a solution T.

Relative approximation ratio
T is a k-approximation to the optimal solution OPT if c(T)/c(OPT) ≤ k (assuming a minimization problem; for a maximization problem the ratio is inverted).

Absolute approximation ratio
For example, in the chromatic number problem, if the optimal solution of an instance is three and the approximation T is four, then T is a 1-approximation to the optimal solution, since |c(T) − c(OPT)| = 1.



The Euclidean traveling
salesperson problem (ETSP)
The ETSP is to find a shortest closed path
through a set S of n points in the plane.
The ETSP is NP-hard.

An approximation algorithm for ETSP
Input: A set S of n points in the plane.
Output: An approximate traveling salesperson
tour of S.
Step 1: Find a minimal spanning tree T of S.
Step 2: Find a minimal Euclidean weighted matching M on the set of vertices of odd degree in T. Let G = T ∪ M.
Step 3: Find an Eulerian cycle of G and then traverse it to find a Hamiltonian cycle, as an approximate tour of the ETSP, by bypassing all previously visited vertices.
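The three steps can be sketched compactly in Python; the following is a minimal illustration, assuming the networkx library for the spanning-tree, matching, and Eulerian-cycle subroutines (the slides do not prescribe any implementation):

```python
# A hedged sketch of the MST + odd-vertex-matching tour construction.
import itertools
import math

import networkx as nx

def approx_etsp_tour(points):
    """Approximate TSP tour over a list of 2-D points."""
    # Complete graph with Euclidean edge weights.
    G = nx.Graph()
    for (i, p), (j, q) in itertools.combinations(enumerate(points), 2):
        G.add_edge(i, j, weight=math.dist(p, q))

    # Step 1: minimal spanning tree T.
    T = nx.minimum_spanning_tree(G)

    # Step 2: minimal weighted matching M on the odd-degree vertices of T.
    odd = [v for v in T if T.degree(v) % 2 == 1]
    M = nx.min_weight_matching(G.subgraph(odd))
    Gp = nx.MultiGraph(T)       # G' = T ∪ M: all degrees are now even
    Gp.add_edges_from(M)

    # Step 3: Eulerian cycle, shortcut past previously visited vertices.
    tour, seen = [], set()
    for u, _ in nx.eulerian_circuit(Gp, source=0):
        if u not in seen:
            seen.add(u)
            tour.append(u)
    return tour + [tour[0]]     # close the cycle
```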
An example for the ETSP algorithm
Step 1: Find a minimal spanning tree.
Step 2: Perform weighted matching. The number of points with odd degree must be even, because $\sum_{i=1}^{n} d_i = 2|E|$ is even.
Step 3: Construct the tour with an Eulerian cycle and a Hamiltonian cycle.
Time complexity: O(n³)
Step 1: O(n log n)
Step 2: O(n³)
Step 3: O(n)

How close is the approximate solution to an optimal one?
The approximate tour is within 3/2 of the optimal one; that is, the approximation ratio is 3/2. (See the proof on the next page.)
Proof of the approximation ratio
Optimal tour L: j₁ i₁ j₂ i₂ j₃ … i₂ₘ, where {i₁, i₂, …, i₂ₘ} is the set of odd-degree vertices of T, listed in the order they appear along L.
Two matchings: M₁ = {[i₁, i₂], [i₃, i₄], …, [i₂ₘ₋₁, i₂ₘ]} and M₂ = {[i₂, i₃], [i₄, i₅], …, [i₂ₘ, i₁]}.
length(L) ≥ length(M₁) + length(M₂) (triangular inequality) ≥ 2 length(M), where M is the minimal matching of Step 2.
⇒ length(M) ≤ 1/2 length(L).
G = T ∪ M, and length(T) ≤ length(L) because deleting one edge from L leaves a spanning tree of S.
Hence length(T) + length(M) ≤ length(L) + 1/2 length(L) = 3/2 length(L).
The bottleneck traveling
salesperson problem (BTSP)
Minimize the longest edge of a tour.
This is a mini-max problem.
This problem is NP-hard.
The input data for this problem fulfill
the following assumptions:
The graph is a complete graph.
All edges obey the triangular inequality
rule.


An algorithm for finding an
optimal solution
Step 1: Sort all edges in G = (V, E) into a nondecreasing sequence |e₁| ≤ |e₂| ≤ … ≤ |eₘ|. Let G(eᵢ) denote the subgraph obtained from G by deleting all edges longer than eᵢ.
Step 2: i ← 1.
Step 3: If there exists a Hamiltonian cycle in G(eᵢ), then this cycle is the solution; stop.
Step 4: i ← i + 1. Go to Step 3.
An example for BTSP algorithm
e.g.
There is a Hamiltonian
cycle, A-B-D-C-E-F-G-A, in
G(BD).
The optimal solution is 13.
Theorem for Hamiltonian cycles
Def: The t-th power of G = (V, E), denoted Gᵗ = (V, Eᵗ), is the graph in which (u, v) ∈ Eᵗ if there is a path from u to v with at most t edges in G.
Theorem: If a graph G is bi-connected, then G² has a Hamiltonian cycle.
An example for the theorem
A Hamiltonian cycle:
A-B-C-D-E-F-G-A

G²
An approximation algorithm for BTSP
Input: A complete graph G = (V, E) where all edges satisfy the triangular inequality.
Output: A tour in G whose longest edge is no greater than twice the value of an optimal solution to the bottleneck traveling salesperson problem on G.
Step 1: Sort the edges into |e₁| ≤ |e₂| ≤ … ≤ |eₘ|.
Step 2: i ← 1.
Step 3: If G(eᵢ) is bi-connected, construct G(eᵢ)², find a Hamiltonian cycle in G(eᵢ)², and return this cycle as the output.
Step 4: i ← i + 1. Go to Step 3.
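The threshold search of Steps 1 to 4 can be sketched as follows, again assuming networkx; actually extracting a Hamiltonian cycle from G(eᵢ)², whose existence the theorem guarantees, is nontrivial and is left abstract here:

```python
# A minimal sketch of the bottleneck threshold search.
import networkx as nx

def bottleneck_threshold(G):
    """Return (|e_i|, G(e_i)) for the first bi-connected subgraph G(e_i)."""
    # Step 1: sort the edges into nondecreasing order of weight.
    edges = sorted(G.edges(data="weight"), key=lambda e: e[2])
    H = nx.Graph()
    H.add_nodes_from(G)
    # Steps 2-4: grow the threshold until G(e_i) is bi-connected.
    for u, v, w in edges:
        H.add_edge(u, v, weight=w)
        if nx.is_biconnected(H):
            # A tour inside H^2 has longest edge <= 2w <= 2|e_op|.
            return w, H
    raise ValueError("the input graph is not bi-connected")
```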
An example
Add some more edges. Then it becomes bi-connected.
A Hamiltonian cycle: A-G-F-E-D-C-B-A.
The longest edge: 16.
Time complexity: polynomial time.
How good is the solution?
The approximate solution is bounded by two times an optimal solution.
Reasoning:
A Hamiltonian cycle is bi-connected, so G(eₒₚ) is bi-connected, where eₒₚ is the longest edge of an optimal solution.
G(eᵢ) is the first bi-connected subgraph in the sequence, so |eᵢ| ≤ |eₒₚ|.
The length of the longest edge in G(eᵢ)² is at most 2|eᵢ| (triangular inequality), which is at most 2|eₒₚ|.
NP-completeness
Theorem: If there is a polynomial-time approximation algorithm which produces a bound less than two, then NP = P. (The Hamiltonian cycle decision problem reduces to this problem.)
Proof:
For an arbitrary graph G = (V, E), we expand G to a complete graph G_c:
C_ij = 1 if (i, j) ∈ E
C_ij = 2 otherwise
(The definition of C_ij satisfies the triangular inequality.)
Let V* denote the value of an optimal solution of the bottleneck TSP of G_c. Then
V* = 1 ⟺ G has a Hamiltonian cycle.
Because there are only two edge lengths, 1 and 2, in G_c, if we can produce an approximate solution whose value is less than 2V*, then we can also solve the Hamiltonian cycle decision problem.
The bin packing problem
Given n items a₁, a₂, …, aₙ with 0 < aᵢ ≤ 1 for 1 ≤ i ≤ n, determine the minimum number of bins of unit capacity needed to accommodate all n items.
E.g. n = 5, {0.3, 0.5, 0.8, 0.2, 0.4}.
The bin packing problem is NP-hard.
An approximation algorithm
for the bin packing problem
An approximation algorithm (first-fit): place aᵢ into the lowest-indexed bin which can accommodate aᵢ.

Theorem: The number of bins used by the first-fit algorithm is at most twice the optimal solution.
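A minimal sketch of the first-fit rule (Python is our choice; the slides state only the rule itself):

```python
def first_fit(items):
    """Place each item into the lowest-indexed bin that can hold it."""
    free = []                       # free[j] = remaining capacity of bin j
    for a in items:
        for j, cap in enumerate(free):
            if a <= cap:            # lowest-indexed bin that fits
                free[j] = cap - a
                break
        else:                       # no bin fits: open a new unit bin
            free.append(1.0 - a)
    return len(free)

# The instance from the previous slide: first-fit uses 3 bins, which here
# equals the optimum (the sizes sum to 2.2, so at least 3 bins are needed).
print(first_fit([0.3, 0.5, 0.8, 0.2, 0.4]))  # -> 3
```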

Proof of the approximation ratio
Notation:
S(aᵢ): the size of aᵢ
OPT(I): the size of an optimal solution of an instance I
FF(I): the number of bins used by the first-fit algorithm
C(Bᵢ): the sum of the sizes of the aⱼ's packed in bin Bᵢ by the first-fit algorithm

$OPT(I) \ge \sum_{i=1}^{n} S(a_i)$
C(Bᵢ) + C(Bᵢ₊₁) > 1, since otherwise first-fit would have placed the items of Bᵢ₊₁ into Bᵢ.
If m nonempty bins are used by first-fit: C(B₁) + C(B₂) + … + C(Bₘ) > m/2.
$FF(I) = m < 2 \sum_{i=1}^{m} C(B_i) = 2 \sum_{i=1}^{n} S(a_i) \le 2\,OPT(I)$
∴ FF(I) < 2 OPT(I)
Knapsack problem
Fractional knapsack problem: in P
0/1 knapsack problem: NP-complete → Approximation → PTAS
Fractional knapsack problem
n objects, each with a weight wᵢ > 0 and a profit pᵢ > 0; capacity of knapsack: M.
Maximize $\sum_{1 \le i \le n} p_i x_i$
subject to $\sum_{1 \le i \le n} w_i x_i \le M$ and 0 ≤ xᵢ ≤ 1, 1 ≤ i ≤ n.
The knapsack algorithm
The greedy algorithm:
Step 1: Sort the ratios pᵢ/wᵢ into nonincreasing order.
Step 2: Put the objects into the knapsack according to the sorted sequence, as far as the capacity allows.
E.g. n = 3, M = 20, (p₁, p₂, p₃) = (25, 24, 15), (w₁, w₂, w₃) = (18, 15, 10)
Sol: p₁/w₁ = 25/18 ≈ 1.39
p₂/w₂ = 24/15 = 1.6
p₃/w₃ = 15/10 = 1.5
Optimal solution: x₁ = 0, x₂ = 1, x₃ = 1/2
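A minimal sketch of the greedy method, reproducing the example above (the language and function name are our choice):

```python
def fractional_knapsack(p, w, M):
    """Take objects in nonincreasing p_i/w_i order, splitting the first
    object that no longer fits entirely."""
    x = [0.0] * len(p)
    remaining = M
    for i in sorted(range(len(p)), key=lambda i: p[i] / w[i], reverse=True):
        if remaining <= 0:
            break
        x[i] = min(1.0, remaining / w[i])   # fractional fill is allowed
        remaining -= x[i] * w[i]
    value = sum(p[i] * x[i] for i in range(len(p)))
    return value, x

# The slide's instance: value 24 + 15/2 = 31.5 with x = (0, 1, 1/2).
print(fractional_knapsack([25, 24, 15], [18, 15, 10], 20))
```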
0/1 knapsack problem
Def: n objects, each with a weight wᵢ > 0 and a profit pᵢ > 0; capacity of knapsack: M.
Maximize $\sum_{1 \le i \le n} p_i x_i$
subject to $\sum_{1 \le i \le n} w_i x_i \le M$ and xᵢ = 0 or 1, 1 ≤ i ≤ n.
Decision version: given K, does there exist an assignment with $\sum_{1 \le i \le n} p_i x_i \ge K$?
(The fractional knapsack problem instead allows 0 ≤ xᵢ ≤ 1, 1 ≤ i ≤ n.)
Theorem: partition ∝ 0/1 knapsack decision problem, so the decision version is NP-complete.

Polynomial-Time Approximation Schemes

A problem L has a polynomial-time approximation scheme (PTAS) if it has a polynomial-time (1+ε)-approximation algorithm for any fixed ε > 0 (ε may appear in the running time).

0/1 knapsack has a PTAS, with a running time that is O(n³/ε).
Knapsack: PTAS
Intuition for the approximation algorithm:
Given an error ratio ε, we calculate a threshold T to classify items into BIG and SMALL.
BIG: enumeration; SMALL: greedy.
In our case, T is found to be 46.8. Thus BIG = {1, 2, 3} and SMALL = {4, 5, 6, 7, 8}.
i:      1     2     3     4     5     6     7     8
pᵢ:     90    61    50    33    29    23    15    13
wᵢ:     33    30    25    17    15    12    10    9
pᵢ/wᵢ:  2.72  2.03  2.0   1.94  1.93  1.91  1.5   1.44
Knapsack: PTAS
For BIG, we enumerate all possible solutions.
Solution 1: select items 1 and 2. The sum of normalized profits is 15; the corresponding sum of original profits is 90 + 61 = 151. The sum of weights is 63.
Solution 2: select items 1, 2, and 3. The sum of normalized profits is 20; the corresponding sum of original profits is 90 + 61 + 50 = 201. The sum of weights is 88.
Knapsack: PTAS
For SMALL, we use a greedy strategy to extend each enumerated solution.
Solution 1: we can add items 4 and 6. The sum of profits becomes 151 + 33 + 23 = 207.
Solution 2: we cannot add any item from SMALL, so the sum of profits stays 201.
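A rough sketch of the BIG-enumeration / SMALL-greedy scheme described above. The knapsack capacity M is not stated on the slides, so M = 92 below is an assumption chosen to be consistent with the two solutions (any capacity from 92 to 94 reproduces them); the ε-driven profit normalization is also omitted:

```python
import itertools

def big_small_knapsack(p, w, M, T):
    """Enumerate every subset of BIG (p_i > T), then fill the remaining
    capacity greedily with SMALL items in nonincreasing p_i/w_i order."""
    n = len(p)
    big = [i for i in range(n) if p[i] > T]
    small = sorted((i for i in range(n) if p[i] <= T),
                   key=lambda i: p[i] / w[i], reverse=True)
    best = 0
    for r in range(len(big) + 1):
        for subset in itertools.combinations(big, r):
            weight = sum(w[i] for i in subset)
            if weight > M:
                continue            # this BIG subset does not fit
            profit = sum(p[i] for i in subset)
            for i in small:         # greedy 0/1 fill with SMALL items
                if w[i] <= M - weight:
                    weight += w[i]
                    profit += p[i]
            best = max(best, profit)
    return best

p = [90, 61, 50, 33, 29, 23, 15, 13]
w = [33, 30, 25, 17, 15, 12, 10, 9]
print(big_small_knapsack(p, w, M=92, T=46.8))  # -> 207 (items 1, 2, 4, 6)
```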
A bad example
A convex hull of n points in the plane can be computed in O(n log n) time in the worst case.
An approximation algorithm:
Step 1: Find the leftmost and rightmost points.
Step 2: Divide the points into K vertical strips. Find the highest and lowest points in each strip.
Step 3: Apply the Graham scan to those highest and lowest points to construct an approximate convex hull. (The highest and lowest points are already sorted by their x-coordinates.)
Time complexity: O(n + K)
Step 1: O(n)
Step 2: O(n)
Step 3: O(K)
How good is the solution?
How far away can a point outside the approximate convex hull be from it?
Answer: at most L/K, where L is the distance between the leftmost and rightmost points.
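A minimal sketch of Steps 1 and 2; the graham_scan helper used in Step 3 is assumed rather than shown (any standard implementation works, and the candidates below already come out sorted by x):

```python
def strip_extremes(points, K):
    """Steps 1-2: keep the highest and lowest point of each of the K
    vertical strips between the leftmost and rightmost points."""
    xmin = min(x for x, _ in points)
    xmax = max(x for x, _ in points)
    width = (xmax - xmin) / K or 1.0            # guard the degenerate case
    top, bottom = {}, {}
    for x, y in points:
        s = min(int((x - xmin) / width), K - 1) # strip index 0..K-1
        if s not in top or y > top[s][1]:
            top[s] = (x, y)
        if s not in bottom or y < bottom[s][1]:
            bottom[s] = (x, y)
    return sorted(set(top.values()) | set(bottom.values()))

# Step 3 would then be: hull = graham_scan(strip_extremes(points, K)),
# and any point left outside lies within L/K of the returned hull.
```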
