OAP 2
Algorithm
1 Duality
1.1 Primal and Dual Problems
Given a primal linear program (P) in canonical form, the associated dual (D) is:

(P) max Z = c^T x
    s.t. Ax ≤ b
         x ≥ 0

(D) min w = y^T b
    s.t. y^T A ≥ c^T
         y ≥ 0
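As a small numeric illustration of weak duality (c^T x ≤ y^T b for any feasible primal/dual pair), here is a toy LP chosen for this note; the instance and its solutions are not from the text:

```python
# Toy LP (chosen here for illustration):
#   (P) max 3x1 + 2x2   s.t. x1 + x2 <= 4,  x1 <= 2,  x >= 0
#   (D) min 4y1 + 2y2   s.t. y1 + y2 >= 3,  y1 >= 2,  y >= 0
c = [3, 2]
A = [[1, 1], [1, 0]]
b = [4, 2]

def primal_value(x):    # c^T x
    return sum(ci * xi for ci, xi in zip(c, x))

def dual_value(y):      # y^T b
    return sum(yi * bi for yi, bi in zip(y, b))

x = (2, 2)   # feasible for (P)
y = (2, 1)   # feasible for (D)

# Weak duality: c^T x <= y^T b for any feasible pair.
assert primal_value(x) <= dual_value(y)
# Equal objective values certify that both solutions are optimal (strong duality).
print(primal_value(x), dual_value(y))  # 10 10
```

Equal primal and dual objective values act as an optimality certificate without running any algorithm.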
1.4 Complementary Slackness
Theorem 3. Let x and y be feasible solutions for the primal and dual problems. They are optimal if and only if:

y^T (b − Ax) = 0
(y^T A − c^T) x = 0
This means:
• If a primal constraint is not binding (A_i x < b_i), the corresponding dual variable is zero (y_i = 0).
• If a primal variable is positive (x_j > 0), the corresponding dual constraint is binding ((y^T A)_j = c_j).
• If a dual variable is positive (y_i > 0), the corresponding primal constraint is binding (A_i x = b_i).
• If a dual constraint is not binding ((y^T A)_j > c_j), the corresponding primal variable is zero (x_j = 0).
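The two conditions can be checked mechanically. The sketch below verifies them for a toy LP with optimal solutions picked for this note (not from the text):

```python
# Complementary slackness check for a toy LP (data chosen here):
#   max 3x1 + 2x2  s.t. x1 + x2 <= 4, x1 <= 2, x >= 0
c = [3, 2]
A = [[1, 1], [1, 0]]
b = [4, 2]
x = [2, 2]   # optimal primal solution
y = [2, 1]   # optimal dual solution

# y^T (b - Ax) = 0: each y_i * (slack of constraint i) must vanish.
slacks = [bi - sum(aij * xj for aij, xj in zip(row, x)) for row, bi in zip(A, b)]
assert all(abs(yi * si) < 1e-9 for yi, si in zip(y, slacks))

# (y^T A - c^T) x = 0: each x_j * (dual slack of column j) must vanish.
dual_slacks = [sum(yi * A[i][j] for i, yi in enumerate(y)) - c[j] for j in range(len(c))]
assert all(abs(xj * dj) < 1e-9 for xj, dj in zip(x, dual_slacks))
print("complementary slackness holds")
```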
2 Sensitivity Analysis
2.1 Change in Objective Function Coefficient (c_j)
Objective: Find the range for c_j (coefficient of variable x_j) such that the current optimal basis remains optimal. Method:
1. Let the change be ∆. The new coefficient is c_j + ∆.
2. If x_j is non-basic: the reduced cost must remain non-positive (for max problems). The new reduced cost is c̄'_j = c̄_j + ∆; require c̄_j + ∆ ≤ 0.
3. If x_j is basic (say x_j = x_{B_k}): the reduced costs of all non-basic variables must remain non-positive. Compute the new reduced costs c̄'_N using the modified c_B; this amounts to finding the change in c_B^T B^{-1} and applying it to the original reduced costs: c̄'_N^T = c_N^T − (c_B + ∆e_k)^T B^{-1} N ≤ 0.
4. Solve the resulting inequalities for ∆.
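For the basic-variable case, the formula c̄'_N = c̄_N − ∆·(row k of B^{-1}N) gives one bound per non-basic column. A minimal sketch with hypothetical tableau data (the row and reduced costs below are invented for illustration):

```python
# Range of delta keeping optimality when c_j changes for a BASIC x_Bk (max problem).
row_k = [0.5, -1.0, 2.0]       # row k of B^{-1} N, hypothetical
cbar_N = [-2.0, -3.0, -1.0]    # current reduced costs (all <= 0 at optimum), hypothetical

# New reduced costs: cbar'_j = cbar_j - delta * (B^{-1}N)_{kj}; require <= 0.
lo, hi = float("-inf"), float("inf")
for a, cb in zip(row_k, cbar_N):
    if a > 0:                  # cbar_j - delta*a <= 0  =>  delta >= cbar_j / a
        lo = max(lo, cb / a)
    elif a < 0:                # dividing by a < 0 flips the sign: delta <= cbar_j / a
        hi = min(hi, cb / a)
print(lo, hi)                  # -0.5 3.0: the basis stays optimal for delta in [-0.5, 3]
```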
2.2 Change in Right-Hand Side (b_i)
Objective: Find the range for b_i such that the current basis remains feasible (and hence optimal). Method:
1. Let the change be ∆. The new right-hand side is b + ∆e_i.
2. The new values of the basic variables are x'_B = B^{-1}(b + ∆e_i) = B^{-1}b + ∆(B^{-1}e_i) = x_B + ∆(B^{-1}e_i).
3. Require x'_B ≥ 0. This gives inequalities involving ∆.
4. Solve these inequalities for ∆. (Note: B^{-1}e_i is the column corresponding to the slack variable of constraint i in the final simplex tableau.)
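The componentwise condition x_B + ∆(B^{-1}e_i) ≥ 0 again yields one bound per basic variable. A sketch with hypothetical tableau data:

```python
# Range of delta for b_i keeping the basis feasible (steps 3-4), hypothetical data.
xB = [2.0, 1.0, 4.0]           # current basic values B^{-1} b, hypothetical
col_i = [1.0, -0.5, 0.0]       # B^{-1} e_i (slack-i column of final tableau), hypothetical

# Require xB + delta * col_i >= 0 componentwise.
lo, hi = float("-inf"), float("inf")
for xb, d in zip(xB, col_i):
    if d > 0:                  # delta >= -xb / d
        lo = max(lo, -xb / d)
    elif d < 0:                # delta <= -xb / d
        hi = min(hi, -xb / d)
print(lo, hi)                  # -2.0 2.0: b_i may move by at most 2 in either direction
```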
3 Simplex Algorithm
3.1 Standard Form
A linear program is in standard form if:
max Z = c^T x
s.t. Ax = b
     x ≥ 0
where b ≥ 0. Problems with ≤ constraints are converted using slack variables; ≥ constraints require surplus variables (and artificial variables to provide an initial basis).
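The conversion can be written out explicitly. Below is a sketch on a toy pair of constraints chosen for this note:

```python
# Converting one <= row and one >= row to standard form (toy data chosen here).
# Original:  x1 + 2x2 <= 8   and   3x1 + x2 >= 5,  x >= 0
# Add slack s1 to the first row; surplus s2 and (for Phase I / Big-M) artificial a1
# to the second:
#   x1 + 2x2 + s1           = 8
#   3x1 + x2      - s2 + a1 = 5
A_std = [
    [1, 2, 1, 0, 0],   # columns: x1, x2, s1, s2, a1
    [3, 1, 0, -1, 1],
]
b_std = [8, 5]
print(len(A_std[0]))   # 5 variables after conversion
```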
4. Leaving Variable: Calculate the ratios θ_i = (B^{-1}b)_i / (B^{-1}A_k)_i for all i with (B^{-1}A_k)_i > 0. Choose the row r achieving the minimum ratio θ_min; the basic variable x_{B_r} in row r leaves the basis. If all (B^{-1}A_k)_i ≤ 0, the problem is unbounded. STOP.
5. Pivot: Perform row operations to make the pivot element (B^{-1}A_k)_r equal to 1 and all other elements in the pivot column equal to 0. This updates the tableau (B^{-1}, B^{-1}b, c̄^T, Z). Go to Step 2.
• Unbounded Solution: If, during the ratio test (Step 4), all coefficients in the pivot column (B^{-1}A_k) of the entering variable x_k are ≤ 0, the objective function can be increased indefinitely.
• No Feasible Solution: If Phase I (or Big M) ends with an artificial variable still in the basis with a positive value, the original problem has no feasible solution.
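The ratio test of Step 4 can be sketched in a few lines; the right-hand side and pivot column below are hypothetical tableau data:

```python
# Minimum ratio test (Step 4) on a hypothetical pivot column.
rhs = [6.0, 3.0, 4.0]          # B^{-1} b, hypothetical
col = [2.0, -1.0, 4.0]         # B^{-1} A_k for the entering x_k, hypothetical

ratios = [(rhs[i] / col[i], i) for i in range(len(rhs)) if col[i] > 0]
if not ratios:
    print("unbounded")         # all entries <= 0: objective unbounded
else:
    theta, r = min(ratios)     # theta_min and the leaving row r
    print(theta, r)            # 1.0 2
```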
3.5 Reduced Costs
The coefficients of the non-basic variables in the objective function row of the simplex tableau are the
reduced costs (or marginal costs). At optimality for a max problem, all reduced costs are ≤ 0.
max Z = c^T x
s.t. Ax ≤ b
     x ≥ 0, some or all x_j ∈ Z

The feasible region is a set of discrete points, not a continuous polyhedron. The optimal solution to the LP relaxation (dropping integrality) is generally not the same as the ILP optimal solution.
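A tiny instance, chosen here, makes the gap concrete: the relaxation's optimal value strictly exceeds the best integer value.

```python
from itertools import product

# max x1 + x2  s.t. 2*x1 + 2*x2 <= 3,  x in {0,1}^2  (toy instance chosen here)
feasible_int = [x for x in product(range(2), repeat=2) if 2 * x[0] + 2 * x[1] <= 3]
z_ilp = max(x[0] + x[1] for x in feasible_int)   # brute-force ILP optimum

# The LP relaxation (0 <= x <= 1) attains 2(x1 + x2) = 3 at a fractional vertex,
# e.g. x = (1, 0.5), so its optimum is 1.5.
z_lp = 1.5
print(z_ilp, z_lp)   # the relaxation bound 1.5 strictly exceeds the ILP optimum 1
```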
3. Bound Computation (Step 2): Solve the LP relaxation of S^i. Let the optimal value be z^i and the solution x^i.
4. Pruning (Step 3): Prune node S^i if its relaxation is infeasible (infeasibility), if the bound z^i is no better than the incumbent (bound), or if x^i is integer feasible (optimality; update the incumbent).
For u ≥ 0, the Chvátal-Gomory (C-G) inequality is

Σ_j ⌊u^T a_j⌋ x_j ≤ ⌊u^T b⌋

where a_j is the j-th column of A, and ⌊·⌋ denotes the floor function (rounding down). This inequality is valid for S because x_j ≥ 0 and integer.
Theorem 4. Any valid inequality for S can be derived by applying the C-G procedure a finite number of
times.
6.4 Gomory’s Fractional Cut (Derived from Simplex Tableau)
A specific way to generate a C-G cut directly from an optimal LP relaxation tableau where a basic variable x_{B_i} is fractional.
1. Consider the tableau row for x_{B_i}: x_{B_i} + Σ_{j∈N} ā_ij x_j = b̄_i, where N is the set of non-basic variables and b̄_i is fractional.
2. Rewrite using the floor function: ā_ij = ⌊ā_ij⌋ + f_ij with fractional part f_ij = ā_ij − ⌊ā_ij⌋ ≥ 0, and similarly f_i = b̄_i − ⌊b̄_i⌋ > 0:

   x_{B_i} + Σ_{j∈N} (⌊ā_ij⌋ + f_ij) x_j = ⌊b̄_i⌋ + f_i

3. Rearrange:

   Σ_{j∈N} f_ij x_j − f_i = ⌊b̄_i⌋ − x_{B_i} − Σ_{j∈N} ⌊ā_ij⌋ x_j

4. The RHS must be an integer for any feasible integer solution. Therefore, the LHS must also be an integer.
5. Since x_j ≥ 0 and f_ij ≥ 0, we have Σ_{j∈N} f_ij x_j ≥ 0.
6. Also, 0 < f_i < 1. So the smallest integer value the LHS can take is 0.
7. This implies Σ_{j∈N} f_ij x_j − f_i ≥ −f_i > −1. Since the LHS must be an integer, it must be ≥ 0.
8. Gomory’s Fractional Cut:

   Σ_{j∈N} f_ij x_j ≥ f_i

This cut is violated by the current LP solution (where x_j = 0 for j ∈ N, making the LHS 0 while f_i > 0) but holds for all integer solutions.
Note: Can be slow to converge. Adding cuts a priori can strengthen the formulation but may make the
LP large.
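Generating the cut from a tableau row is a one-liner with fractional parts; the row below is hypothetical:

```python
import math

# Gomory fractional cut from a hypothetical optimal tableau row:
#   x_B + 2.5*x3 - 0.75*x4 = 3.25   (x3, x4 non-basic)
a_bar = {"x3": 2.5, "x4": -0.75}
b_bar = 3.25

frac = lambda v: v - math.floor(v)          # fractional part, always in [0, 1)
f = {j: frac(a) for j, a in a_bar.items()}  # f_ij (note floor handles negatives)
fi = frac(b_bar)                            # f_i

# Cut: sum_j f_ij * x_j >= f_i, i.e. 0.5*x3 + 0.25*x4 >= 0.25 here.
# The current LP solution (x3 = x4 = 0) violates it, since 0 < f_i.
assert 0 < fi
print(f, fi)
```

Note how ⌊−0.75⌋ = −1 gives the fractional part 0.25, so negative tableau entries still yield non-negative cut coefficients.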
• Pruning rules (optimality, bound, infeasibility) are the same as B&B.
B&C is often more effective than pure B&B or pure Cutting Planes, especially for hard combinatorial
problems like the Traveling Salesman Problem (TSP).
min c^T x
s.t. Ax = b
     x ≥ 0
Solving this efficiently depends on the structure of the original problem (e.g., shortest path problem
in cutting stock or vehicle routing).
4. Check Optimality: If the minimum reduced cost found in Step 3 is non-negative (min c̄_k ≥ 0), then no non-basic variable can improve the current RMP solution. The current solution x*_R (extended with zeros for variables not in the RMP) is optimal for the original MP. STOP.
5. Add Column(s): If min c̄_k < 0, add the column A_k (corresponding to the variable x_k with the most negative reduced cost) to the RMP. Optionally, add multiple columns with negative reduced costs.
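The loop over these steps can be sketched as follows; `solve_rmp` and `price_out` are placeholder callbacks (they would wrap an LP solver and the pricing subproblem, and are not defined in the text):

```python
# Column-generation loop skeleton (callback names are hypothetical placeholders).
def column_generation(rmp_columns, solve_rmp, price_out, tol=1e-9):
    while True:
        x_R, duals = solve_rmp(rmp_columns)       # step 2: solve the RMP, get duals
        cbar_min, new_col = price_out(duals)      # step 3: pricing subproblem
        if cbar_min >= -tol:                      # step 4: no improving column left
            return x_R                            # optimal for the full MP
        rmp_columns.append(new_col)               # step 5: add the priced-out column
```

The tolerance guards against declaring a column improving due to floating-point noise.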
8.4 Dantzig-Wolfe (DW) Decomposition
A specific reformulation technique that leads to a column generation approach. It applies when the
constraint matrix has a special block-angular structure:
min c^T x
s.t. A^0 x = b^0   (Linking/Complicating Constraints)
     A^I x = b^I   (Easy/Block Constraints)
     x ≥ 0
Any point of the bounded set X^I = {x ≥ 0 | A^I x = b^I} can be written as a convex combination x = Σ_i u_i x^(i) of its extreme points x^(i). Substituting this into the original problem gives the DW Master Problem (variables are u_i):
min Σ_i (c^T x^(i)) u_i
s.t. Σ_i (A^0 x^(i)) u_i = b^0   (Dual vars: π)
     Σ_i u_i = 1                 (Dual var: ν)
     u_i ≥ 0
This MP usually has far too many columns (one for each extreme point x^(i)). CG is used to solve it.
• RMP: The DW Master Problem restricted to a subset of known extreme points x^(i).
• Subproblem (Pricing Problem): Given duals (π, ν) from the RMP, find an extreme point x* of X^I = {x ≥ 0 | A^I x = b^I} that minimizes the reduced cost. The reduced cost for a column corresponding to x^(i) is (c^T x^(i)) − π^T (A^0 x^(i)) − ν. The subproblem is:

min (c^T − π^T A^0) x − ν
s.t. A^I x = b^I, x ≥ 0

This involves optimizing a modified objective function over the 'easy' constraints A^I x = b^I, x ≥ 0. If the optimal value is < 0, the corresponding x* generates a column with negative reduced cost to add to the RMP.
For block-diagonal structures (multiple independent blocks linked by common constraints), DW decomposition leads to multiple independent subproblems, one for each block.
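The reduced-cost formula above can be evaluated directly; the numbers below are hypothetical (one linking constraint, two variables):

```python
# Reduced cost of a DW column for extreme point x:
#   cbar = c^T x - pi^T (A0 x) - nu
def dw_reduced_cost(c, A0, pi, nu, x):
    cx = sum(ci * xi for ci, xi in zip(c, x))
    A0x = [sum(aij * xi for aij, xi in zip(row, x)) for row in A0]
    return cx - sum(p * v for p, v in zip(pi, A0x)) - nu

# Hypothetical data: c^T x = 8, A0 x = [3], so cbar = 8 - 3 - 0.5 = 4.5.
print(dw_reduced_cost(c=[3, 2], A0=[[1, 1]], pi=[1.0], nu=0.5, x=[2, 1]))
```

Here the reduced cost is positive, so this extreme point would not be added to the RMP.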