Complementary Slackness Conditions in Linear Programming

Author

Your Name

Published

January 30, 2025

Introduction

This lecture explores the concept of **complementary slackness conditions** within the framework of linear programming. These conditions serve as a powerful tool for assessing the optimality of a pair of primal and dual solutions. We will examine the underlying rationale for these conditions, formally present the complementary slackness theorem, and furnish a comprehensive proof. Additionally, we will analyze these conditions from a geometric standpoint and provide a practical example to demonstrate their application. A thorough understanding of complementary slackness is essential for appreciating the relationship between primal and dual linear programs and for confirming the optimality of solutions without solving the problem from scratch.

Complementary Slackness Conditions

Motivation and Definition

The concept of complementary slackness stems from the strong duality theorem in linear programming. This theorem asserts that if both the primal and dual problems possess feasible solutions, then both have optimal solutions, and their optimal objective values coincide. Complementary slackness conditions establish a set of necessary and sufficient conditions for the optimality of primal and dual solutions. They effectively link the variables of one problem to the constraints of the other, providing valuable insights into the structure of optimal solutions.

Complementary slackness conditions are a set of conditions that must be satisfied by optimal primal and dual solutions in linear programming. These conditions relate the primal variables to the dual constraints and the dual variables to the primal constraints, specifically focusing on the "slack" or surplus in these relationships at optimality.

Primal and Dual Linear Programs

Canonical Form

Consider a primal linear program in canonical form: \[\begin{aligned} \text{Maximize } & \mathbf{c}^T \mathbf{x} \\ \text{subject to } & A\mathbf{x} \leq \mathbf{b} \\ & \mathbf{x} \geq \mathbf{0} \end{aligned}\] where \(\mathbf{x} \in \mathbb{R}^n\), \(\mathbf{c} \in \mathbb{R}^n\), \(\mathbf{b} \in \mathbb{R}^m\), and \(A \in \mathbb{R}^{m \times n}\).

The corresponding dual linear program in canonical form is: \[\begin{aligned} \text{Minimize } & \mathbf{b}^T \mathbf{u} \\ \text{subject to } & A^T\mathbf{u} \geq \mathbf{c} \\ & \mathbf{u} \geq \mathbf{0} \end{aligned}\] where \(\mathbf{u} \in \mathbb{R}^m\).
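To make the primal-dual correspondence concrete, here is a small Python sketch (the helper names `transpose` and `make_dual` are ours, purely illustrative) that builds the dual data from primal data: the constraint matrix is transposed, and \(\mathbf{b}\) and \(\mathbf{c}\) swap roles.

```python
# Sketch: form the dual of a canonical-form primal (max c^T x, Ax <= b, x >= 0).
# The dual is: min b^T u subject to A^T u >= c, u >= 0.

def transpose(A):
    """Return the transpose of a matrix given as a list of rows."""
    return [list(col) for col in zip(*A)]

def make_dual(A, b, c):
    """Given primal data (A, b, c), return (dual matrix, dual RHS, dual objective)."""
    return transpose(A), c, b  # A^T; c becomes the RHS, b becomes the objective

# The example LP used later in this lecture:
A = [[1, 2, -2],
     [2, -1, -3],
     [1, 1, 5]]
b = [1, 4, 2]
c = [1, -2, 3]

A_dual, rhs, obj = make_dual(A, b, c)
print(A_dual)  # columns of A become rows: [[1, 2, 1], [2, -1, 1], [-2, -3, 5]]
```

Each row of `A_dual` is one dual constraint, matching the dual program derived for the example later in this lecture.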

Statement of the Complementary Slackness Theorem

Let \(\mathbf{x}^* \in \mathbb{R}^n\) be a primal feasible solution and \(\mathbf{u}^* \in \mathbb{R}^m\) be a dual feasible solution for the primal and dual linear programs defined above. Then, \(\mathbf{x}^*\) is an optimal solution to the primal problem and \(\mathbf{u}^*\) is an optimal solution to the dual problem if and only if the following complementary slackness conditions are satisfied:

  1. Primal Feasibility: \(A\mathbf{x}^* \leq \mathbf{b}\) and \(\mathbf{x}^* \geq \mathbf{0}\).

  2. Dual Feasibility: \(A^T\mathbf{u}^* \geq \mathbf{c}\) and \(\mathbf{u}^* \geq \mathbf{0}\).

  3. Complementary Slackness Condition 1: For each dual variable \(u_i^*\), \(i = 1, \dots, m\): \[u_i^* (b_i - \sum_{j=1}^{n} a_{ij}x_j^*) = 0\]

  4. Complementary Slackness Condition 2: For each primal variable \(x_j^*\), \(j = 1, \dots, n\): \[x_j^* (\sum_{i=1}^{m} a_{ij}u_i^* - c_j) = 0\]
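The four conditions above can be checked mechanically for any candidate pair. The following sketch (the function name `verify_optimality` is ours, not a standard API) tests conditions 1-4 within a numerical tolerance:

```python
def verify_optimality(A, b, c, x, u, tol=1e-9):
    """Check conditions 1-4 for a candidate primal x and dual u.

    Returns True iff x and u are feasible and every complementary
    slackness product vanishes (up to tol).
    """
    m, n = len(A), len(A[0])
    # Primal slacks s_i = b_i - sum_j a_ij x_j   (condition 1 needs s_i >= 0)
    s = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
    # Dual slacks t_j = sum_i a_ij u_i - c_j     (condition 2 needs t_j >= 0)
    t = [sum(A[i][j] * u[i] for i in range(m)) - c[j] for j in range(n)]
    primal_feasible = all(si >= -tol for si in s) and all(xj >= -tol for xj in x)
    dual_feasible = all(tj >= -tol for tj in t) and all(ui >= -tol for ui in u)
    # Conditions 3 and 4: u_i * s_i = 0 and x_j * t_j = 0 for every i, j.
    cs1 = all(abs(u[i] * s[i]) <= tol for i in range(m))
    cs2 = all(abs(x[j] * t[j]) <= tol for j in range(n))
    return primal_feasible and dual_feasible and cs1 and cs2
```

For instance, the pair \(\mathbf{x}^*\), \(\mathbf{u}^*\) computed in the worked example later in this lecture passes all four checks.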

Primal Feasibility

The first condition simply states that \(\mathbf{x}^*\) must be a feasible solution for the primal linear program. This means that \(\mathbf{x}^*\) must satisfy all the constraints of the primal problem.

Dual Feasibility

The second condition states that \(\mathbf{u}^*\) must be a feasible solution for the dual linear program, satisfying all dual constraints.

Complementary Slackness Condition 1

The third condition relates each dual variable \(u_i^*\) to the \(i\)-th primal constraint. The term \((b_i - \sum_{j=1}^{n} a_{ij}x_j^*)\) represents the slack in the \(i\)-th primal constraint. This condition implies that for each \(i\), either \(u_i^* = 0\) or the \(i\)-th primal constraint is binding (i.e., \(b_i - \sum_{j=1}^{n} a_{ij}x_j^* = 0\)). In other words, if a dual variable is positive, then its corresponding primal constraint must be active (satisfied as an equality). Conversely, if a primal constraint is not binding (has positive slack), then its corresponding dual variable must be zero.

Complementary Slackness Condition 2

The fourth condition relates each primal variable \(x_j^*\) to the \(j\)-th dual constraint. The term \((\sum_{i=1}^{m} a_{ij}u_i^* - c_j)\) represents the slack in the \(j\)-th dual constraint. This condition implies that for each \(j\), either \(x_j^* = 0\) or the \(j\)-th dual constraint is binding (i.e., \(\sum_{i=1}^{m} a_{ij}u_i^* - c_j = 0\)). If a primal variable is positive, then its corresponding dual constraint must be active. Conversely, if a dual constraint is not binding (has positive slack), then its corresponding primal variable must be zero.

Interpretation of Slack

Primal Slack

For the \(i\)-th primal constraint \(\sum_{j=1}^{n} a_{ij}x_j \leq b_i\), the primal slack \(s_i^p\) is defined as: \[s_i^p = b_i - \sum_{j=1}^{n} a_{ij}x_j\] For a feasible solution, \(s_i^p \geq 0\). A primal constraint is binding or active if \(s_i^p = 0\), and non-binding or inactive if \(s_i^p > 0\).

Dual Slack

For the \(j\)-th dual constraint \(\sum_{i=1}^{m} a_{ij}u_i \geq c_j\), the dual slack \(s_j^d\) is defined as: \[s_j^d = \sum_{i=1}^{m} a_{ij}u_i - c_j\] For a feasible solution, \(s_j^d \geq 0\). A dual constraint is binding or active if \(s_j^d = 0\), and non-binding or inactive if \(s_j^d > 0\).

Reformulation Using Slack Variables

Using these definitions of slack, the complementary slackness conditions 3 and 4 can be rewritten as:

  1. \(u_i^* s_i^p = 0\) for all \(i = 1, \dots, m\).

  2. \(x_j^* s_j^d = 0\) for all \(j = 1, \dots, n\).

Proof of the Complementary Slackness Theorem

Forward Implication: Optimality Implies Complementary Slackness

We will prove that if \(\mathbf{x}^*\) is an optimal primal solution and \(\mathbf{u}^*\) is an optimal dual solution, then conditions 1-4 hold.

Using Strong Duality

If \(\mathbf{x}^*\) and \(\mathbf{u}^*\) are optimal solutions for the primal and dual problems respectively, then by the strong duality theorem, we have: \[\mathbf{c}^T \mathbf{x}^* = \mathbf{b}^T \mathbf{u}^*\] Also, since \(\mathbf{u}^*\) is dual feasible, we know that \(A^T\mathbf{u}^* \geq \mathbf{c}\), which means \(\mathbf{c}^T \leq (\mathbf{u}^*)^T A\). Since \(\mathbf{x}^* \geq \mathbf{0}\), we have: \[\mathbf{c}^T \mathbf{x}^* \leq (\mathbf{u}^*)^T A \mathbf{x}^* = (\mathbf{u}^*)^T (A \mathbf{x}^*)\] Since \(\mathbf{x}^*\) is primal feasible (so \(A\mathbf{x}^* \leq \mathbf{b}\)) and \(\mathbf{u}^* \geq \mathbf{0}\), we have: \[(\mathbf{u}^*)^T (A \mathbf{x}^*) \leq (\mathbf{u}^*)^T \mathbf{b} = \mathbf{b}^T \mathbf{u}^*\] Combining these inequalities, we get: \[\mathbf{c}^T \mathbf{x}^* \leq (\mathbf{u}^*)^T A \mathbf{x}^* \leq \mathbf{b}^T \mathbf{u}^*\] Since strong duality states \(\mathbf{c}^T \mathbf{x}^* = \mathbf{b}^T \mathbf{u}^*\), both inequalities must hold as equalities: \[\mathbf{c}^T \mathbf{x}^* = (\mathbf{u}^*)^T A \mathbf{x}^* = \mathbf{b}^T \mathbf{u}^*\] These equalities are crucial for deriving conditions 3 and 4.

Derivation of Condition 3

From \((\mathbf{u}^*)^T (A \mathbf{x}^*) = \mathbf{b}^T \mathbf{u}^*\), we can rewrite it as: \[\mathbf{b}^T \mathbf{u}^* - (\mathbf{u}^*)^T (A \mathbf{x}^*) = 0\] \[(\mathbf{u}^*)^T \mathbf{b} - (\mathbf{u}^*)^T (A \mathbf{x}^*) = 0\] \[(\mathbf{u}^*)^T (\mathbf{b} - A \mathbf{x}^*) = 0\] This is the dot product of two vectors \(\mathbf{u}^*\) and \((\mathbf{b} - A \mathbf{x}^*)\). In component form, this is: \[\sum_{i=1}^{m} u_i^* (b_i - \sum_{j=1}^{n} a_{ij}x_j^*) = 0\] Since \(\mathbf{u}^* \geq \mathbf{0}\) and \((\mathbf{b} - A \mathbf{x}^*) \geq \mathbf{0}\) (due to primal feasibility), each term in the sum is non-negative. For their sum to be zero, each term must be zero. Therefore, for each \(i = 1, \dots, m\): \[u_i^* (b_i - \sum_{j=1}^{n} a_{ij}x_j^*) = 0\] This is Complementary Slackness Condition 1.

Derivation of Condition 4

From \(\mathbf{c}^T \mathbf{x}^* = (\mathbf{u}^*)^T A \mathbf{x}^*\), we can rewrite it as: \[(\mathbf{u}^*)^T A \mathbf{x}^* - \mathbf{c}^T \mathbf{x}^* = 0\] \[((\mathbf{u}^*)^T A - \mathbf{c}^T) \mathbf{x}^* = 0\] \[(A^T \mathbf{u}^* - \mathbf{c})^T \mathbf{x}^* = 0\] This is the dot product of two vectors \((A^T \mathbf{u}^* - \mathbf{c})\) and \(\mathbf{x}^*\). In component form: \[\sum_{j=1}^{n} (\sum_{i=1}^{m} a_{ij}u_i^* - c_j) x_j^* = 0\] Since \(\mathbf{x}^* \geq \mathbf{0}\) and \((A^T \mathbf{u}^* - \mathbf{c}) \geq \mathbf{0}\) (due to dual feasibility), each term in the sum is non-negative. For their sum to be zero, each term must be zero. Therefore, for each \(j = 1, \dots, n\): \[x_j^* (\sum_{i=1}^{m} a_{ij}u_i^* - c_j) = 0\] This is Complementary Slackness Condition 2. Conditions 1 and 2 (Primal and Dual Feasibility) are already given by the assumption that \(\mathbf{x}^*\) and \(\mathbf{u}^*\) are feasible solutions. Thus, we have shown that optimality implies complementary slackness conditions 1-4.

Reverse Implication: Complementary Slackness Implies Optimality

We now prove that if conditions 1-4 hold for a primal feasible solution \(\mathbf{x}^*\) and a dual feasible solution \(\mathbf{u}^*\), then \(\mathbf{x}^*\) and \(\mathbf{u}^*\) are optimal solutions for their respective problems.

Using Weak Duality

We are given that conditions 1-4 hold. From condition 3, we have \(\sum_{i=1}^{m} u_i^* (b_i - \sum_{j=1}^{n} a_{ij}x_j^*) = 0\), which expands to: \[\sum_{i=1}^{m} u_i^* b_i - \sum_{i=1}^{m} \sum_{j=1}^{n} u_i^* a_{ij}x_j^* = 0\] \[\mathbf{b}^T \mathbf{u}^* - (\mathbf{u}^*)^T A \mathbf{x}^* = 0\] \[\mathbf{b}^T \mathbf{u}^* = (\mathbf{u}^*)^T A \mathbf{x}^*\] From condition 4, we have \(\sum_{j=1}^{n} x_j^* (\sum_{i=1}^{m} a_{ij}u_i^* - c_j) = 0\), which expands to: \[\sum_{j=1}^{n} \sum_{i=1}^{m} x_j^* a_{ij}u_i^* - \sum_{j=1}^{n} x_j^* c_j = 0\] \[(A^T \mathbf{u}^*)^T \mathbf{x}^* - \mathbf{c}^T \mathbf{x}^* = 0\] \[(\mathbf{u}^*)^T A \mathbf{x}^* - \mathbf{c}^T \mathbf{x}^* = 0\] \[(\mathbf{u}^*)^T A \mathbf{x}^* = \mathbf{c}^T \mathbf{x}^*\] Combining these two results, we get: \[\mathbf{c}^T \mathbf{x}^* = (\mathbf{u}^*)^T A \mathbf{x}^* = \mathbf{b}^T \mathbf{u}^*\] Thus, \(\mathbf{c}^T \mathbf{x}^* = \mathbf{b}^T \mathbf{u}^*\). Since \(\mathbf{x}^*\) is primal feasible and \(\mathbf{u}^*\) is dual feasible, and their objective values are equal, by weak duality (and in fact, strong duality), \(\mathbf{x}^*\) must be optimal for the primal problem and \(\mathbf{u}^*\) must be optimal for the dual problem. This completes the proof of the reverse implication.

Geometric Interpretation and Orthogonality

Slack Vectors

Definition of Primal Slack Vector

The primal slack vector \(\mathbf{s}^p \in \mathbb{R}^m\) for a primal feasible solution \(\mathbf{x}\) is defined as: \[\mathbf{s}^p = \mathbf{b} - A\mathbf{x}\] The \(i\)-th component of \(\mathbf{s}^p\) is \(s_i^p = b_i - \sum_{j=1}^{n} a_{ij}x_j\), representing the slack in the \(i\)-th primal constraint. Since \(A\mathbf{x} \leq \mathbf{b}\), we have \(\mathbf{s}^p \geq \mathbf{0}\).

Definition of Dual Slack Vector

The dual slack vector \(\mathbf{s}^d \in \mathbb{R}^n\) for a dual feasible solution \(\mathbf{u}\) is defined as: \[\mathbf{s}^d = A^T\mathbf{u} - \mathbf{c}\] The \(j\)-th component of \(\mathbf{s}^d\) is \(s_j^d = \sum_{i=1}^{m} a_{ij}u_i - c_j\), representing the slack in the \(j\)-th dual constraint. Since \(A^T\mathbf{u} \geq \mathbf{c}\), we have \(\mathbf{s}^d \geq \mathbf{0}\).

Orthogonality

The complementary slackness conditions can be interpreted in terms of orthogonality between certain vectors.

Orthogonality between Primal Variables and Dual Slack

Complementary slackness condition 2, \(x_j^* s_j^d = 0\) for all \(j\), can be written in vector form as: \[(\mathbf{x}^*)^T \mathbf{s}^d = 0\] This means that the vector of optimal primal variables \(\mathbf{x}^*\) is orthogonal to the vector of dual slacks \(\mathbf{s}^d\). Since both \(\mathbf{x}^* \geq \mathbf{0}\) and \(\mathbf{s}^d \geq \mathbf{0}\), this orthogonality implies that for each component \(j\), at least one of \(x_j^*\) or \(s_j^d\) must be zero.

Orthogonality between Dual Variables and Primal Slack

Complementary slackness condition 1, \(u_i^* s_i^p = 0\) for all \(i\), can be written in vector form as: \[(\mathbf{u}^*)^T \mathbf{s}^p = 0\] This means that the vector of optimal dual variables \(\mathbf{u}^*\) is orthogonal to the vector of primal slacks \(\mathbf{s}^p\). Similarly, since both \(\mathbf{u}^* \geq \mathbf{0}\) and \(\mathbf{s}^p \geq \mathbf{0}\), for each component \(i\), at least one of \(u_i^*\) or \(s_i^p\) must be zero.
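Both orthogonality relations amount to zero dot products between nonnegative vectors. A quick numeric sketch, using the optimal pair from the worked example later in this lecture (the slack values \(13/7\) and \(23/7\) are computed from that example's data):

```python
from fractions import Fraction as F

# Optimal pair from the worked example later in this lecture.
x_star = [F(9, 7), F(0), F(1, 7)]
u_star = [F(2, 7), F(0), F(5, 7)]
# Slack vectors at optimality: s_p = b - A x*, s_d = A^T u* - c.
s_p = [F(0), F(13, 7), F(0)]   # only primal constraint 2 is loose
s_d = [F(0), F(23, 7), F(0)]   # only dual constraint 2 is loose

def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

print(dot(u_star, s_p), dot(x_star, s_d))  # 0 0: both orthogonality relations hold
```

Note how the zero pattern is complementary: wherever a slack is positive, the matching variable is zero, and vice versa.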

Geometric Interpretation

This orthogonality provides a geometric perspective on complementary slackness, highlighting a perpendicular relationship between optimal variables and slacks across the primal and dual problems. In essence, the optimal primal variables are perpendicular to the dual slack vector, and the optimal dual variables are perpendicular to the primal slack vector. This geometric interpretation underscores the interconnectedness between the primal and dual problems at optimality.

Example: Verifying Optimality Using Complementary Slackness

Problem Statement: Primal LP

Consider the following primal linear program: \[\begin{aligned} \text{Maximize } & x_1 - 2x_2 + 3x_3 \\ \text{subject to } & x_1 + 2x_2 - 2x_3 \leq 1 \\ & 2x_1 - x_2 - 3x_3 \leq 4 \\ & x_1 + x_2 + 5x_3 \leq 2 \\ & x_1, x_2, x_3 \geq 0 \end{aligned}\]

Proposed Primal Solution

Suppose we are given a proposed optimal solution for the primal problem: \(\mathbf{x}^* = \left( \frac{9}{7}, 0, \frac{1}{7} \right)^T\). We want to verify if this solution is indeed optimal using complementary slackness conditions.

Verification of Primal Feasibility

First, we check if \(\mathbf{x}^*\) is primal feasible:

  1. \(x_1^* + 2x_2^* - 2x_3^* = \frac{9}{7} + 2(0) - 2\left(\frac{1}{7}\right) = \frac{9-2}{7} = \frac{7}{7} = 1 \leq 1\) (Feasible)

  2. \(2x_1^* - x_2^* - 3x_3^* = 2\left(\frac{9}{7}\right) - 0 - 3\left(\frac{1}{7}\right) = \frac{18-3}{7} = \frac{15}{7} \leq 4 = \frac{28}{7}\) (Feasible)

  3. \(x_1^* + x_2^* + 5x_3^* = \frac{9}{7} + 0 + 5\left(\frac{1}{7}\right) = \frac{9+5}{7} = \frac{14}{7} = 2 \leq 2\) (Feasible)

  4. \(x_1^* = \frac{9}{7} \geq 0, x_2^* = 0 \geq 0, x_3^* = \frac{1}{7} \geq 0\) (Feasible)

Thus, \(\mathbf{x}^*\) is primal feasible.
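These feasibility checks can be reproduced exactly with rational arithmetic; a short sketch:

```python
from fractions import Fraction as F

A = [[1, 2, -2], [2, -1, -3], [1, 1, 5]]
b = [1, 4, 2]
x = [F(9, 7), F(0), F(1, 7)]

# Left-hand side of each primal constraint evaluated at x*.
lhs = [sum(a * xj for a, xj in zip(row, x)) for row in A]
print([str(v) for v in lhs])  # ['1', '15/7', '2'] -> constraints 1 and 3 are binding
assert all(v <= bi for v, bi in zip(lhs, b)) and all(xj >= 0 for xj in x)
```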

Derivation of the Dual LP

The dual linear program is: \[\begin{aligned} \text{Minimize } & u_1 + 4u_2 + 2u_3 \\ \text{subject to } & u_1 + 2u_2 + u_3 \geq 1 \\ & 2u_1 - u_2 + u_3 \geq -2 \\ & -2u_1 - 3u_2 + 5u_3 \geq 3 \\ & u_1, u_2, u_3 \geq 0 \end{aligned}\]

Applying Complementary Slackness Conditions

Identifying Active Dual Constraints

Since \(x_1^* = \frac{9}{7} > 0\) and \(x_3^* = \frac{1}{7} > 0\), by complementary slackness condition 4, the first and third dual constraints must be binding: \[\begin{aligned} u_1 + 2u_2 + u_3 &= 1 \\ -2u_1 - 3u_2 + 5u_3 &= 3 \end{aligned}\] Since \(x_2^* = 0\), the second dual constraint may have slack, i.e., \(2u_1 - u_2 + u_3 \geq -2\).

Identifying Active Dual Variables

For the primal constraints, we calculate the slacks for \(\mathbf{x}^*\):

  1. \(s_1^p = 1 - (x_1^* + 2x_2^* - 2x_3^*) = 1 - 1 = 0\) (Binding)

  2. \(s_2^p = 4 - (2x_1^* - x_2^* - 3x_3^*) = 4 - \frac{15}{7} = \frac{28-15}{7} = \frac{13}{7} > 0\) (Non-binding)

  3. \(s_3^p = 2 - (x_1^* + x_2^* + 5x_3^*) = 2 - 2 = 0\) (Binding)

Since \(s_2^p > 0\), by complementary slackness condition 3, the corresponding dual variable \(u_2^*\) must be zero: \(u_2^* = 0\).

Solving for the Dual Solution

Substituting \(u_2^* = 0\) into the binding dual constraint equations: \[\begin{aligned} u_1 + u_3 &= 1 \\ -2u_1 + 5u_3 &= 3 \end{aligned}\] From the first equation, \(u_1 = 1 - u_3\). Substituting into the second equation: \[-2(1 - u_3) + 5u_3 = 3\] \[-2 + 2u_3 + 5u_3 = 3\] \[7u_3 = 5\] \[u_3^* = \frac{5}{7}\] Then, \(u_1^* = 1 - u_3^* = 1 - \frac{5}{7} = \frac{2}{7}\). So, the proposed dual solution is \(\mathbf{u}^* = \left( \frac{2}{7}, 0, \frac{5}{7} \right)^T\).
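The 2×2 substitution can be double-checked with exact rational arithmetic; a quick sketch using Cramer's rule:

```python
from fractions import Fraction as F

# Binding dual constraints with u2 = 0:  u1 + u3 = 1  and  -2u1 + 5u3 = 3.
a11, a12, b1 = 1, 1, 1
a21, a22, b2 = -2, 5, 3

det = a11 * a22 - a12 * a21          # 1*5 - 1*(-2) = 7
u1 = F(b1 * a22 - a12 * b2, det)     # Cramer's rule for the first unknown
u3 = F(a11 * b2 - b1 * a21, det)     # and for the second
print(u1, u3)  # 2/7 5/7
```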

Verification of Dual Feasibility and Optimality

First, check dual feasibility of \(\mathbf{u}^*\):

  1. \(u_1^* + 2u_2^* + u_3^* = \frac{2}{7} + 2(0) + \frac{5}{7} = \frac{7}{7} = 1 \geq 1\) (Feasible)

  2. \(2u_1^* - u_2^* + u_3^* = 2\left(\frac{2}{7}\right) - 0 + \frac{5}{7} = \frac{4+5}{7} = \frac{9}{7} \geq -2\) (Feasible)

  3. \(-2u_1^* - 3u_2^* + 5u_3^* = -2\left(\frac{2}{7}\right) - 3(0) + 5\left(\frac{5}{7}\right) = \frac{-4+25}{7} = \frac{21}{7} = 3 \geq 3\) (Feasible)

  4. \(u_1^* = \frac{2}{7} \geq 0, u_2^* = 0 \geq 0, u_3^* = \frac{5}{7} \geq 0\) (Feasible)

Thus, \(\mathbf{u}^*\) is dual feasible.

Finally, we check if the objective values are equal: Primal objective value: \(f(\mathbf{x}^*) = x_1^* - 2x_2^* + 3x_3^* = \frac{9}{7} - 2(0) + 3\left(\frac{1}{7}\right) = \frac{9+3}{7} = \frac{12}{7}\). Dual objective value: \(g(\mathbf{u}^*) = u_1^* + 4u_2^* + 2u_3^* = \frac{2}{7} + 4(0) + 2\left(\frac{5}{7}\right) = \frac{2+10}{7} = \frac{12}{7}\). Since the primal and dual objective values are equal and both \(\mathbf{x}^*\) and \(\mathbf{u}^*\) are feasible, by strong duality (or complementary slackness theorem), \(\mathbf{x}^*\) is indeed an optimal solution for the primal problem, and \(\mathbf{u}^*\) is an optimal solution for the dual problem.
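Putting it all together, a short sketch confirming that both objective values equal \(\frac{12}{7}\) exactly:

```python
from fractions import Fraction as F

c = [1, -2, 3]
b = [1, 4, 2]
x = [F(9, 7), F(0), F(1, 7)]
u = [F(2, 7), F(0), F(5, 7)]

primal_obj = sum(cj * xj for cj, xj in zip(c, x))  # c^T x*
dual_obj = sum(bi * ui for bi, ui in zip(b, u))    # b^T u*
print(primal_obj, dual_obj)  # 12/7 12/7
assert primal_obj == dual_obj == F(12, 7)
```

Equal objective values for a feasible pair certify optimality of both solutions, exactly as the theorem guarantees.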

Conclusion

In this lecture, we have delved into the complementary slackness conditions, a potent tool for verifying the optimality of solutions in linear programming. We formally stated and proved the complementary slackness theorem, emphasizing its necessary and sufficient nature for optimality. We interpreted these conditions geometrically through the lens of orthogonality between primal variables and dual slacks, as well as between dual variables and primal slacks. Finally, via a detailed example, we illustrated how to employ complementary slackness to validate the optimality of a proposed primal solution and to deduce the corresponding optimal dual solution without directly solving the LP problems. Complementary slackness offers valuable insights into the relationships between primal and dual problems and constitutes a fundamental concept in linear programming theory and its applications. Further exploration could involve applying these conditions in the design of algorithms and in sensitivity analysis within the realm of optimization problems.