Lecture Notes on Duality in Linear Programming: Rules and Example
Introduction
This lecture focuses on deriving the dual of a linear programming problem. We will establish a set of rules for efficiently calculating the dual, designed for straightforward application and memorization. These rules will cover primal problems with inequality and equality constraints, as well as non-negative and free variables. We assume inequalities in the primal are oriented as follows: \(\leq\) for maximization and \(\geq\) for minimization problems. A detailed example will illustrate these rules.
Complementarity Conditions
Complementarity conditions establish relationships between primal and dual variables and constraints at optimality. They are crucial for understanding the connection between a linear program and its dual. These conditions state that, at optimality, for each primal variable and its corresponding dual constraint, and for each primal constraint and its corresponding dual variable, the following implications hold:
1. **Primal Variable and Dual Constraint:**
   * If a primal variable is strictly positive, then the corresponding dual constraint must be binding (hold with equality).
   * If a dual constraint is not binding (holds with strict inequality), then the corresponding primal variable must be zero.
2. **Primal Constraint and Dual Variable:**
   * If a primal constraint is not binding (holds with strict inequality), then the corresponding dual variable must be zero.
   * If a dual variable is strictly positive, then the corresponding primal constraint must be binding (hold with equality).
These conditions can be summarized as follows, where \(x_j\) represents the \(j\)-th primal variable, \(u_i\) represents the \(i\)-th dual variable, and the inequalities represent the corresponding constraints:
* \(x_j > 0 \implies \sum_{i=1}^{M} a_{ij} u_i = c_j\) (the dual constraint is binding)
* \(\sum_{i=1}^{M} a_{ij} u_i < c_j\) (or \(>\) for minimization) \(\implies x_j = 0\)
* \(u_i > 0 \implies \sum_{j=1}^{N} a_{ij} x_j = b_i\) (the primal constraint is binding)
* \(\sum_{j=1}^{N} a_{ij} x_j < b_i\) (or \(>\) for minimization) \(\implies u_i = 0\)
**In essence, complementarity conditions imply a form of "either-or" relationship between primal variables and their corresponding dual constraints, and between primal constraints and their corresponding dual variables at optimality.** They are a powerful tool for analyzing optimal solutions and understanding the interplay between primal and dual problems.
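The implications above can be checked mechanically. Below is a minimal Python sketch, assuming a max-form primal (\(Ax \leq b\), \(x \geq 0\)) with its min-form dual (\(A^\top u \geq c\), \(u \geq 0\)); the function name, data layout, and tolerance are my own. The LP data is a standard textbook instance whose optimal primal and dual solutions are \(x^* = (2, 6)\) and \(u^* = (0, 3/2, 1)\).

```python
TOL = 1e-9

def check_complementarity(A, b, c, x, u):
    """Return True iff (x, u) satisfies both complementarity conditions
    for a max-form primal (Ax <= b, x >= 0) and its min-form dual
    (A^T u >= c, u >= 0)."""
    m, n = len(A), len(A[0])
    # Primal variable / dual constraint: x_j > 0  =>  sum_i a_ij u_i == c_j
    for j in range(n):
        slack = sum(A[i][j] * u[i] for i in range(m)) - c[j]
        if x[j] > TOL and abs(slack) > TOL:
            return False
    # Primal constraint / dual variable: u_i > 0  =>  sum_j a_ij x_j == b_i
    for i in range(m):
        slack = b[i] - sum(A[i][j] * x[j] for j in range(n))
        if u[i] > TOL and abs(slack) > TOL:
            return False
    return True

# Textbook LP: max 3x1 + 5x2  s.t.  x1 <= 4, 2x2 <= 12, 3x1 + 2x2 <= 18, x >= 0
A = [[1, 0], [0, 2], [3, 2]]
b = [4, 12, 18]
c = [3, 5]
x_opt = [2.0, 6.0]
u_opt = [0.0, 1.5, 1.0]

print(check_complementarity(A, b, c, x_opt, u_opt))  # True at optimality
```

A numerical tolerance is needed because solver output is rarely exactly zero; here the data is exact, so any small `TOL` works.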
Rules for Calculating the Dual of a Linear Program
We will now outline a set of rules to mechanically derive the dual (D) of a generic primal linear program (P). These rules are applicable when the primal problem P includes both inequality and equality constraints, and both non-negative and free variables. We assume that the inequalities in P are in the "correct" direction: \(\leq\) for maximization problems and \(\geq\) for minimization problems.
General Principles
The fundamental relationship between the primal and dual problems involves transformations in the objective function, variables, and constraints. The core principle is a systematic exchange and conversion based on the structure of the primal problem.
Objective Function Transformation
If the primal problem (P) is a maximization problem, then the dual problem (D) will be a minimization problem.
Conversely, if the primal problem (P) is a minimization problem, then the dual problem (D) will be a maximization problem.
In essence, the optimization direction is reversed when moving from the primal to the dual.
Primal and Dual Dimensions
Consider a primal problem (P) with \(N\) variables and \(M\) constraints. Then, the dual problem (D) will have:
\(M\) dual variables (typically denoted as \(u_1, u_2, \ldots, u_M\)). The number of dual variables is equal to the number of primal constraints.
\(N\) dual constraints. The number of dual constraints is equal to the number of primal variables (typically denoted as \(x_1, x_2, \ldots, x_N\)).
Cost and Right-Hand Side Vector Exchange
There is a crucial exchange between the cost vector and the right-hand side vector when transitioning from the primal to the dual:
The cost vector of the primal problem (coefficients of the objective function) becomes the right-hand side vector of the dual problem’s constraints.
The right-hand side vector of the primal problem’s constraints becomes the cost vector of the dual problem’s objective function.
Constructing Dual Constraints from Primal Variables
For each variable in the primal problem, we construct a corresponding constraint in the dual problem. The type of dual constraint depends on the nature of the primal variable.
Non-negative Primal Variables
If a primal variable \(x_j\) is constrained to be non-negative (\(x_j \geq 0\)), then the \(j\)-th dual constraint will be an inequality. This inequality is constructed by taking the \(j\)-th column of the primal constraint matrix and forming a linear combination with the dual variables \(u_i\).
Specifically, for each non-negative primal variable \(x_j\), we have a dual constraint of the form:
\[\sum_{i=1}^{M} a_{ij} u_i \begin{cases} \leq c_j & \text{if dual is a maximization problem} \\ \geq c_j & \text{if dual is a minimization problem} \end{cases}\]
where \(a_{ij}\) are the coefficients from the \(j\)-th column of the primal constraint matrix, and \(c_j\) is the \(j\)-th coefficient of the primal cost vector. The direction of the inequality (\(\leq\) or \(\geq\)) is chosen such that it is consistent with the optimization direction of the dual problem ("in the right direction").
Free Primal Variables
If a primal variable \(x_j\) is free (unrestricted in sign), then the \(j\)-th dual constraint will be an equality. Similar to the non-negative case, this equality is formed using the \(j\)-th column of the primal constraint matrix and the dual variables.
Specifically, for each free primal variable \(x_j\), we have a dual constraint of the form:
\[\sum_{i=1}^{M} a_{ij} u_i = c_j\]
where \(a_{ij}\) and \(c_j\) are defined as in the non-negative variable case.
Constructing Dual Variables from Primal Constraints
For each constraint in the primal problem, we introduce a corresponding variable in the dual problem. The nature of the dual variable depends on the type of the primal constraint.
Inequality Primal Constraints
If the \(i\)-th primal constraint is an inequality, then the corresponding dual variable \(u_i\) will be non-negative (\(u_i \geq 0\)). This applies to both \(\leq\) and \(\geq\) type inequalities, assuming they are in the "correct" direction for the primal problem type (i.e., \(\leq\) for maximization, \(\geq\) for minimization).
Equality Primal Constraints
If the \(i\)-th primal constraint is an equality, then the corresponding dual variable \(u_i\) will be free (unrestricted in sign).
Summary of Rules:
| Primal (P) | | Dual (D) |
|---|---|---|
| Maximization | \(\longleftrightarrow\) | Minimization |
| Minimization | \(\longleftrightarrow\) | Maximization |
| \(N\) variables | \(\longleftrightarrow\) | \(N\) constraints |
| \(M\) constraints | \(\longleftrightarrow\) | \(M\) variables |
| Cost vector \(c\) | \(\longleftrightarrow\) | RHS vector \(c\) |
| RHS vector \(b\) | \(\longleftrightarrow\) | Cost vector \(b\) |
| \(x_j \geq 0\) | \(\longrightarrow\) | \(\sum_{i=1}^{M} a_{ij} u_i \begin{cases} \leq c_j & \text{if D is max} \\ \geq c_j & \text{if D is min} \end{cases}\) |
| \(x_j\) free | \(\longrightarrow\) | \(\sum_{i=1}^{M} a_{ij} u_i = c_j\) |
| \(i\)-th constraint \(\leq\) | \(\longleftarrow\) | \(u_i \geq 0\) |
| \(i\)-th constraint \(\geq\) | \(\longleftarrow\) | \(u_i \geq 0\) |
| \(i\)-th constraint \(=\) | \(\longleftarrow\) | \(u_i\) free |
Remark. By applying these rules systematically, one can derive the dual of any linear program. It is crucial to ensure the primal problem’s inequalities are in the standard form (\(\leq\) for maximization, \(\geq\) for minimization) before applying these rules. If not, adjustments such as multiplying an inequality by \(-1\) may be necessary, as demonstrated in the example below.
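The summary table lends itself to direct implementation. The following Python sketch mechanizes the rules for a primal already in the "correct" form; the data representation (`con_types`, `var_types`) and the function name `dualize` are my own, not part of the lecture.

```python
def dualize(sense, c, A, b, con_types, var_types):
    """Dualize a primal LP given in 'correct' form.

    sense: 'min' or 'max'; con_types[i] in {'ineq', 'eq'};
    var_types[j] in {'nonneg', 'free'}.
    Returns the dual in the same representation.
    """
    m, n = len(A), len(A[0])
    dual_sense = 'max' if sense == 'min' else 'min'   # direction reverses
    dual_c = list(b)                                  # primal RHS -> dual cost
    dual_b = list(c)                                  # primal cost -> dual RHS
    # Dual constraint matrix is the transpose: column j of A gives row j.
    At = [[A[i][j] for i in range(m)] for j in range(n)]
    # x_j >= 0 -> inequality dual constraint; x_j free -> equality.
    dual_con_types = ['ineq' if t == 'nonneg' else 'eq' for t in var_types]
    # Inequality primal constraint -> u_i >= 0; equality -> u_i free.
    dual_var_types = ['nonneg' if t == 'ineq' else 'free' for t in con_types]
    return dual_sense, dual_c, At, dual_b, dual_con_types, dual_var_types

# Example: min x1 + 2x2  s.t.  x1 + x2 >= 3, x >= 0
# dualizes to: max 3u  s.t.  u <= 1, u <= 2, u >= 0
print(dualize('min', [1, 2], [[1, 1]], [3], ['ineq'], ['nonneg', 'nonneg']))
```

The inequality directions themselves (\(\leq\) vs. \(\geq\)) follow from `dual_sense`, as in the table, so they need not be stored separately.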
Example of Dual Calculation
Let’s illustrate the rules with an example.
Defining the Primal Problem
Consider the following primal problem (P) in minimization form:
\[\begin{aligned} \text{Minimize } & 3x_1 - x_3 + 4x_5 + 2x_6 \\ \text{Subject to: } & \\ & x_1 + 2x_2 - x_4 \leq -2 \\ & 3x_1 + x_2 + x_3 - x_5 + x_6 = 1 \\ & 3x_3 + x_4 - x_6 \geq 2 \\ & x_1 - x_2 + 2x_6 = 0 \\ & 2x_2 + x_3 + x_5 \geq -1 \\ & x_1, x_3, x_5, x_6 \geq 0 \\ & x_2, x_4 \text{ are free} \end{aligned}\]
The primal problem has 6 variables (\(x_1\) to \(x_6\)) and 5 constraints. Variables \(x_1, x_3, x_5, x_6\) are non-negative, while \(x_2, x_4\) are free.
Adjusting Primal Inequalities for Minimization
For a minimization problem, we need all inequality constraints to be in the "greater than or equal to" (\(\geq\)) form. The first constraint is in the "less than or equal to" (\(\leq\)) form, so we multiply it by \(-1\) to reverse the inequality: \(x_1 + 2x_2 - x_4 \leq -2\) becomes \(-x_1 - 2x_2 + x_4 \geq 2\). The primal problem in the required format is:
\[\begin{aligned} \text{Minimize } & 3x_1 - x_3 + 4x_5 + 2x_6 \\ \text{Subject to: } & \\ & -x_1 - 2x_2 + x_4 \geq 2 \quad &(1)\\ & 3x_1 + x_2 + x_3 - x_5 + x_6 = 1 \quad &(2)\\ & 3x_3 + x_4 - x_6 \geq 2 \quad &(3)\\ & x_1 - x_2 + 2x_6 = 0 \quad &(4)\\ & 2x_2 + x_3 + x_5 \geq -1 \quad &(5)\\ & x_1, x_3, x_5, x_6 \geq 0 \\ & x_2, x_4 \text{ are free} \end{aligned}\]
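For the mechanical derivation it helps to read off the constraint matrix \(A\), the right-hand side vector \(b\), and the cost vector \(c\) from the adjusted system above (a variable absent from a constraint contributes a zero coefficient):

\[A = \begin{pmatrix} -1 & -2 & 0 & 1 & 0 & 0 \\ 3 & 1 & 1 & 0 & -1 & 1 \\ 0 & 0 & 3 & 1 & 0 & -1 \\ 1 & -1 & 0 & 0 & 0 & 2 \\ 0 & 2 & 1 & 0 & 1 & 0 \end{pmatrix}, \qquad b = \begin{pmatrix} 2 \\ 1 \\ 2 \\ 0 \\ -1 \end{pmatrix}, \qquad c = (3, 0, -1, 0, 4, 2)^\top.\]

The dual constraints below are formed column by column from \(A\).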
Formulating the Dual Problem
Since the primal problem is a minimization problem, the dual problem will be a maximization problem. The primal problem has 5 constraints, so the dual problem will have 5 variables, which we denote as \(u_1, u_2, u_3, u_4, u_5\), corresponding to primal constraints (1), (2), (3), (4), and (5) respectively.
Dual Objective Function
The objective function of the dual problem is formed using the right-hand side of the primal constraints as coefficients and the dual variables. Thus, the dual objective function is:
\[\text{Maximize } 2u_1 + 1u_2 + 2u_3 + 0u_4 - 1u_5 = 2u_1 + u_2 + 2u_3 - u_5\]
Identifying Dual Variables
Now we determine the nature of the dual variables based on the primal constraints:
Constraint (1) is \(\geq\), so \(u_1 \geq 0\).
Constraint (2) is \(=\), so \(u_2\) is free.
Constraint (3) is \(\geq\), so \(u_3 \geq 0\).
Constraint (4) is \(=\), so \(u_4\) is free.
Constraint (5) is \(\geq\), so \(u_5 \geq 0\).
Thus, \(u_1 \geq 0, u_3 \geq 0, u_5 \geq 0\), and \(u_2, u_4\) are free variables.
Deriving Dual Constraints from Primal Columns
The dual constraints are derived from the columns of the primal constraint matrix and the primal cost coefficients. Since there are 6 primal variables (\(x_1\) to \(x_6\)), there will be 6 dual constraints.
Constraint 1 (from Primal Column 1 - \(x_1\))
The coefficients of \(x_1\) in the primal constraints are -1, 3, 0, 1, 0. Summing these multiplied by the corresponding dual variables, we get: \[-1u_1 + 3u_2 + 0u_3 + 1u_4 + 0u_5 = -u_1 + 3u_2 + u_4\] Since \(x_1 \geq 0\) in the primal, the first dual constraint is an inequality of type "less than or equal to" (for a maximization dual problem) and is bounded by the cost coefficient of \(x_1\) in the primal objective function, which is 3. \[-u_1 + 3u_2 + u_4 \leq 3\]
Constraint 2 (from Primal Column 2 - \(x_2\))
The coefficients of \(x_2\) are -2, 1, 0, -1, 2. Summing these multiplied by the dual variables: \[-2u_1 + 1u_2 + 0u_3 - 1u_4 + 2u_5 = -2u_1 + u_2 - u_4 + 2u_5\] Since \(x_2\) is free in the primal, the second dual constraint is an equality, bounded by the cost coefficient of \(x_2\) in the primal objective, which is 0 (since \(x_2\) is not present in the objective function). \[-2u_1 + u_2 - u_4 + 2u_5 = 0\]
Constraint 3 (from Primal Column 3 - \(x_3\))
The coefficients of \(x_3\) are 0, 1, 3, 0, 1. Summing these multiplied by the dual variables: \[0u_1 + 1u_2 + 3u_3 + 0u_4 + 1u_5 = u_2 + 3u_3 + u_5\] Since \(x_3 \geq 0\) in the primal, the third dual constraint is an inequality (\(\leq\)) bounded by the cost coefficient of \(x_3\), which is -1. \[u_2 + 3u_3 + u_5 \leq -1\]
Constraint 4 (from Primal Column 4 - \(x_4\))
The coefficients of \(x_4\) are 1, 0, 1, 0, 0. Summing these multiplied by the dual variables: \[1u_1 + 0u_2 + 1u_3 + 0u_4 + 0u_5 = u_1 + u_3\] Since \(x_4\) is free in the primal, the fourth dual constraint is an equality, bounded by the cost coefficient of \(x_4\) in the primal objective, which is 0. \[u_1 + u_3 = 0\]
Constraint 5 (from Primal Column 5 - \(x_5\))
The coefficients of \(x_5\) are 0, -1, 0, 0, 1. Summing these multiplied by the dual variables: \[0u_1 - 1u_2 + 0u_3 + 0u_4 + 1u_5 = -u_2 + u_5\] Since \(x_5 \geq 0\) in the primal, the fifth dual constraint is an inequality (\(\leq\)) bounded by the cost coefficient of \(x_5\) in the primal objective, which is 4. \[-u_2 + u_5 \leq 4\]
Constraint 6 (from Primal Column 6 - \(x_6\))
The coefficients of \(x_6\) are 0, 1, -1, 2, 0. Summing these multiplied by the dual variables: \[0u_1 + 1u_2 - 1u_3 + 2u_4 + 0u_5 = u_2 - u_3 + 2u_4\] Since \(x_6 \geq 0\) in the primal, the sixth dual constraint is an inequality (\(\leq\)) bounded by the cost coefficient of \(x_6\) in the primal objective, which is 2. \[u_2 - u_3 + 2u_4 \leq 2\]
Putting it all together, the dual problem (D) is:
\[\begin{aligned} \text{Maximize } & 2u_1 + u_2 + 2u_3 - u_5 \\ \text{Subject to: } & \\ & -u_1 + 3u_2 + u_4 \leq 3 \\ & -2u_1 + u_2 - u_4 + 2u_5 = 0 \\ & u_2 + 3u_3 + u_5 \leq -1 \\ & u_1 + u_3 = 0 \\ & -u_2 + u_5 \leq 4 \\ & u_2 - u_3 + 2u_4 \leq 2 \\ & u_1 \geq 0, u_3 \geq 0, u_5 \geq 0 \\ & u_2, u_4 \text{ are free} \end{aligned}\]
This completes the derivation of the dual problem from the given primal problem using the mechanical rules.
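As a sanity check on the derivation, weak duality can certify the result numerically: for this min/max pair, every dual-feasible objective value is a lower bound on every primal-feasible one, so exhibiting feasible points with equal objective values proves both optimal. The points below are my own, found by inspection and verified by substitution; the sketch is pure Python.

```python
# Weak-duality certificate for the worked example: x = (0, 0, 1, 2, 0, 0)
# is primal-feasible and u = (0, -1, 0, -1, 0) is dual-feasible, and they
# attain the same objective value, so each is optimal.

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

# Primal data in the adjusted (>= / =) form; rows are constraints (1)-(5).
A = [[-1, -2, 0, 1,  0,  0],
     [ 3,  1, 1, 0, -1,  1],
     [ 0,  0, 3, 1,  0, -1],
     [ 1, -1, 0, 0,  0,  2],
     [ 0,  2, 1, 0,  1,  0]]
b = [2, 1, 2, 0, -1]
c = [3, 0, -1, 0, 4, 2]
con = ['>=', '=', '>=', '=', '>=']                 # senses of (1)-(5)
nonneg = [True, False, True, False, True, True]    # x1, x3, x5, x6 >= 0

x = [0, 0, 1, 2, 0, 0]
u = [0, -1, 0, -1, 0]

# Primal feasibility.
for i in range(5):
    lhs = dot(A[i], x)
    if con[i] == '>=':
        assert lhs >= b[i]
    else:
        assert lhs == b[i]
assert all(x[j] >= 0 for j in range(6) if nonneg[j])

# Dual feasibility: column j of A against u, compared with c_j;
# x_j >= 0 gives <=, x_j free gives =; u_i >= 0 for '>=' rows.
for j in range(6):
    lhs = dot([A[i][j] for i in range(5)], u)
    if nonneg[j]:
        assert lhs <= c[j]
    else:
        assert lhs == c[j]
assert all(u[i] >= 0 for i in range(5) if con[i] == '>=')

print(dot(c, x), dot(b, u))  # -> -1 -1: equal values certify optimality
```

Both objectives evaluate to \(-1\), so by weak duality the common value \(-1\) is the optimum of both problems.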
Conclusion
This lecture presented a set of mechanical rules for deriving the dual of a linear programming problem. These rules provide a systematic approach to transforming a primal problem into its dual, highlighting the relationships between objective functions, constraints, and variable types. We observed that non-negative primal variables correspond to inequality dual constraints, while free primal variables correspond to equality dual constraints. Similarly, inequality primal constraints lead to non-negative dual variables, and equality primal constraints lead to free dual variables. The detailed example demonstrated the application of these rules, including the necessary adjustments to primal inequalities. Mastering these rules is essential for working with duality in linear programming and solving optimization problems. Further exploration of duality properties, such as strong and weak duality, will build upon these foundational concepts.