In mathematics, the relaxation of a (mixed) integer linear program is the problem that arises by removing the integrality constraint of each variable.
Then a valid cover can be described by an assignment of values to the indicator variables satisfying the constraints $x_S \in \{0,1\}$ for every set $S$ in $F$ (that is, only the specified indicator variable values are allowed) and, for each element $e_j$ of the union of $F$, the constraint $\sum_{S \colon e_j \in S} x_S \ge 1$ (that is, each element is covered).
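Written out explicitly (a sketch in the notation above; the relaxation is obtained simply by replacing the integrality constraint with box constraints), the integer program is

\[
  \min \sum_{S \in F} x_S
  \quad\text{subject to}\quad
  \sum_{S \colon e_j \in S} x_S \ge 1 \text{ for every element } e_j,
  \qquad
  x_S \in \{0,1\} \text{ for every } S \in F,
\]

and its linear programming relaxation replaces the last constraint with $0 \le x_S \le 1$ for every $S \in F$.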
For instance, for the family of three sets {1, 2}, {2, 3}, and {1, 3}, every integer cover must use at least two of the sets; however, there is a fractional solution in which each set is assigned the weight 1/2, so that each element is covered with total weight 1/2 + 1/2 = 1 and the total value of the objective function is 3/2.
As this example shows, the optimal solution of the relaxation is generally not integral and need not coincide with an optimal solution of the integer program, except in some special cases (e.g., problems with totally unimodular constraint matrices).
Linear programming relaxation is a standard technique for designing approximation algorithms for hard optimization problems.
In this application, an important concept is the integrality gap, the maximum ratio between the optimal solution value of the integer program and that of its relaxation; for a minimization problem this gap is a lower bound on the approximation ratio obtainable by rounding the relaxation.
This is because such an approximation algorithm relies on some rounding strategy that finds, for every relaxed solution of a given value, an integer solution whose value is larger by at most some factor c, and that factor c can never be smaller than the integrality gap.
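As a short sketch of why the gap limits this approach (for a minimization problem; the notation $\mathrm{OPT}_{\mathrm{int}}$ and $\mathrm{OPT}_{\mathrm{frac}}$ for the integer and relaxed optima is introduced here for illustration only):

\[
  \text{integrality gap} \;=\; \max_{\text{instances}} \frac{\mathrm{OPT}_{\mathrm{int}}}{\mathrm{OPT}_{\mathrm{frac}}},
  \qquad
  \text{rounded value} \le c \cdot \mathrm{OPT}_{\mathrm{frac}}
  \;\Longrightarrow\;
  \mathrm{OPT}_{\mathrm{int}} \le c \cdot \mathrm{OPT}_{\mathrm{frac}},
\]

since every integer solution has value at least $\mathrm{OPT}_{\mathrm{int}}$; on a worst-case instance this forces $c \ge \mathrm{OPT}_{\mathrm{int}}/\mathrm{OPT}_{\mathrm{frac}}$, that is, $c$ is at least the integrality gap. In the three-set example above, that instance alone already shows a gap of at least $2/(3/2) = 4/3$.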
For the set cover problem, Lovász proved that the integrality gap for an instance with n elements is $H_n$, the nth harmonic number $H_n = 1 + \tfrac{1}{2} + \tfrac{1}{3} + \cdots + \tfrac{1}{n} \approx \ln n$.
One can turn the linear programming relaxation for this problem into an approximate solution of the original unrelaxed set cover instance via the technique of randomized rounding.
Thus, this technique leads to a randomized approximation algorithm that finds a set cover within a logarithmic factor of the optimum.
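The following is a minimal sketch of how such a scheme might be implemented (the covering formulation, the use of scipy's LP solver, and every function name here are illustrative assumptions rather than details from the source): solve the relaxation, then include each set with probability proportional to its fractional weight, scaled up by a logarithmic factor.

```python
# Minimal sketch: LP relaxation plus randomized rounding for set cover.
# Elements are assumed to be labelled 0..n-1; constants and names are illustrative.
import math
import random
import numpy as np
from scipy.optimize import linprog

def lp_relaxation(sets, n):
    """Solve min sum_S x_S  s.t.  sum_{S containing e} x_S >= 1  and  0 <= x_S <= 1."""
    m = len(sets)
    A = np.zeros((n, m))
    for j, S in enumerate(sets):
        for e in S:
            A[e, j] = 1.0
    # linprog expects A_ub @ x <= b_ub, so the covering constraints A @ x >= 1 are negated.
    res = linprog(np.ones(m), A_ub=-A, b_ub=-np.ones(n),
                  bounds=[(0, 1)] * m, method="highs")
    return res.x

def randomized_rounding(sets, n, seed=0):
    """Pick each set with probability min(1, x_S * ln n); retry until every element is covered."""
    rng = random.Random(seed)
    x = lp_relaxation(sets, n)
    scale = math.log(max(n, 2))
    while True:
        chosen = [j for j, xj in enumerate(x) if rng.random() < min(1.0, xj * scale)]
        covered = set()
        for j in chosen:
            covered |= sets[j]
        if covered >= set(range(n)):
            return chosen

# A toy instance analogous to the three-set example above (relabelled 0, 1, 2):
print(randomized_rounding([{0, 1}, {1, 2}, {0, 2}], 3))
```

The retry loop is just one simple way to guarantee feasibility; the expected number of sets chosen in each round is at most $\ln n$ times the fractional optimum, which is where the logarithmic factor comes from.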
As Young showed in 1995,[3] both the random part of this algorithm and the need to construct an explicit solution to the linear programming relaxation may be eliminated using the method of conditional probabilities, leading to a deterministic greedy algorithm for set cover, known already to Lovász, that repeatedly selects the set that covers the largest possible number of remaining uncovered elements.[4]
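A sketch of that greedy rule, in the same illustrative style (the function name and data layout are assumptions; only the selection rule itself is from the text):

```python
# Greedy set cover: repeatedly take the set covering the most still-uncovered elements.
def greedy_set_cover(sets, universe):
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(range(len(sets)), key=lambda j: len(sets[j] & uncovered))
        if not sets[best] & uncovered:   # no set covers anything new: instance is infeasible
            raise ValueError("the given family does not cover the universe")
        chosen.append(best)
        uncovered -= sets[best]
    return chosen

print(greedy_set_cover([{0, 1}, {1, 2}, {0, 2}], range(3)))   # e.g. [0, 1]
```

On the three-set example this picks two sets, matching the integer optimum.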
Similar randomized rounding techniques, and derandomized approximation algorithms, may be used in conjunction with linear programming relaxation to develop approximation algorithms for many other problems, as described by Raghavan, Tompson, and Young.
As well as its uses in approximation, linear programming plays an important role in branch and bound algorithms for computing the true optimum solution to hard optimization problems.
Although it is difficult to prove theoretical bounds on the performance of algorithms of this type, they can be very effective in practice.
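As a rough sketch of how the linear programming relaxation drives such a search (again using scipy's LP solver and a 0-1 covering formulation purely for illustration; the function names, tolerances, and branching rule are assumptions): solve the relaxation at each node, prune nodes whose bound cannot beat the best integer solution found so far, and branch on a fractional variable.

```python
# Sketch of LP-based branch and bound for min c.x with A x >= b and x in {0,1}^m.
import math
import numpy as np
from scipy.optimize import linprog

def branch_and_bound(c, A, b):
    best_val, best_x = math.inf, None

    def solve_node(fixed):                       # fixed: {variable index: 0 or 1}
        nonlocal best_val, best_x
        bounds = [(fixed.get(j, 0), fixed.get(j, 1)) for j in range(len(c))]
        res = linprog(c, A_ub=-A, b_ub=-b, bounds=bounds, method="highs")
        if res.status != 0 or res.fun >= best_val:   # infeasible node, or pruned by the LP bound
            return
        frac = [j for j, xj in enumerate(res.x) if abs(xj - round(xj)) > 1e-6]
        if not frac:                                  # relaxation is integral: new incumbent
            best_val, best_x = res.fun, np.round(res.x)
            return
        j = frac[0]                                   # branch on the first fractional variable
        for v in (0, 1):
            solve_node({**fixed, j: v})

    solve_node({})
    return best_val, best_x

# Toy covering instance from the example above: rows are elements, columns are sets.
A = np.array([[1., 0., 1.], [1., 1., 0.], [0., 1., 1.]])
print(branch_and_bound(np.ones(3), A, np.ones(3)))    # optimum value 2.0
```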
Two 0–1 integer programs that are equivalent, in that they have the same objective function and the same set of feasible solutions, may have quite different linear programming relaxations: a linear programming relaxation can be viewed geometrically, as a convex polytope that includes all feasible solutions and excludes all other 0–1 vectors, and infinitely many different polytopes have this property.
The cutting-plane method for solving 0–1 integer programs, first introduced for the traveling salesman problem by Dantzig, Fulkerson, and Johnson in 1954[5] and generalized to other integer programs by Gomory in 1958,[6] takes advantage of this multiplicity of possible relaxations by finding a sequence of relaxations that more tightly constrain the solution space until eventually an integer solution is obtained.
The method starts from a relaxation of the given program and finds an optimal solution of it using a linear programming solver; if this solution assigns integer values to all variables, it is also an optimal solution of the unrelaxed problem. Otherwise, an additional linear constraint (a cutting plane or cut) is found that separates the resulting fractional solution from the convex hull of the integer solutions, and the method repeats on this new, more tightly constrained problem.
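For a concrete, deliberately tiny illustration of one such round (here a single Chvátal–Gomory cut is derived by hand, rather than generated automatically as in Gomory's procedure; the instance, the solver, and all names are assumptions for illustration):

```python
# Toy cutting-plane step: maximize x + y over 2x + 2y <= 3 with 0 <= x, y <= 1.
# Any optimum of the relaxation has x + y = 1.5 and is fractional. Dividing the
# constraint by 2 and rounding down the right-hand side gives the valid inequality
# x + y <= 1, which every 0-1 solution satisfies but the fractional optimum violates.
from scipy.optimize import linprog

c = [-1.0, -1.0]                 # minimize -(x + y), i.e. maximize x + y
A_ub = [[2.0, 2.0]]              # 2x + 2y <= 3
b_ub = [3.0]
bounds = [(0, 1), (0, 1)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("relaxed optimum:", res.x, "value:", -res.fun)   # fractional, value 1.5

A_ub.append([1.0, 1.0])          # add the cutting plane x + y <= 1
b_ub.append(1.0)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("after the cut:", res.x, "value:", -res.fun)     # integral, e.g. x = 1, y = 0
```

In this toy case a single cut already describes the convex hull of the integer solutions; in general many rounds of cuts are needed.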
Much research has been performed on methods for finding such cuts, and in particular cuts corresponding to facets of the convex hull of the integer solutions (which give the strongest possible cuts of this kind), for different types of combinatorial optimization problems, under the framework of polyhedral combinatorics.