Variable neighborhood search

Variable neighborhood search (VNS),[1] proposed by Mladenović & Hansen in 1997,[2] is a metaheuristic method for solving a set of combinatorial optimization and global optimization problems.

It explores distant neighborhoods of the current incumbent solution, and moves from there to a new one if and only if an improvement was made.

VNS systematically changes the neighborhood in two phases: firstly, descent to find a local optimum and finally, a perturbation phase to get out of the corresponding valley.

Applications are rapidly increasing in number and pertain to many fields: location theory, cluster analysis, scheduling, vehicle routing, network design, lot-sizing, artificial intelligence, engineering, pooling problems, biology, phylogeny, reliability, geometry, telecommunication design, etc.

Earlier work that motivated this approach can be found in.[5] Recent surveys on VNS methodology, as well as numerous applications, can be found in 4OR, 2008,[10] and Annals of OR, 2010.

A deterministic optimization problem can be formulated as

min { f(x) : x ∈ X, X ⊆ S },     (1)

where S, X, x and f are the solution space, the feasible set, a feasible solution and a real-valued objective function, respectively. If S is a finite but large set, a combinatorial optimization problem is defined; if S = Rⁿ, the problem is one of continuous optimization.

An exact algorithm for problem (1) finds an optimal solution x*, together with a proof of its optimality, or, if the problem is infeasible, shows that no feasible solution exists.

For continuous optimization, it is reasonable to allow for some degree of tolerance, i.e., to stop when a feasible solution x* has been found whose objective value is within a small tolerance ε of the optimal value (or of a bound on it).

Heuristics keep computing time bounded, but in doing so they face the problem of becoming trapped in local optima.

According to (Mladenović, 1995), VNS is a metaheuristic which systematically performs the procedure of neighborhood change, both in descent to local minima and in escape from the valleys which contain them.

Therefore, in addition to providing very good solutions, often in simpler ways than other methods, VNS gives insight into the reasons for such a performance, which, in turn, can lead to more efficient and sophisticated implementations.

Among the recently mentioned works, VNS is studied in several papers, such as (Hansen and Mladenović 1999, 2001a, 2003, 2005; Moreno-Pérez et al.[11]).

A local search heuristic is performed by choosing an initial solution x, finding a direction of descent from x within a neighborhood N(x), and moving to the minimum of f(x) within N(x) along that direction.

Usually the steepest direction of descent, also referred to as best improvement, is used.

This set of rules is summarized in Algorithm 1, where we assume that an initial solution x is given.

As this may be time-consuming, an alternative is to use the first descent heuristic.

Neighbors x' ∈ N(x) are then enumerated systematically, and a move is made as soon as a descent direction is found.
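The two local search strategies just described can be sketched in Python as follows (function names and the toy problem are illustrative, not from the source; `neighbors` is a user-supplied function returning the neighborhood N(x)):

```python
def best_improvement(x, f, neighbors):
    """Repeatedly move to the best neighbor until no neighbor improves f(x)."""
    improved = True
    while improved:
        improved = False
        best = min(neighbors(x), key=f, default=None)
        if best is not None and f(best) < f(x):
            x = best        # steepest descent: take the best neighbor
            improved = True
    return x

def first_improvement(x, f, neighbors):
    """Move to the first improving neighbor found, rescanning after each move."""
    improved = True
    while improved:
        improved = False
        for y in neighbors(x):
            if f(y) < f(x):
                x = y       # first descent: move as soon as descent is found
                improved = True
                break
    return x
```

For example, minimizing f(x) = x² over the integers with neighborhood {x − 1, x + 1}, both strategies descend to the local (here also global) minimum x = 0.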

We call x' ∈ X a local minimum of the problem with respect to a neighborhood structure N if there is no solution x'' ∈ N(x') ∩ X such that f(x'') < f(x').

In order to solve problem by using several neighborhoods, facts 1–3 can be used in three different ways: (i) deterministic; (ii) stochastic; (iii) both deterministic and stochastic.

We first give in Algorithm 3 the steps of the neighborhood change function which will be used later.

Function NeighborhoodChange() compares the new value f(x') with the incumbent value f(x) obtained in the neighborhood k (line 1).

If an improvement is obtained, k is reset to its initial value and the incumbent is updated (line 2); otherwise, the next neighborhood is considered.

When VNS does not render a good solution, several steps could help the process, such as comparing first- and best-improvement strategies in local search, reducing the neighborhood, intensifying shaking, adopting VND, adopting FSS, and experimenting with parameter settings.

The Basic VNS (BVNS) method (Handbook of Metaheuristics, 2010)[3] combines deterministic and stochastic changes of neighborhood.

Observe that point x' is generated at random in Step 4 in order to avoid cycling, which might occur if a deterministic rule were applied.

In Step 5, the best improvement local search (Algorithm 1) is usually adopted.

The basic VNS is a best improvement descent method with randomization.[3]

Without much additional effort, it can be transformed into a descent-ascent method: in the NeighbourhoodChange() function, also replace x by x" with some probability, even if the solution is worse than the incumbent.
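The basic scheme of shaking, local search and neighborhood change can be sketched as follows (a minimal illustration, assuming a simple iteration budget as the stopping condition; `shake` and `local_search` are user-supplied functions, e.g. the local search routines above):

```python
def bvns(x, f, k_max, shake, local_search, t_max_iters=100):
    """Basic VNS sketch: alternate shaking (a point drawn from the k-th
    neighborhood of x) and local search, changing neighborhoods on failure."""
    for _ in range(t_max_iters):              # stopping condition (assumed)
        k = 1
        while k <= k_max:
            x_shaken = shake(x, k)            # Step 4: point in N_k(x), at random
            x_local = local_search(x_shaken)  # Step 5: descend to a local minimum
            if f(x_local) < f(x):             # neighborhood change
                x, k = x_local, 1             # improvement: move, reset k
            else:
                k += 1                        # no improvement: widen neighborhood
    return x
```

In practice `shake` should draw a random point from the k-th neighborhood, as noted above, to avoid cycling.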

Such an algorithm can be found in the paper by Fleszar and Hindi.[12]

VNS has several desirable features, presented by Hansen and Mladenović,[22] some of which are noted here. Interest in VNS is growing quickly, as evidenced by the increasing number of papers published each year on the topic (10 years ago, only a few; 5 years ago, about a dozen; and about 50 in 2007).

Moreover, the 18th EURO mini-conference held in Tenerife in November 2005 was entirely devoted to VNS.