In constraint satisfaction, local search is an incomplete method for finding a solution to a problem.
It is based on iteratively improving an assignment of the variables until all constraints are satisfied.
In particular, local search algorithms typically modify the value of a variable in an assignment at each step.
The aim of local search is to find an assignment of minimal cost, which is a solution if one exists.
These algorithms proceed by repeatedly changing the current assignment, trying at each step to decrease (or at least not increase) its cost.
The main problem of these algorithms is the possible presence of plateaus, which are regions of the space of assignments where no local move decreases cost.
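For concreteness, the following sketch shows this basic iterative-improvement loop on a generic constraint satisfaction problem. The representation (variables as a list, domains as a dictionary, constraints as Boolean predicates over a complete assignment) and the parameter names are illustrative assumptions, not a specific algorithm from the literature.

```python
# A minimal sketch of iterative improvement for a CSP: the cost of an assignment
# is the number of violated constraints, and each step reassigns one variable
# so as to reduce that cost. Representation and parameters are illustrative.
import random

def violated(constraints, assignment):
    # Constraints are predicates over a complete assignment; collect the unsatisfied ones.
    return [c for c in constraints if not c(assignment)]

def local_search(variables, domains, constraints, max_steps=10_000):
    # Start from a random complete assignment.
    assignment = {v: random.choice(domains[v]) for v in variables}
    for _ in range(max_steps):
        if not violated(constraints, assignment):
            return assignment                      # all constraints satisfied
        var = random.choice(variables)             # variable to modify this step
        # Reassign `var` to the value that minimizes the number of violations.
        assignment[var] = min(
            domains[var],
            key=lambda val: len(violated(constraints, {**assignment, var: val})))
    return None                                    # incomplete method: no solution found
```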
Other classes of local search algorithms, described below, have been devised to overcome this problem.
The most basic form of local search is based on choosing the change that maximally decreases the cost of the current assignment.
If no solution has been found after a given number of changes, a new random assignment is selected.
As a result of this greedy strategy, such algorithms can become stuck in a plateau where the quality of the assignment has a local maximum.
GSAT (greedy SAT) was the first local search algorithm for satisfiability, and is a form of hill climbing.
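The following is a minimal GSAT sketch under the usual CNF representation (clauses as lists of signed integer literals, DIMACS-style). The parameter names max_tries and max_flips are illustrative; the restart step corresponds to selecting a new random assignment after a given number of flips.

```python
# A hedged GSAT sketch: greedily flip the variable that maximizes the number of
# satisfied clauses, restarting from a random assignment after max_flips flips.
import random

def num_satisfied(clauses, assignment):
    # A clause is satisfied if at least one of its literals is true under the assignment.
    return sum(any((lit > 0) == assignment[abs(lit)] for lit in clause)
               for clause in clauses)

def gsat(clauses, num_vars, max_tries=10, max_flips=1000):
    for _ in range(max_tries):                     # random restarts
        assignment = {v: random.choice([True, False]) for v in range(1, num_vars + 1)}
        for _ in range(max_flips):
            if num_satisfied(clauses, assignment) == len(clauses):
                return assignment                  # formula satisfied

            def score(v):
                # Number of satisfied clauses if variable v were flipped.
                assignment[v] = not assignment[v]
                s = num_satisfied(clauses, assignment)
                assignment[v] = not assignment[v]
                return s

            best = max(range(1, num_vars + 1), key=score)
            assignment[best] = not assignment[best]
        # No solution after max_flips: start over from a new random assignment.
    return None
```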
A related strategy is constraint weighting, in which the cost of an assignment is a weighted sum of its violated constraints; the weight of constraints that remain violated for a large number of moves keeps increasing, steering the search away from assignments that leave the same constraints unsatisfied.
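The sketch below illustrates this constraint-weighting idea on the same CSP representation assumed above; the specific update rule (incrementing the weight of every constraint that is still violated after a move) is an illustrative choice.

```python
# A hedged constraint-weighting sketch: the cost is the weighted sum of violated
# constraints, and weights of constraints that stay violated keep increasing.
import random

def weighted_local_search(variables, domains, constraints, max_steps=10_000):
    weights = {c: 1 for c in constraints}          # one weight per constraint
    assignment = {v: random.choice(domains[v]) for v in variables}

    def cost(a):
        return sum(weights[c] for c in constraints if not c(a))

    for _ in range(max_steps):
        violated = [c for c in constraints if not c(assignment)]
        if not violated:
            return assignment
        var = random.choice(variables)
        # Reassign `var` to the value of minimal weighted cost.
        assignment[var] = min(domains[var],
                              key=lambda val: cost({**assignment, var: val}))
        for c in violated:
            if not c(assignment):
                weights[c] += 1                    # still violated: increase its weight
    return None
```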
Tabu search maintains a list of forbidden moves, called the tabu list. More precisely, this list contains the most recent variable-value pairs, each recording that the variable has recently been assigned that value.
If a variable-value pair is in the tabu list, then changing the current assignment by setting the variable to that value is forbidden.
For propositional satisfiability of conjunctive normal form formulae, which is the original setting of this algorithm, every such move changes the value of a variable from true to false or vice versa, and makes the previously violated clause satisfied.
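The following sketch shows a tabu list over variable-value pairs on the same CSP representation as above; the fixed-length list (the tabu tenure) and the rule for recording pairs are illustrative assumptions.

```python
# A hedged tabu-search sketch: moves that would set a variable to a value
# currently in the tabu list are forbidden.
import random
from collections import deque

def tabu_search(variables, domains, constraints, tenure=10, max_steps=10_000):
    assignment = {v: random.choice(domains[v]) for v in variables}
    tabu = deque(maxlen=tenure)                    # oldest pairs fall out automatically

    def cost(a):
        return sum(1 for c in constraints if not c(a))

    for _ in range(max_steps):
        if cost(assignment) == 0:
            return assignment
        # Candidate moves: (variable, value) pairs that change the assignment
        # and are not in the tabu list.
        moves = [(v, d) for v in variables for d in domains[v]
                 if d != assignment[v] and (v, d) not in tabu]
        if not moves:
            break
        var, val = min(moves, key=lambda m: cost({**assignment, m[0]: m[1]}))
        assignment[var] = val
        tabu.append((var, val))                    # reassigning var to val is now forbidden
    return None
```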
The technique of simulated annealing is based on adjusting the probability of making a random move rather than the move that maximally decreases the cost.
In particular, the name originates from the strategy of decreasing the probability of random moves during the execution of the algorithm, thus virtually "freezing" the search space.
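A hedged sketch of simulated annealing for the same setting follows; the acceptance rule exp(-delta/temperature) is the standard Metropolis criterion, while the geometric cooling schedule and its parameters are illustrative choices.

```python
# A simulated-annealing sketch: random moves are always accepted when they do not
# increase the cost, and otherwise with a probability that shrinks as the
# temperature decreases ("freezing" the search).
import math
import random

def simulated_annealing(variables, domains, constraints,
                        start_temp=10.0, cooling=0.999, max_steps=100_000):
    assignment = {v: random.choice(domains[v]) for v in variables}

    def cost(a):
        return sum(1 for c in constraints if not c(a))

    temperature = start_temp
    for _ in range(max_steps):
        current = cost(assignment)
        if current == 0:
            return assignment
        # Propose a random move: reassign one variable to a random value.
        var = random.choice(variables)
        val = random.choice(domains[var])
        delta = cost({**assignment, var: val}) - current
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            assignment[var] = val
        temperature *= cooling                     # gradually reduce randomness
    return None
```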
Local search usually works on all variables, improving a complete assignment to them.
A proposed algorithm works on a cycle cutset, which is a set of variables that, if removed from the problem, makes it acyclic.
For any assignment of the variables of the cutset, the remaining problem has a forest as primal graph.
In order to guide local search, an algorithm detecting the minimal number of constraints that can be violated is used in place of a satisfiability algorithm on the forest part of the problem.
This minimal number is found by determining the cost of each variable assignment: the cost of assigning a given value to a variable is the minimal number of violated constraints in the subtree rooted at that variable when the variable takes that value, given the current assignment of the cutset.
For a leaf, this cost involves only the constraints linking it to the cutset; for an internal node, it also adds, for every child, the minimum over the child's values of the child's cost plus the violation, if any, of the constraint connecting the two variables.
This recursion allows computing the cost of all variable evaluations by iteratively proceeding bottom-up from the leaves to the root(s) of the forest.
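The sketch below illustrates such a bottom-up computation on a single tree of the forest, assuming binary constraints between a parent and a child given as Boolean predicates; for simplicity it omits the constraints between tree variables and the already assigned cutset variables, whose violations would simply be added at each node. All names are illustrative.

```python
# A hedged sketch of the bottom-up cost computation on one tree of the forest:
# cost[(x, a)] is the minimal number of violated constraints in the subtree
# rooted at x when x takes value a. Binary constraints are given as predicates
# constraint[(parent, child)](a, b) returning True when satisfied.
def subtree_costs(root, children, domains, constraint):
    cost = {}                                      # (variable, value) -> minimal violations

    def visit(x):
        for y in children.get(x, []):              # post-order: process children first
            visit(y)
        for a in domains[x]:
            total = 0
            for y in children.get(x, []):
                # Best value for child y given x = a: the child's own subtree cost
                # plus 1 if the constraint between x and y is violated.
                total += min(cost[(y, b)] + (0 if constraint[(x, y)](a, b) else 1)
                             for b in domains[y])
            cost[(x, a)] = total                   # leaves get cost 0 in this sketch

    visit(root)
    return cost
```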