Schaefer's dichotomy theorem

In computational complexity theory, a branch of computer science, Schaefer's dichotomy theorem, proved by Thomas Jerome Schaefer, states necessary and sufficient conditions under which a finite set S of relations over the Boolean domain yields polynomial-time or NP-complete problems when the relations of S are used to constrain some of the propositional variables.[1]

It is called a dichotomy theorem because the complexity of the problem defined by S is either in P or is NP-complete, as opposed to one of the classes of intermediate complexity that is known to exist (assuming P ≠ NP) by Ladner's theorem.

Special cases of Schaefer's dichotomy theorem include the NP-completeness of SAT (the Boolean satisfiability problem) and its two popular variants 1-in-3 SAT and not-all-equal 3SAT (often denoted by NAE-3SAT).

In fact, for these two variants of SAT, Schaefer's dichotomy theorem shows that their monotone versions (where negations of variables are not allowed) are also NP-complete.
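
Concretely, writing R1/3 for the ternary relation {(1,0,0), (0,1,0), (0,0,1)} (exactly one argument is true) and RNAE for {0,1}³ ∖ {(0,0,0), (1,1,1)}, monotone 1-in-3 SAT is the problem SAT({R1/3}) and monotone NAE-3SAT is the problem SAT({RNAE}), in the notation SAT(S) introduced below.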

Schaefer defines a decision problem that he calls the Generalized Satisfiability problem for S, denoted SAT(S), where S = {R1, ..., Rm} is a finite set of relations over the Boolean domain {0, 1}.

An instance of the problem is an S-formula, i.e. a conjunction of constraints of the form R(x1, ..., xn), where R is a relation from S of arity n and x1, ..., xn are propositional variables.

The problem is to determine whether the given formula is satisfiable, in other words if the variables can be assigned values such that they satisfy all the constraints as given by the relations from S. Schaefer identifies six classes of sets of Boolean relations for which SAT(S) is in P and proves that all other sets of relations generate an NP-complete problem.
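
As a concrete illustration (not part of Schaefer's paper), the following Python sketch represents each relation in S as a set of allowed Boolean tuples and decides satisfiability of an S-formula by brute force over all assignments; the relation and function names are illustrative. The exponential running time is consistent with SAT(S) being NP-complete for some choices of S.

    from itertools import product

    # A relation is a set of allowed Boolean tuples; an S-formula is a list of
    # constraints, each applying a relation from S to a tuple of variables.
    R_ONE_IN_THREE = {(1, 0, 0), (0, 1, 0), (0, 0, 1)}  # exactly one argument is true

    def satisfiable(formula, variables):
        """Brute-force test: does some assignment satisfy every constraint?"""
        for values in product((0, 1), repeat=len(variables)):
            assignment = dict(zip(variables, values))
            if all(tuple(assignment[v] for v in vs) in relation
                   for relation, vs in formula):
                return True
        return False

    # The S-formula R(x, y, z) ∧ R(y, z, w) for R = R_ONE_IN_THREE is satisfiable,
    # for example by x = w = 1 and y = z = 0.
    formula = [(R_ONE_IN_THREE, ("x", "y", "z")), (R_ONE_IN_THREE, ("y", "z", "w"))]
    print(satisfiable(formula, ("x", "y", "z", "w")))  # True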

A finite set of relations S over the Boolean domain defines a polynomial-time computable satisfiability problem if any one of the following conditions holds:

1. all relations in S that are not constantly false are satisfied when all arguments are true;
2. all relations in S that are not constantly false are satisfied when all arguments are false;
3. all relations in S are equivalent to a conjunction of binary clauses;
4. all relations in S are equivalent to a conjunction of Horn clauses;
5. all relations in S are equivalent to a conjunction of dual-Horn clauses;
6. all relations in S are equivalent to a conjunction of affine formulae.

Otherwise, the problem SAT(S) is NP-complete.
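
For example, none of the six conditions holds for the exactly-one relation R = {(1, 0, 0), (0, 1, 0), (0, 0, 1)}: it is not constantly false, yet neither the all-true nor the all-false assignment satisfies it; one can check that it is not equivalent to a conjunction of binary, Horn, or dual-Horn clauses; and it is not affine, since a non-empty affine relation over the Boolean domain always contains a power of two many tuples, while R contains three. Hence SAT({R}), which is exactly monotone 1-in-3 SAT, is NP-complete, as stated above.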

A modern, streamlined presentation of Schaefer's theorem is given in an expository paper by Hubie Chen. In modern terms, the problem SAT(S) is viewed as a constraint satisfaction problem over the Boolean domain; the set of relations is then denoted by Γ and the decision problem it defines by CSP(Γ).

For Schaefer's dichotomy theorem, the most important concept in universal algebra is that of a polymorphism. An operation f from {0,1}^m to {0,1} is a polymorphism of a relation R ⊆ {0,1}^k if, whenever t1, ..., tm are tuples of R, the tuple whose i-th coordinate is f applied to the i-th coordinates of t1, ..., tm is again in R; in other words, R is closed under the coordinate-wise application of f. A set of relations Γ is said to have f as a polymorphism if every relation in Γ does.

This definition allows for the algebraic formulation of Schaefer's dichotomy theorem.
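
For example, the binary OR operation ∨ is a polymorphism of the relation {(0,1), (1,0), (1,1)} defined by the clause x ∨ y: the coordinate-wise OR of any two of its tuples again has at least one coordinate equal to 1 and therefore lies in the relation. The binary AND operation ∧ is not a polymorphism of this relation, since the coordinate-wise AND of (0,1) and (1,0) is (0,0).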

Let Γ be a finite constraint language over the Boolean domain. The problem CSP(Γ) is decidable in polynomial time if Γ has one of the following six operations as a polymorphism:

1. the constant unary operation 0;
2. the constant unary operation 1;
3. the binary AND operation ∧;
4. the binary OR operation ∨;
5. the ternary majority operation Maj(x, y, z) = (x ∧ y) ∨ (x ∧ z) ∨ (y ∧ z);
6. the ternary minority operation Min(x, y, z) = x ⊕ y ⊕ z.

Otherwise, the problem CSP(Γ) is NP-complete.

In this formulation, it is easy to check if any of the tractability conditions hold.
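
For instance, the following Python sketch (the names are illustrative) tests, by brute force, whether each of the six operations above is a polymorphism of every relation in a given Γ; for Γ = {nae}, the not-all-equal relation, it reports that none of the six holds, in line with the NP-completeness of monotone NAE-3SAT.

    from itertools import product

    def is_polymorphism(f, arity, relation):
        """Check that the relation is closed under applying the arity-ary
        operation f coordinate-wise to every choice of `arity` tuples."""
        return all(
            tuple(f(*column) for column in zip(*rows)) in relation
            for rows in product(relation, repeat=arity)
        )

    # The six operations from the algebraic formulation of Schaefer's theorem.
    OPERATIONS = {
        "constant 0": (1, lambda x: 0),
        "constant 1": (1, lambda x: 1),
        "AND":        (2, lambda x, y: x & y),
        "OR":         (2, lambda x, y: x | y),
        "majority":   (3, lambda x, y, z: (x & y) | (x & z) | (y & z)),
        "minority":   (3, lambda x, y, z: x ^ y ^ z),
    }

    # Γ = {nae}, the ternary not-all-equal relation.
    NAE = set(product((0, 1), repeat=3)) - {(0, 0, 0), (1, 1, 1)}
    gamma = [NAE]

    for name, (arity, f) in OPERATIONS.items():
        holds = all(is_polymorphism(f, arity, relation) for relation in gamma)
        print(name, holds)  # all six print False, so CSP({nae}) is NP-complete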

A relation R is called primitive positive definable, or pp-definable for short, from a set Γ of relations if R(v1, ... , vk) ⇔ ∃x1 ... xm. C holds for some conjunction C of constraints from Γ and equations over the variables {v1,...,vk, x1,...,xm}.

For example, if Γ consists of the ternary relation nae(x,y,z), which holds if x, y, z are not all equal, and R(x,y,z) is x∨y∨z, then R can be pp-defined from nae together with the Boolean constants 0 and 1, as sketched below.
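
One possible such definition, a reconstruction rather than Schaefer's or Chen's original formula, and assuming that the constant relations {0} and {1} may be used alongside nae, is

    R(x, y, z) ⇔ ∃a ∃b. nae(x, y, a) ∧ nae(a, b, 0) ∧ nae(a, b, 1) ∧ nae(b, z, 0)

The two middle constraints force b = ¬a, the last constraint then states ¬a ∨ z, and checking all eight assignments of x, y, z shows that the right-hand side is satisfiable exactly when x ∨ y ∨ z holds.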

The set of all relations that are pp-definable from Γ is denoted by ≪Γ≫.

The set of all polymorphisms of Γ is denoted by Pol(Γ); conversely, for a set B of Boolean operations, Inv(B) denotes the set of all relations that have every operation in B as a polymorphism. Pol and Inv together form an antitone Galois connection, and over a finite domain its closed sets of relations are exactly the sets of pp-definable relations: ≪Γ≫ = Inv(Pol(Γ)).

Since relations that are pp-definable from Γ can be added to Γ without changing the complexity of CSP(Γ) up to polynomial-time reductions, two relation sets having the same polymorphisms lead to the same computational complexity.[7]
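
For example, the complement operation x ↦ ¬x is a polymorphism of the relation nae but not of the relation defined by x∨y∨z, since it maps the satisfying tuple (1,1,1) to (0,0,0); this is why the pp-definition of x∨y∨z sketched above cannot avoid using the constants in addition to nae-constraints.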

The analysis was later fine-tuned: CSP(Γ) is either solvable in co-NLOGTIME, L-complete, NL-complete, ⊕L-complete, P-complete or NP-complete, and given Γ, one can decide in polynomial time which of these cases holds.[9]

If the problem is to count the number of solutions, which is denoted by #CSP(Γ), then there is a similar result for the Boolean domain by Creignou and Hermann.[10]

Specifically, a finite set of relations S over the Boolean domain defines a polynomial-time computable counting problem if every relation in S is equivalent to a conjunction of affine formulae; otherwise, #CSP(S) is #P-complete.[2]
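
The affine case is tractable because an affine formula is a system of linear equations over the two-element field GF(2), so the satisfying assignments can be counted by Gaussian elimination: the count is 0 if the system is inconsistent and 2^(n − r) otherwise, where n is the number of variables and r is the rank of the system. A minimal Python sketch of this standard idea (the representation and names are illustrative):

    def count_affine_solutions(equations, n):
        """Count solutions of a system of GF(2) equations.
        Each equation is (coeffs, rhs) with coeffs a length-n 0/1 list, meaning
        coeffs[0]*x0 XOR ... XOR coeffs[n-1]*x(n-1) = rhs."""
        rows = [coeffs[:] + [rhs] for coeffs, rhs in equations]
        rank, col = 0, 0
        while col < n and rank < len(rows):
            pivot = next((r for r in range(rank, len(rows)) if rows[r][col]), None)
            if pivot is None:
                col += 1
                continue
            rows[rank], rows[pivot] = rows[pivot], rows[rank]
            for r in range(len(rows)):
                if r != rank and rows[r][col]:
                    rows[r] = [a ^ b for a, b in zip(rows[r], rows[rank])]
            rank += 1
            col += 1
        # An all-zero left-hand side with right-hand side 1 means "0 = 1".
        if any(not any(row[:n]) and row[n] for row in rows):
            return 0
        return 2 ** (n - rank)

    # x0 XOR x1 = 1 and x1 XOR x2 = 0 over three variables: 2^(3-2) = 2 solutions.
    print(count_affine_solutions([([1, 1, 0], 1), ([0, 1, 1], 0)], 3))  # 2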

For larger domains, a necessary condition for the tractability of #CSP(Γ), namely the existence of a Mal'tsev polymorphism (defined below), was given by Bulatov and Dalmau.[11]

Let Γ be a finite constraint language over the Boolean domain. The problem #CSP(Γ) is computable in polynomial time if Γ has a Mal'tsev operation as a polymorphism; otherwise, #CSP(Γ) is #P-complete. A Mal'tsev operation is a ternary operation m satisfying the identities m(x, y, y) = m(y, y, x) = x.

An example of a Mal'tsev operation is the Minority operation given in the modern, algebraic formulation of Schaefer's dichotomy theorem above.

There are a total of 4 Mal'tsev operations on Boolean variables, determined by the values of m(0, 1, 0) and m(1, 0, 1), since the Mal'tsev identities already fix m on all other argument triples.
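
This count can be checked directly. A short Python sketch that enumerates all ternary Boolean operations as value tables and keeps the Mal'tsev ones:

    from itertools import product

    triples = list(product((0, 1), repeat=3))
    maltsev = []

    # Enumerate all 2^8 ternary Boolean operations and keep those satisfying
    # the Mal'tsev identities m(x, y, y) = m(y, y, x) = x.
    for values in product((0, 1), repeat=len(triples)):
        m = dict(zip(triples, values))
        if all(m[(x, y, y)] == x and m[(y, y, x)] == x
               for x in (0, 1) for y in (0, 1)):
            maltsev.append(m)

    print(len(maltsev))  # 4
    print(sorted((m[(0, 1, 0)], m[(1, 0, 1)]) for m in maltsev))
    # [(0, 0), (0, 1), (1, 0), (1, 1)]: the two free values; (1, 0) is the
    # minority operation x ⊕ y ⊕ z.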

On other domains, such as groups, examples of Mal'tsev operations include x − y + z on abelian groups and, more generally, xy⁻¹z on arbitrary groups.

For larger domains, even for a domain of size three, the existence of a Mal'tsev polymorphism for Γ is an insufficient condition for the tractability of #CSP(Γ).