Reduction (complexity)

"Harder" means requiring a higher estimate of the computational resources needed in a given context (e.g., higher time complexity, a greater memory requirement, or the need for extra processor cores for a parallel solution compared to a single-threaded one).

However, the reduction becomes much harder if we add the restriction that we can only use the squaring function one time, and only at the end.

Using this limited form of reduction, we have shown the unsurprising result that multiplication is harder in general than squaring.
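Both directions of this example can be made concrete. The following sketch (the function names are illustrative) shows the unrestricted reduction of multiplication to squaring via the identity a·b = ((a+b)² − a² − b²)/2, and the limited reduction of squaring to multiplication, which uses multiplication exactly once, at the end:

```python
def square(x):
    # Squaring, taken as the given primitive.
    return x * x

def multiply(a, b):
    # Multiplication reduced to squaring via
    #   a*b = ((a+b)^2 - a^2 - b^2) / 2.
    # Note that this calls square three times, so it does NOT satisfy
    # the restricted form ("one call, only at the end") discussed above.
    return (square(a + b) - square(a) - square(b)) // 2

def square_via_multiply(x):
    # The other direction: squaring reduced to multiplication with a
    # single call to the multiplication primitive as the last step.
    return x * x  # one multiplication, at the end
```

Because squaring reduces to multiplication even under the restricted form while the converse fails in general, multiplication is at least as hard as squaring.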

A reduction is a preorder, that is, a reflexive and transitive relation, on P(N), the power set of the natural numbers (formally, a subset of P(N)×P(N)).
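Reflexivity and transitivity can be checked on a toy instance of many-one reducibility. In the sketch below (the sets and maps are invented for illustration), a language A many-one reduces to B via a computable f with x ∈ A iff f(x) ∈ B; the identity map witnesses reflexivity, and composing two reductions witnesses transitivity:

```python
# Languages as membership predicates over the naturals.
EVEN  = lambda n: n % 2 == 0   # A: even numbers
MULT4 = lambda n: n % 4 == 0   # B: multiples of 4
MULT8 = lambda n: n % 8 == 0   # C: multiples of 8

f = lambda n: 2 * n   # A <= B: n is even iff 2n is a multiple of 4
g = lambda n: 2 * n   # B <= C: n is a multiple of 4 iff 2n is a multiple of 8

identity = lambda n: n   # reflexivity: any language reduces to itself
h = lambda n: g(f(n))    # transitivity: the composition reduces A to C

for n in range(100):
    assert EVEN(n) == MULT4(f(n))        # f is a correct reduction
    assert MULT4(n) == MULT8(g(n))       # g is a correct reduction
    assert EVEN(n) == EVEN(identity(n))  # reflexivity
    assert EVEN(n) == MULT8(h(n))        # transitivity via composition
```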

Likewise, a reduction computing a noncomputable function can reduce an undecidable problem to a decidable one.

As Michael Sipser points out in Introduction to the Theory of Computation: "The reduction must be easy, relative to the complexity of typical problems in the class [...] If the reduction itself were difficult to compute, an easy solution to the complete problem wouldn't necessarily yield an easy solution to the problems reducing to it."

In the case of optimization (maximization or minimization) problems, we often think in terms of approximation-preserving reductions.

The following example shows how to use a reduction from the halting problem to prove that a language is undecidable.

Suppose H(M, w) is the problem of determining whether a given Turing machine M halts (by accepting or rejecting) on input string w. This language is known to be undecidable.
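A typical target for such a reduction is, for instance, the language of machines that halt on the empty input. The sketch below (encoding machines as Python callables; all names are illustrative, not from the original) shows the key construction: from (M, w), build a machine M' that ignores its own input and simulates M on w, so that M' halts on ε iff M halts on w. A decider for the target language would therefore yield a decider for H, which cannot exist.

```python
def build_m_prime(M, w):
    """Construct M': a machine that ignores its own input and runs M on w.
    M' halts on the empty string (indeed on every input) iff M halts on w."""
    def m_prime(_unused_input):
        return M(w)
    return m_prime

def decide_H(M, w, halts_on_empty):
    # If a decider `halts_on_empty` for the target language existed
    # (it cannot), composing it with the construction would decide H.
    return halts_on_empty(build_m_prime(M, ""))
```

As a sanity check of the construction only (the impossibility itself cannot be demonstrated by running code): for a machine that halts, such as M(w) = len(w), the built M' returns M's result on w regardless of its own input.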

Example of a reduction from the boolean satisfiability problem (A ∨ B) ∧ (¬A ∨ ¬B ∨ ¬C) ∧ (¬A ∨ B ∨ C) to a vertex cover problem. The blue vertices form a minimum vertex cover, and the blue vertices in the gray oval correspond to a satisfying truth assignment for the original formula.
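The construction behind that figure can be sketched programmatically. The Python program below (an illustrative sketch; the gadget encoding is one standard variant, and the names are mine) builds the graph: an edge per variable between its two literal vertices, a clique per clause on its literal occurrences, and an edge tying each occurrence to the matching variable vertex. The formula is then satisfiable iff the graph has a vertex cover of size n + Σ(kᵢ − 1), here 3 + 5 = 8, which brute force confirms on this small instance.

```python
from itertools import combinations

# (A or B) and (not A or not B or not C) and (not A or B or C),
# encoded as lists of (variable, negated) literals.
formula = [
    [("A", False), ("B", False)],
    [("A", True), ("B", True), ("C", True)],
    [("A", True), ("B", False), ("C", False)],
]
variables = ["A", "B", "C"]

vertices, edges = [], set()

# Variable gadgets: one vertex per polarity, joined by an edge, which
# forces at least one of {x, not-x} into any cover.
for v in variables:
    vertices += [("var", v, False), ("var", v, True)]
    edges.add((("var", v, False), ("var", v, True)))

# Clause gadgets: a clique on the clause's literal occurrences, each
# occurrence also wired to its variable-gadget vertex.
for i, clause in enumerate(formula):
    occ = [("cls", i, j) for j in range(len(clause))]
    vertices += occ
    for a, b in combinations(range(len(occ)), 2):
        edges.add((occ[a], occ[b]))
    for j, (v, neg) in enumerate(clause):
        edges.add((occ[j], ("var", v, neg)))

# Satisfiable iff a cover of size n + sum(k_i - 1) exists; here 3 + 5 = 8.
target = len(variables) + sum(len(c) - 1 for c in formula)

def min_vertex_cover_size(vertices, edges):
    # Brute force is fine at this size (14 vertices).
    for k in range(len(vertices) + 1):
        for s in combinations(vertices, k):
            ss = set(s)
            if all(u in ss or w in ss for u, w in edges):
                return k

# min_vertex_cover_size(vertices, edges)  # == target == 8, so satisfiable
```

Brute-forcing the minimum cover is of course only viable for this toy instance; the point of the reduction is precisely that an efficient vertex cover algorithm would yield an efficient satisfiability algorithm.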