Space complexity

The space complexity of an algorithm or a data structure is the amount of memory space required to solve an instance of the computational problem as a function of characteristics of the input.

It is the memory required by an algorithm until it executes completely.

Similar to time complexity, space complexity is often expressed asymptotically in big O notation, such as O(n), O(n log n), O(2^n), etc., where n is a characteristic of the input influencing space complexity.
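As an illustration (not part of the source text), here is a minimal Python sketch contrasting two prefix-sum routines over an input of size n: the first uses only a constant amount of extra working memory, while the second allocates an extra list of size n. The function names are hypothetical.

```python
def running_sums_in_place(values: list[int]) -> None:
    """Replace each element by the prefix sum so far -- O(1) extra space.

    Beyond the input list itself, only a constant number of scalar
    variables are used, so the extra working memory is O(1).
    """
    total = 0
    for i in range(len(values)):
        total += values[i]
        values[i] = total


def running_sums_copy(values: list[int]) -> list[int]:
    """Build a new prefix-sum list -- O(n) extra space.

    The output list grows with the input, so the extra memory is O(n).
    """
    out = []
    total = 0
    for v in values:
        total += v
        out.append(total)
    return out
```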

Analogously to time complexity classes DTIME(f(n)) and NTIME(f(n)), the complexity classes DSPACE(f(n)) and NSPACE(f(n)) are the sets of languages that are decidable by deterministic (respectively, non-deterministic) Turing machines that use O(f(n)) space.

The complexity classes PSPACE and NPSPACE allow f to be any polynomial, analogously to P and NP. That is, PSPACE = ⋃_{c ≥ 1} DSPACE(n^c) and NPSPACE = ⋃_{c ≥ 1} NSPACE(n^c).

The space hierarchy theorem states that, for all space-constructible functions f(n), there exists a problem that can be solved by a machine with f(n) memory space, but cannot be solved by a machine with asymptotically less than f(n) space.

The following containments between complexity classes hold:

DTIME(f(n)) ⊆ DSPACE(f(n)) ⊆ NSPACE(f(n)) ⊆ DTIME(2^O(f(n)))

Furthermore, Savitch's theorem gives the reverse containment that if f ∈ Ω(log n), then NSPACE(f(n)) ⊆ DSPACE((f(n))²). As a direct corollary, PSPACE = NPSPACE.

This result is surprising because it suggests that non-determinism can reduce the space necessary to solve a problem only by a small amount.
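To make the mechanism concrete, here is a minimal Python sketch (illustrative, not from the source) of the recursive midpoint search that underlies Savitch's theorem, applied to graph reachability: reachability within 2^k steps is decided by trying every vertex as a midpoint, so the recursion depth is about log n and each stack frame stores only O(log n) bits, giving O(log² n) working space at the cost of a much larger running time. The graph below is a hypothetical example.

```python
import math


def reachable(adj, u, v, k):
    """Decide whether v is reachable from u in at most 2**k steps.

    Savitch-style midpoint recursion: the recursion depth is k, and each
    frame stores only a few vertex indices (O(log n) bits each), so the
    working space is O(k * log n) = O(log^2 n) for k = ceil(log2 n).
    The trade-off is a quasi-polynomial running time.
    """
    if k == 0:
        return u == v or v in adj[u]
    # Try every vertex as a possible midpoint of a path of length <= 2**k.
    return any(reachable(adj, u, w, k - 1) and reachable(adj, w, v, k - 1)
               for w in range(len(adj)))


# Hypothetical example graph: 0 -> 1 -> 2 -> 3
adj = [{1}, {2}, {3}, set()]
k = max(1, math.ceil(math.log2(len(adj))))
print(reachable(adj, 0, 3, k))  # True
```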

In contrast, the exponential time hypothesis conjectures that for time complexity, there can be an exponential gap between deterministic and non-deterministic complexity.

The Immerman–Szelepcsényi theorem states that, again for f ∈ Ω(log n), NSPACE(f(n)) is closed under complementation. This shows another qualitative difference between time and space complexity classes, as nondeterministic time complexity classes are not believed to be closed under complementation; for instance, it is conjectured that NP ≠ co-NP.[3][4]

L or LOGSPACE is the set of problems that can be solved by a deterministic Turing machine using only O(log n) memory space with regard to the input size.

Even a single counter that can index the entire n-bit input requires log n space, so LOGSPACE algorithms can maintain only a constant number of counters or other variables of similar bit complexity.
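As an illustration (not from the source), the following Python sketch decides a simple property, balanced parentheses, using a single integer counter over a read-only input; the counter never exceeds n, so it fits in O(log n) bits, matching the constant-number-of-counters pattern described above.

```python
def balanced(s: str) -> bool:
    """Check whether a parenthesis string is balanced.

    The only working storage is one integer counter, bounded by the
    input length n, so it needs only O(log n) bits.  The input string
    itself is treated as read-only and not counted as working space.
    """
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:  # a closing parenthesis with no match
                return False
    return depth == 0


print(balanced("(()())"))  # True
print(balanced("())("))    # False
```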

LOGSPACE and other sub-linear space complexity classes are useful when processing large data that cannot fit into a computer's RAM.

This class also sees use in the field of pseudorandomness and derandomization, where researchers consider the open problem of whether L = RL.[5][6]

The corresponding nondeterministic space complexity class is NL.

Auxiliary space complexity, the space used beyond that occupied by the input, can be formally defined in terms of a Turing machine with a separate input tape which cannot be written to, only read, and a conventional working tape which can be written to.

The auxiliary space complexity is then defined (and analyzed) via the working tape.

For example, consider the depth-first search of a balanced binary tree with n nodes: its auxiliary space complexity is Θ(log n).
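A minimal Python sketch of this example (illustrative, not from the source): the tree itself counts as the input, while the auxiliary space is the recursion stack, whose depth equals the height of the tree, which is Θ(log n) when the tree is balanced.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Node:
    value: int
    left: "Optional[Node]" = None
    right: "Optional[Node]" = None


def dfs(node: Optional[Node]) -> None:
    """Pre-order depth-first traversal of a binary tree.

    The tree (n nodes) is the input and is not counted; the auxiliary
    space is the recursion stack, whose depth equals the tree height --
    Theta(log n) for a balanced tree.
    """
    if node is None:
        return
    print(node.value)   # visit the node
    dfs(node.left)
    dfs(node.right)


# Hypothetical balanced tree with 3 nodes.
dfs(Node(2, Node(1), Node(3)))
```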