In mathematics, the word problem for a finitely generated group is the algorithmic problem of deciding whether two words in the generators represent the same element of the group. The solvability of this problem does not depend on the choice of finite generating set, so one can speak unambiguously of the decidability of the word problem for the finitely generated group in question.
Throughout the history of the subject, computations in groups have been carried out using various normal forms.
In 1912 Max Dehn gave an algorithm that solves both the word problem and the conjugacy problem for the fundamental groups of closed orientable two-dimensional manifolds of genus greater than or equal to 2.[2] Subsequent authors have greatly extended Dehn's algorithm and applied it to a wide range of group-theoretic decision problems.[3][4][5]
It was shown by Pyotr Novikov in 1955 that there exists a finitely presented group whose word problem is undecidable.[7] The word problem was one of the first examples of an unsolvable problem to be found not in mathematical logic or the theory of algorithms, but in one of the central branches of classical mathematics, algebra.
For example, polycyclic groups have solvable word problems since the normal form of an arbitrary word in a polycyclic presentation is readily computable. Other algorithms for groups may, in suitable circumstances, also solve the word problem; see the Todd–Coxeter algorithm[8] and the Knuth–Bendix completion algorithm.
For instance, Dehn's algorithm does not solve the word problem for the fundamental group of the torus; yet that group is the direct product of two infinite cyclic groups, and so it has a solvable word problem, as sketched below.
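The following minimal sketch illustrates this, assuming the usual convention that a capital letter denotes the inverse of the corresponding generator; the function names are chosen here for illustration only. Since the torus group is free abelian on two generators, two words are equal precisely when their exponent sums agree.

```python
def exponent_sums(word):
    """Exponent sums of a and b in a word over a, b, A (= a^-1), B (= b^-1).

    The fundamental group of the torus is Z x Z, so a word represents the
    identity exactly when both exponent sums vanish.
    """
    return word.count("a") - word.count("A"), word.count("b") - word.count("B")

def equal_in_torus_group(u, v):
    """Decide whether two words represent the same element of Z x Z."""
    return exponent_sums(u) == exponent_sums(v)

# 'abAB' is the commutator relator, hence trivial; 'ab' and 'ba' are equal.
assert equal_in_torus_group("abAB", "")
assert equal_in_torus_group("ab", "ba")
assert not equal_in_torus_group("a", "b")
```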
In more concrete terms, the uniform word problem can be expressed as a rewriting question for literal strings.
In fact the relations provide a list of strings that can be either introduced where we want, or cancelled out whenever we see them, without changing the 'value', i.e. the group element that is the result of the multiplication.
The result is that the word problem, here for the cyclic group of order three, is solvable.
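To make the rewriting point of view concrete, here is a minimal sketch assuming the standard presentation ⟨a | a³⟩ of the cyclic group of order three, with A standing for a⁻¹; the names are illustrative. The relation a³ = 1 allows every word to be rewritten to one of the normal forms '', 'a', 'aa', which settles the word problem for this group.

```python
def normal_form_c3(word):
    """Normal form in the presentation <a | a^3>, with A standing for a^-1.

    Use the relation a^3 = 1 to replace each A by aa, then cancel the
    relator aaa as often as possible; every word reduces to '', 'a' or 'aa'.
    """
    word = word.replace("A", "aa")      # a^-1 = a^2, since a^3 = 1
    return "a" * (word.count("a") % 3)  # delete aaa wherever it occurs

def equal_in_c3(u, v):
    """Two words represent the same element iff their normal forms agree."""
    return normal_form_c3(u) == normal_form_c3(v)

assert normal_form_c3("aaa") == ""  # the relator represents the identity
assert equal_in_c3("aA", "")        # free cancellation
assert equal_in_c3("aa", "A")       # a^2 = a^-1
assert not equal_in_c3("a", "aa")
```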
In general, it is not true that one can obtain a canonical form for the elements by stepwise cancellation. One may have to use relations to expand a string many times over in order eventually to find a cancellation that brings the length right down. The upshot is that, in the worst case, the relation between strings that says they are equal in the group is an undecidable problem.
such that the function halts precisely when the two input words represent the same group element. The following will be proved as an example of the use of this technique:

Theorem: A finitely presented residually finite group has a solvable word problem.

Proof: Suppose G is a finitely presented, residually finite group and w is a word in its generators. There is a partial algorithm that halts if w = 1 in G (enumerate the consequences of the relators and watch for w). Because G is residually finite, there is also a partial algorithm that halts if w ≠ 1 in G: enumerate the maps from the generators to finite symmetric groups, use the relators to check which of them define homomorphisms, and watch for a homomorphism that does not send w to the identity. Running the two partial algorithms in parallel decides whether w = 1 in G.
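The first of these partial algorithms can be sketched directly. The code below (illustrative names, capital letters for inverses) searches for a way of writing the free reduction of a word as the free reduction of a product of conjugated relators, which is possible exactly when the word represents the identity. The parameter max_size caps the search so that the demonstration terminates; the genuine partial algorithm would search without bound, halting only on words that are trivial in the group.

```python
from itertools import product

def free_reduce(word):
    """Freely reduce a word; capital letters denote inverses (A = a^-1)."""
    out = []
    for c in word:
        if out and out[-1] == c.swapcase():
            out.pop()
        else:
            out.append(c)
    return "".join(out)

def invert(word):
    """Inverse of a word: reverse it and invert each letter."""
    return "".join(c.swapcase() for c in reversed(word))

def is_trivial(word, generators, relators, max_size):
    """Search for a demonstration that word = 1 in <generators | relators>.

    A word equals 1 in the group iff its free reduction is the free
    reduction of some product of conjugates g r g^-1 of relators r and
    their inverses.  max_size bounds both the length of the conjugating
    words and the number of factors, so this sketch always terminates.
    """
    target = free_reduce(word)
    letters = list(generators) + [g.swapcase() for g in generators]
    conjugators = [""] + ["".join(p) for n in range(1, max_size)
                          for p in product(letters, repeat=n)]
    pieces = {free_reduce(g + r + invert(g))
              for g in conjugators
              for r in list(relators) + [invert(r) for r in relators]}
    frontier = {""}                     # products of 0 conjugated relators
    for _ in range(max_size):
        if target in frontier:
            return True
        frontier = {free_reduce(w + p) for w in frontier for p in pieces}
    return target in frontier

# In <a, b | aba^-1b^-1>, the word a^2 b a^-2 b^-1 represents the identity.
assert is_trivial("aabAAB", ["a", "b"], ["abAB"], max_size=4)
# 'ab' is not trivial; with the cap the search gives up and returns False
# (a genuine partial algorithm would simply never halt on this input).
assert not is_trivial("ab", ["a", "b"], ["abAB"], max_size=3)
```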
The criterion given above for the solvability of the word problem in a single group can be extended by a straightforward argument to a criterion for the uniform solvability of the word problem for a class of finitely presented groups.
It is easy to think that this demonstrates a uniform solution of the word problem for the class of groups under consideration. If this were the case, the non-existence of a universal solvable word problem group would follow easily from the Boone–Rogers theorem, which states that there is no uniform partial algorithm that solves the word problem in all finitely presented groups with solvable word problem.
If there were a recursive function that mapped (finitely generated) presentations of groups in the class to solutions of their word problems, then the word problem for the class would indeed be uniformly solvable.
But there is no reason, in general, to suppose that such a recursive function exists.
Nevertheless, because the presentation is finite, it is possible to distinguish between homomorphisms and non-homomorphisms by using the solution to the word problem in the target group: a candidate assignment of the generators extends to a homomorphism exactly when it sends every relator to the identity, a check sketched below.
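Here is a minimal sketch of that check, taking a symmetric group as the target, since its word problem is solved simply by multiplying permutations; the function names and the tuple encoding of permutations are assumptions made for the example.

```python
def evaluate(word, images):
    """Evaluate a word in the target group S_n.

    Permutations are tuples (p maps i to p[i]); lowercase letters are
    generators and capital letters their inverses.
    """
    n = len(next(iter(images.values())))
    result = tuple(range(n))                      # identity permutation
    for c in word:
        p = images[c.lower()]
        if c.isupper():                           # take the inverse permutation
            inv = [0] * n
            for i, v in enumerate(p):
                inv[v] = i
            p = tuple(inv)
        result = tuple(result[v] for v in p)      # apply p first, then the rest
    return result

def extends_to_homomorphism(relators, images):
    """An assignment of generators extends to a homomorphism of the
    finitely presented group iff every relator maps to the identity."""
    n = len(next(iter(images.values())))
    identity = tuple(range(n))
    return all(evaluate(r, images) == identity for r in relators)

# For <a, b | aba^-1b^-1>, the images of a and b must commute.
commuting = {"a": (1, 0, 2), "b": (0, 1, 2)}      # a transposition and the identity
non_commuting = {"a": (1, 0, 2), "b": (0, 2, 1)}  # two non-commuting transpositions
assert extends_to_homomorphism(["abAB"], commuting)
assert not extends_to_homomorphism(["abAB"], non_commuting)
```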
There are a number of results that relate solvability of the word problem and algebraic structure.
The most significant of these is the Boone–Higman theorem: a finitely generated group has a solvable word problem if and only if it can be embedded in a simple group that can itself be embedded in a finitely presented group. It is widely believed that it should be possible to do the construction so that the simple group itself is finitely presented. If so, one would expect it to be difficult to prove, as the mapping from presentations to simple groups would have to be non-recursive.
The following has been proved by Bernhard Neumann and Angus Macintyre: a finitely presented group has a solvable word problem if and only if it can be embedded in every algebraically closed group. What is remarkable about this is that the algebraically closed groups are so wild that none of them has a recursive presentation.
The oldest result relating algebraic structure to solvability of the word problem is Kuznetsov's theorem: a recursively presented simple group S has a solvable word problem. To prove this, let ⟨X | R⟩ be a recursive presentation of S and choose an element a ≠ 1 in S. For any word w in the generators X, the group presented by ⟨X | R ∪ {w}⟩ is the quotient of S by the normal closure of w; since S is simple, this quotient is trivial exactly when w ≠ 1 in S, and in that case a becomes equal to 1 in the quotient. Enumerating the consequences of R ∪ {w} and watching for a therefore gives a partial algorithm that halts precisely when w ≠ 1 in S. The existence of such a function is sufficient to prove the word problem is solvable for S: run it in parallel with the partial algorithm that halts precisely when w = 1 in S, which exists because S is recursively presented.
This proof does not prove the existence of a uniform algorithm for solving the word problem for this class of groups.
The non-uniformity resides in choosing a non-trivial element of the simple group.
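The shape of this argument, two partial algorithms run in alternation with the chosen non-trivial element supplied as extra data, can be shown schematically. The toy searches below only simulate the two enumerations (of the consequences of R, and of the consequences of R together with the word being tested); the names are illustrative, not Kuznetsov's actual construction.

```python
def dovetail(halts_if_trivial, halts_if_nontrivial):
    """Alternate single steps of two semi-algorithms, modelled as
    generators that yield None while still searching and a non-None
    value when they halt.  Exactly one of them is guaranteed to halt,
    so the alternation is a total decision procedure."""
    for found_trivial, found_nontrivial in zip(halts_if_trivial, halts_if_nontrivial):
        if found_trivial is not None:
            return True
        if found_nontrivial is not None:
            return False

def search_consequences_of_R(word_is_trivial):
    """Toy stand-in for enumerating consequences of R, watching for w."""
    while True:
        yield "w found" if word_is_trivial else None

def search_for_a_in_quotient(word_is_trivial):
    """Toy stand-in for enumerating consequences of R plus w, watching for
    the chosen non-trivial element a; supplying a is exactly where the
    non-uniformity of the proof enters."""
    while True:
        yield None if word_is_trivial else "a found"

assert dovetail(search_consequences_of_R(True), search_for_a_in_quotient(True)) is True
assert dovetail(search_consequences_of_R(False), search_for_a_in_quotient(False)) is False
```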