Optimality theory

Optimality theory has its origin in a talk given by Alan Prince and Paul Smolensky in 1991,[1] later developed into a book manuscript by the same authors in 1993.

It arose in part as an alternative to the connectionist theory of harmonic grammar, developed in 1990 by Géraldine Legendre, Yoshiro Miyata and Paul Smolensky.

Variants of OT with connectionist-like weighted constraints continue to be pursued in more recent work (Pater 2009).

Languages without complex clusters differ in how they resolve this problem; some epenthesize (e.g. [falasak], or [falasaka] if all codas are banned) and some delete (e.g. [fas], [fak], [las], [lak]).
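The deletion repairs above can be sketched as a toy candidate generator. The input /flask/ used here is an assumption, chosen because deleting two of its segments yields the forms cited in the text; a real Gen component produces a far larger (in principle unbounded) candidate set including epenthesis candidates.

```python
from itertools import combinations

def deletion_candidates(inp, n):
    """All candidates obtained by deleting exactly n segments from the input."""
    out = set()
    for keep in combinations(range(len(inp)), len(inp) - n):
        out.add("".join(inp[i] for i in keep))
    return out

# Hypothetical input /flask/ (an assumption): deleting two segments
# yields, among others, the deletion repairs cited above.
cands = deletion_candidates("flask", 2)
print(sorted(c for c in cands if c in {"fas", "fak", "las", "lak"}))
# ['fak', 'fas', 'lak', 'las']
```
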

The grammar (ranking of constraints) of the language determines which of the candidates will be assessed as optimal by Eval.

However, it may not be possible to distinguish all of these potential grammars, since not every constraint is guaranteed to have an observable effect in every language.

Max and Dep replace Parse and Fill, proposed by Prince and Smolensky (1993), which stated that "underlying segments must be parsed into syllable structure" and that "syllable positions must be filled with underlying segments", respectively.[9]

This stems from the model adopted by Prince and Smolensky known as containment theory, which assumes that input segments unrealized in the output are not removed but rather "left unparsed" by a syllable.[10]

The model put forth by McCarthy and Prince (1995, 1999), known as correspondence theory, has since replaced containment theory as the standard framework.[13][14]

Local conjunctions are used as a way of circumventing the problem of phonological opacity that arises when analyzing chain shifts.[15]

For example, given the constraints C1, C2, and C3, where C1 dominates C2, which dominates C3 (C1 ≫ C2 ≫ C3), A beats B (is more harmonic than B) if A has fewer violations than B on the highest-ranked constraint that assigns them different numbers of violations. (A is "optimal" if A beats B and the candidate set comprises only A and B.)
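The beats-relation just described amounts to lexicographic comparison of violation vectors ordered by constraint rank, which can be sketched directly. The candidate names and violation counts below are invented for illustration, not taken from a published tableau.

```python
# Violations listed highest-ranked constraint first (C1 >> C2 >> C3).
# Candidates and counts are illustrative assumptions.
tableau = {
    "A": (0, 2, 1),
    "B": (0, 3, 0),
    "C": (1, 0, 0),
}

def beats(a, b, tableau):
    """a beats b iff a has fewer violations on the highest-ranked
    constraint assigning them different numbers of violations --
    exactly Python's lexicographic tuple comparison."""
    return tableau[a] < tableau[b]

def optimal(tableau):
    """Eval: the optimal candidate beats every other candidate."""
    return min(tableau, key=tableau.get)

print(beats("A", "B", tableau))  # True: tie on C1, A has fewer C2 violations
print(optimal(tableau))          # A
```

Note that lower-ranked constraints are irrelevant once a higher-ranked one decides: B's perfect score on C3 cannot save it from its extra C2 violation.
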

An early example proposed by McCarthy and Prince (1994) is the constraint NoCoda, which prohibits syllables from ending in consonants.

In Balangao, NoCoda is not ranked high enough to be always obeyed, as witnessed in roots like taynan (faithfulness to the input prevents deletion of the final /n/).

Brasoveanu and Prince (2005) describe a process known as fusion and the various ways of presenting data in a comparative tableau in order to achieve the necessary and sufficient conditions for a given argument.

The first row reveals that either *SS or Agree must dominate Dep, based on the comparison between [dɪʃɪz] and [dɪʃz].
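A row of this kind can be derived mechanically: each constraint is marked W if it favors the winner, L if it favors the loser, and e if neutral, and the row is satisfied only if some W-marked constraint dominates every L-marked one. The violation counts below are assumptions consistent with the comparison described in the text.

```python
def erc_row(winner_viols, loser_viols, constraints):
    """Comparative-tableau row: W favors the winner, L the loser, e neutral."""
    marks = {}
    for c in constraints:
        if winner_viols[c] < loser_viols[c]:
            marks[c] = "W"
        elif winner_viols[c] > loser_viols[c]:
            marks[c] = "L"
        else:
            marks[c] = "e"
    return marks

# Assumed counts: the winner [dɪʃɪz] epenthesizes (one Dep violation);
# the loser [dɪʃz] has adjacent, voicing-mismatched sibilants.
winner = {"*SS": 0, "Agree": 0, "Dep": 1}
loser  = {"*SS": 1, "Agree": 1, "Dep": 0}

row = erc_row(winner, loser, ["*SS", "Agree", "Dep"])
print(row)  # {'*SS': 'W', 'Agree': 'W', 'Dep': 'L'}
```

Reading the row off: some W-marked constraint (either *SS or Agree) must dominate the L-marked Dep, which is exactly the disjunctive condition stated above.
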

These sorts of problems are why most linguists use a lattice graph to represent necessary and sufficient rankings, as shown below.

Optimality theory has attracted substantial amounts of criticism, most of which is directed at its application to phonology (rather than syntax or other fields).

For example, in Quebec French, high front vowels triggered affrication of /t/ (e.g. /tipik/ → [tˢpɪk]), but the loss of the high vowels (visible at the surface level) has left the affrication with no apparent source.

Such counterbleeding rule orderings are therefore termed opaque (as opposed to transparent), because their effects are not visible at the surface level.
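The counterbleeding interaction can be sketched with ordered string rewrites. The rule contexts below are deliberately simplified assumptions, not a full analysis of Quebec French; they only illustrate how the order of application creates opacity.

```python
import re

def affrication(form):
    # simplified: /t/ -> [ts] before a high front vowel
    return re.sub(r"t(?=i)", "ts", form)

def high_vowel_loss(form):
    # simplified stand-in for high-vowel deletion: drop the first /i/
    return form.replace("i", "", 1)

# Attested (counterbleeding) order: the trigger vowel is deleted only
# after affrication has applied, leaving an opaque surface form.
opaque = high_vowel_loss(affrication("tipik"))       # 'tspik'

# Reverse (bleeding) order: deleting the vowel first removes the
# trigger, so affrication never applies (transparent interaction).
transparent = affrication(high_vowel_loss("tipik"))  # 'tpik'

print(opaque, transparent)
```

In the counterbleeding output the affricate survives with no visible trigger, which is precisely what makes the mapping hard to state as a single surface-true constraint ranking.
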

Idsardi (2006) argues this position, though other linguists dispute this claim on the grounds that Idsardi makes unreasonable assumptions about the constraint set and candidates, and that more moderate instantiations of OT do not present such significant computational problems (see Kornai (2006) and Heinz, Kobele and Riggle (2009)).

The source of this issue may be in terminology: the term "theory" is used differently here than in physics, chemistry, and other sciences.

What predictions are made, and whether they are testable, depends on the specifics of individual proposals (most commonly, this is a matter of the definitions of the constraints used in an analysis).[28]

In practice, implementations of OT often make use of many concepts of phonological theories of representations, such as the syllable, the mora, or feature geometry.

Optimality theory is most commonly associated with the field of phonology, but has also been applied to other areas of linguistics.

Jane Grimshaw, Géraldine Legendre and Joan Bresnan have developed instantiations of the theory within syntax.[34]

Constraint-based analyses have also been proposed for orthography, among others by Richard Wiese[35] and by Silke Hamann and Ilaria Colombo.