Operator grammar

Operator grammar is the culmination of the life work of Zellig Harris, whose major publications appeared toward the end of the twentieth century.

The theory is consistent with the idea that language evolved gradually, with each successive generation introducing new complexity and variation.

Together, three constraints (dependency, likelihood, and reduction) provide a theory of language information: dependency builds a predicate–argument structure; likelihood creates distinct meanings; reduction allows compact forms for communication.

The categories in operator grammar are universal: they are defined purely in terms of how words relate to other words, and do not rely on an external set of categories such as noun, verb, adjective, adverb, preposition, or conjunction.

The dependency constraint creates a structure (syntax) in which any word of the appropriate class can be an argument for a given operator.
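The dependency constraint can be sketched as a small toy model. The lexicon, the class labels (N for a zero-level word, O for an operator), and the checking function below are invented for illustration; they are not Harris's formal apparatus, only an assumption about how the constraint could be encoded.

```python
# Toy illustration of the dependency constraint: each operator lists the
# classes its argument slots require, and any word of the right class may
# fill a slot. The lexicon and labels are hypothetical.

ARG_REQUIREMENTS = {          # operator -> required classes of its arguments
    "sleeps": ["N"],          # takes one zero-level argument
    "reads":  ["N", "N"],     # takes two zero-level arguments
    "is_probable": ["O"],     # takes an operator as its argument
}

WORD_CLASS = {
    "John": "N", "Mary": "N", "books": "N",
    "sleeps": "O", "reads": "O", "is_probable": "O",
}

def well_formed(operator, args):
    """True when every argument slot is filled by a word of the required
    class; the dependency constraint imposes no further restriction."""
    required = ARG_REQUIREMENTS.get(operator, [])
    return (len(args) == len(required) and
            all(WORD_CLASS[a] == c for a, c in zip(args, required)))

print(well_formed("reads", ["John", "books"]))   # True
print(well_formed("sleeps", ["reads"]))          # False: slot needs an N
print(well_formed("is_probable", ["sleeps"]))    # True: operator on operator
```

Note that the check accepts any class-appropriate filler ("Mary reads John" as readily as "John reads books"); preferences among fillers belong to the likelihood constraint, not to syntax.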

The likelihood constraint places additional restrictions on this structure by making some operator/argument combinations more likely than others.

The reduction constraint acts on high-likelihood combinations of operators and arguments, producing more compact forms.
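The interaction of the likelihood and reduction constraints can be sketched as follows. The likelihood values, the threshold, and the reduced forms here are all invented for illustration; the point is only that reduction is licensed by high expectedness, while low-likelihood combinations must stay explicit to preserve their information.

```python
# Hypothetical sketch of likelihood-gated reduction: a combination may be
# compacted only when its co-occurrence likelihood is high enough that
# little information is lost. All numbers and forms are toy values.

LIKELIHOOD = {                       # toy operator/argument likelihoods
    ("wears", "hat"): 0.9,
    ("wears", "justice"): 0.01,
}

REDUCED_FORM = {                     # invented compact renderings
    ("wears", "hat"): "wears one",
}

def maybe_reduce(operator, argument, threshold=0.5):
    """Return the compact form only for high-likelihood combinations;
    otherwise keep the full, explicit form."""
    key = (operator, argument)
    if LIKELIHOOD.get(key, 0.0) >= threshold and key in REDUCED_FORM:
        return REDUCED_FORM[key]
    return f"{operator} {argument}"

print(maybe_reduce("wears", "hat"))      # "wears one"
print(maybe_reduce("wears", "justice"))  # "wears justice" (stays explicit)
```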

Certain reductions shorten words, creating pronouns, suffixes, and prefixes (morphology).

Modifiers are the result of several of these kinds of reductions, which give rise to adjectives, adverbs, prepositional phrases, subordinate clauses, etc.

Each operator in a sentence contributes information according to its likelihood of occurrence with its arguments.

The precise contribution of an operator is determined by its selection, the set of words with which it occurs at high frequency.
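A selection in this sense could be approximated from co-occurrence counts, as in the minimal sketch below; the corpus, the frequency cutoff, and the function name are assumptions made for illustration, not a method given by the theory.

```python
from collections import Counter

# Toy approximation of an operator's selection: the set of argument
# words with which it co-occurs at or above a frequency cutoff.
# The pair list and cutoff are invented for illustration.

pairs = [("drinks", "water"), ("drinks", "coffee"), ("drinks", "water"),
         ("drinks", "tea"), ("drinks", "theory")]

def selection(operator, pairs, min_count=2):
    """Return the arguments occurring with `operator` at least
    `min_count` times in the pair list."""
    counts = Counter(arg for op, arg in pairs if op == operator)
    return {arg for arg, n in counts.items() if n >= min_count}

print(selection("drinks", pairs))   # {'water'}
```

On this view, rare combinations such as "drinks theory" fall outside the selection and so read as extended or figurative uses rather than the operator's core contribution.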

Because this process is based on high frequency usage, the meanings of words are relatively stable over time, but can change in accordance with the needs of a linguistic community.