Composition of relations

Beginning with Augustus De Morgan,[3] the traditional form of reasoning by syllogism has been subsumed by relational logical expressions and their composition.

The semicolon as an infix notation for composition of relations dates back to Ernst Schröder's textbook of 1895.[5]: 13

Gunther Schmidt has renewed the use of the semicolon, particularly in Relational Mathematics (2011).[2]: 40 [6]

The use of the semicolon coincides with the notation for function composition used (mostly by computer scientists) in category theory,[7] as well as the notation for dynamic conjunction within linguistic dynamic semantics.[8]

The small circle $(R \circ S)$ has been used for the infix notation of composition of relations by John M. Howie in his books considering semigroups of relations.[10] However, the small circle is widely used to represent composition of functions $g(f(x)) = (g \circ f)(x)$, which reverses the sequence in a composition.

The small circle was used in the introductory pages of Graphs and Relations[5]: 18  until it was dropped in favor of juxtaposition (no infix notation).

A further variation encountered in computer science is the Z notation: $\circ$ is used to denote the traditional (right) composition, while left composition is denoted by a fat semicolon ⨾.

For a field $K$ (or more generally a principal ideal domain), the category of relations internal to matrices over $K$ has as its morphisms the linear relations, that is, the linear subspaces of the direct sums $K^n \oplus K^m$. The category of linear relations over the finite field $\mathbb{F}_2$ is isomorphic to the phase-free qubit ZX-calculus modulo scalars.
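To make the composition of linear relations concrete, the following minimal sketch (with made-up generators, not taken from the text) represents two linear relations $\mathbb{F}_2^2 \to \mathbb{F}_2^2$ as subspaces of $\mathbb{F}_2^2 \oplus \mathbb{F}_2^2$, composes them relationally, and checks that the composite is again a linear subspace:

```python
# Illustrative composition of linear relations over F_2.
# A linear relation from F_2^n to F_2^m is a linear subspace of F_2^n (+) F_2^m,
# stored here as a set of concatenated (n+m)-tuples; the generators are made up.

def add(u, v):
    """Componentwise addition modulo 2."""
    return tuple((a + b) % 2 for a, b in zip(u, v))

def span(generators, length):
    """All F_2-linear combinations of the generators."""
    vectors = {tuple([0] * length)}
    for g in generators:
        vectors |= {add(v, g) for v in vectors}
    return vectors

# Two linear relations 2 -> 2, each a subspace of F_2^2 (+) F_2^2 (4-tuples x ++ y).
R = span([(1, 0, 1, 1), (0, 1, 1, 0)], 4)
S = span([(1, 1, 0, 1), (0, 1, 1, 1)], 4)

def compose(R, S, n, m):
    """{(x, z) : there is y with (x, y) in R and (y, z) in S}."""
    return {xy[:n] + yz[m:] for xy in R for yz in S if xy[n:] == yz[:m]}

RS = compose(R, S, 2, 2)
# The composite is closed under addition, hence again a linear relation.
assert all(add(u, v) in RS for u in RS for v in RS)
print(sorted(RS))
```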

Finite binary relations are represented by logical matrices.

The entries of these matrices are either zero or one, depending on whether the relation represented is false or true for the row and column corresponding to compared objects.

Working with such matrices involves the Boolean arithmetic with $1 + 1 = 1$ and $1 \times 1 = 1$.

An entry in the matrix product of two logical matrices will be 1, then, only if the row and column multiplied have a corresponding 1.

Thus the logical matrix of a composition of relations can be found by computing the matrix product of the matrices representing the factors of the composition.

"Matrices constitute a method for computing the conclusions traditionally drawn by means of hypothetical syllogisms and sorites.

As an example, consider the heterogeneous relation $R \subseteq A \times B$ with $A = \{\text{France}, \text{Germany}, \text{Italy}, \text{Switzerland}\}$ and $B = \{\text{French}, \text{German}, \text{Italian}\}$, where $a \mathrel{R} b$ holds when $b$ is a national language of $a$. Since both sets are finite, $R$ can be represented by a logical matrix, assuming rows (top to bottom) and columns (left to right) are ordered alphabetically:

$$R = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 1 & 1 \end{pmatrix}.$$

The converse relation $R^{\mathsf T}$ corresponds to the transposed matrix, and the composition $R^{\mathsf T} ; R$ corresponds to the Boolean matrix product $R^{\mathsf T} R$. This $3 \times 3$ product contains a 1 at every position, while the reversed matrix product computes as:

$$R R^{\mathsf T} = \begin{pmatrix} 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & 1 \\ 1 & 1 & 1 & 1 \end{pmatrix}.$$

Hence any two languages share a nation where they both are spoken (in fact: Switzerland). Vice versa, the question whether two given nations share a language can be answered using $R ; R^{\mathsf T}$, as displayed.
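The two products can be recomputed mechanically; the following short sketch (not part of the article) uses the same alphabetical ordering:

```python
# Recomputing R^T;R and R;R^T for the nations/languages relation above.

nations   = ["France", "Germany", "Italy", "Switzerland"]
languages = ["French", "German", "Italian"]

# R[i][j] = 1 iff languages[j] is a national language of nations[i].
R = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, 1],
     [1, 1, 1]]

def bool_matmul(M, N):
    return [[int(any(M[i][j] and N[j][k] for j in range(len(N))))
             for k in range(len(N[0]))] for i in range(len(M))]

def transpose(M):
    return [list(col) for col in zip(*M)]

print(bool_matmul(transpose(R), R))  # 3x3, all ones: any two languages share a nation
print(bool_matmul(R, transpose(R)))  # 4x4: which nations share a language
```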

For a given set $V$, the collection of all binary relations on $V$ forms a Boolean lattice ordered by inclusion $(\subseteq)$.

In the calculus of relations[16] it is common to represent the complement of a set by an overbar: $\bar{A} = A^{\complement}$.

If $S$ is a binary relation, let $S^{\mathsf T}$ represent the converse relation, also called the transpose. Then the Schröder rules are

$$Q R \subseteq S \quad\Longleftrightarrow\quad Q^{\mathsf T} \bar{S} \subseteq \bar{R} \quad\Longleftrightarrow\quad \bar{S} R^{\mathsf T} \subseteq \bar{Q}.$$

Verbally, one equivalence can be obtained from another: select the first or second factor and transpose it; then complement the other two relations and permute them.[5]: 15–19

Though this transformation of an inclusion of a composition of relations was detailed by Ernst Schröder, in fact Augustus De Morgan first articulated the transformation as Theorem K in 1860.

With Schröder rules and complementation one can solve for an unknown relation $X$ in relation inclusions such as $R X \subseteq S$ and $X R \subseteq S$. For instance, the Schröder rule gives $R X \subseteq S \Leftrightarrow R^{\mathsf T} \bar{S} \subseteq \bar{X}$, and complementation gives $X \subseteq \overline{R^{\mathsf T} \bar{S}}$, which is called the left residual of $S$ by $R$.
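On small finite sets the Schröder rules and the left-residual property can be checked by brute force; the sketch below uses illustrative three-element domains and random relations (none of it from the source):

```python
# Brute-force check of the Schröder rules and of the left residual on random finite relations.
import itertools, random

A, B, C = range(3), range(3), range(3)
random.seed(0)

def compose(Q, R):
    """Q;R = {(a, c) : there is b with (a, b) in Q and (b, c) in R}."""
    return {(a, c) for (a, b1) in Q for (b2, c) in R if b1 == b2}

def converse(R):
    return {(b, a) for (a, b) in R}

def complement(R, X, Y):
    return set(itertools.product(X, Y)) - R

def random_relation(X, Y):
    return {p for p in itertools.product(X, Y) if random.random() < 0.5}

for _ in range(500):
    Q, R, S = random_relation(A, B), random_relation(B, C), random_relation(A, C)
    # Q;R included in S  <=>  Q^T;(compl S) included in compl R  <=>  (compl S);R^T included in compl Q
    lhs = compose(Q, R) <= S
    mid = compose(converse(Q), complement(S, A, C)) <= complement(R, B, C)
    rhs = compose(complement(S, A, C), converse(R)) <= complement(Q, A, B)
    assert lhs == mid == rhs

    # Left residual of S by Q: the greatest X with Q;X included in S.
    left_residual = complement(compose(converse(Q), complement(S, A, C)), B, C)
    assert compose(Q, left_residual) <= S
    X = random_relation(B, C)
    if compose(Q, X) <= S:
        assert X <= left_residual
```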

Just as composition of relations is a type of multiplication resulting in a product, so some operations compare to division and produce quotients.

The left residual of two relations is defined presuming that they have the same domain (source), and the right residual presumes the same codomain (range, target).

The symmetric quotient presumes two relations share a domain and a codomain.
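In the notation above (overbar for complement, $^{\mathsf T}$ for converse), a standard formulation of these three quotients, following Schmidt's conventions, is

$$A \backslash B := \overline{A^{\mathsf T}\,\overline{B}}, \qquad D / C := \overline{\overline{D}\,C^{\mathsf T}}, \qquad \operatorname{syq}(A, B) := \overline{A^{\mathsf T}\,\overline{B}} \cap \overline{\overline{A}^{\mathsf T}\,B},$$

where $A \backslash B$ is the left residual, $D / C$ the right residual, and $\operatorname{syq}(A, B)$ the symmetric quotient.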

The usual composition of two binary relations as defined here can be obtained by taking their join, leading to a ternary relation, followed by a projection that removes the middle component.
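A minimal sketch of this join-then-project construction, with made-up relations:

```python
# Composition as a join followed by a projection:
# join R and S on the shared middle component to get a ternary relation,
# then project away the middle component.

R = {("a1", "b1"), ("a1", "b2"), ("a2", "b2")}
S = {("b1", "c1"), ("b2", "c2")}

# Natural join on the middle component: a ternary relation.
joined = {(a, b, c) for (a, b) in R for (b2, c) in S if b == b2}

# Projection removing the middle component gives the usual composition R;S.
composition = {(a, c) for (a, b, c) in joined}

print(sorted(composition))   # [('a1', 'c1'), ('a1', 'c2'), ('a2', 'c2')]
```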