Composition of relations

In the mathematics of binary relations, the composition of relations is the forming of a new binary relation R; S from two given binary relations R and S. In the calculus of relations, the composition of relations is called relative multiplication, and its result is called a relative product. Function composition is the special case of composition of relations where all relations involved are functions.

The word uncle indicates a compound relation: for a person to be an uncle, he must be the brother of a parent. In algebraic logic it is said that the relation of Uncle ($$x U z$$) is the composition of relations "is a brother of" ($$x B y$$) and "is a parent of" ($$y P z$$). $$U = BP \quad \text{ is equivalent to: } \quad xByPz \text{ if and only if } xUz.$$

Beginning with Augustus De Morgan, the traditional form of reasoning by syllogism has been subsumed by relational logical expressions and their composition.

Definition
If $$R \subseteq X \times Y$$ and $$S \subseteq Y \times Z$$ are two binary relations, then their composition $$R; S$$ is the relation $$R; S = \{(x,z) \in X \times Z : \text{ there exists } y \in Y \text{ such that } (x,y) \in R \text{ and } (y,z) \in S\}.$$

In other words, $$R; S \subseteq X \times Z$$ is defined by the rule that says $$(x,z) \in R; S$$ if and only if there is an element $$y \in Y$$ such that $$x\,R\,y\,S\,z$$ (that is, $$(x,y) \in R$$ and $$(y,z) \in S$$).
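As a concrete illustration, relations over finite sets can be modeled as sets of ordered pairs; the following sketch uses a helper name (`compose`) chosen for illustration, not taken from any library:

```python
# Binary relations over finite sets, modeled as Python sets of ordered pairs.
def compose(R, S):
    # R ; S = {(x, z) : there exists y with (x, y) in R and (y, z) in S}
    return {(x, z) for (x, y) in R for (y2, z) in S if y == y2}

# The "uncle" example: brother-of composed with parent-of gives uncle-of.
B = {("Bob", "Alice")}    # Bob is a brother of Alice
P = {("Alice", "Carol")}  # Alice is a parent of Carol
U = compose(B, P)         # Bob is an uncle of Carol
```

The comprehension mirrors the definition directly: it searches for a witness $$y$$ linking the two pairs.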

Notational variations
The semicolon as an infix notation for composition of relations dates back to Ernst Schröder's textbook of 1895. Gunther Schmidt has renewed the use of the semicolon, particularly in Relational Mathematics (2011). The use of semicolon coincides with the notation for function composition used (mostly by computer scientists) in category theory, as well as the notation for dynamic conjunction within linguistic dynamic semantics.

A small circle $$(R \circ S)$$ has been used for the infix notation of composition of relations by John M. Howie in his books considering semigroups of relations. However, the small circle is widely used to represent composition of functions $$g(f(x)) = (g \circ f)(x)$$ which reverses the text sequence from the operation sequence. The small circle was used in the introductory pages of Graphs and Relations until it was dropped in favor of juxtaposition (no infix notation). Since juxtaposition $$(RS)$$ is commonly used in algebra to signify multiplication, it can likewise signify relative multiplication.

Further with the circle notation, subscripts may be used. Some authors prefer to write $$\circ_l$$ and $$\circ_r$$ explicitly when necessary, depending on whether the left or the right relation is the first one applied. A further variation encountered in computer science is the Z notation: $$\circ$$ is used to denote the traditional (right) composition, while left composition is denoted by a fat semicolon. The Unicode symbols are ⨾ and ⨟.

Mathematical generalizations
Binary relations $$R \subseteq X\times Y$$ are morphisms $$R : X\to Y$$ in the category $$\mathsf{Rel}$$. In Rel the objects are sets, the morphisms are binary relations and the composition of morphisms is exactly composition of relations as defined above. The category Set of sets and functions is a subcategory of $$\mathsf{Rel}$$ where the maps $$X\to Y$$ are functions $$f:X\to Y$$.

Given a regular category $$\mathbb{X}$$, its category of internal relations $$\mathsf{Rel}(\mathbb{X})$$ has the same objects as $$\mathbb{X}$$, but now the morphisms $$X\to Y$$ are given by subobjects $$R\subseteq X\times Y$$ in $$\mathbb{X}$$. Formally, these are jointly monic spans between $$X$$ and $$Y$$. Categories of internal relations are allegories. In particular $$\mathsf{Rel}(\mathsf{Set})\cong \mathsf{Rel}$$. Given a field $$k$$ (or more generally a principal ideal domain), the category of relations internal to matrices over $$k$$, $$\mathsf{Rel}(\mathsf{Mat}(k))$$, has as morphisms $$n\to m$$ the linear subspaces $$R \subseteq k^n\oplus k^m$$. The category of linear relations over the finite field $$\mathbb{F}_2$$ is isomorphic to the phase-free qubit ZX-calculus modulo scalars.

Properties

 * Composition of relations is associative: $$R;(S;T) = (R;S);T.$$
 * The converse relation of $$R \, ; S$$ is $$(R \, ; S)^\textsf{T} = S^{\textsf{T}} \, ; R^{\textsf{T}}.$$ This property makes the set of all binary relations on a set a semigroup with involution.
 * The composition of (partial) functions (that is, functional relations) is again a (partial) function.
 * If $$R$$ and $$S$$ are injective, then $$R \, ; S$$ is injective, which conversely implies only the injectivity of $$R.$$
 * If $$R$$ and $$S$$ are surjective, then $$R \, ; S$$ is surjective, which conversely implies only the surjectivity of $$S.$$
 * The set of binary relations on a set $$X$$ (that is, relations from $$X$$ to $$X$$) together with (left or right) relation composition forms a monoid with zero, where the identity map on $$X$$ is the neutral element, and the empty set is the zero element.
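The associativity and converse laws above can be checked empirically on small random relations (a sketch, with helper names of my own choosing):

```python
from itertools import product
import random

def compose(R, S):
    return {(x, z) for (x, y) in R for (y2, z) in S if y == y2}

def converse(R):
    return {(y, x) for (x, y) in R}

random.seed(0)
V = range(4)
pairs = list(product(V, V))
R = set(random.sample(pairs, 6))
S = set(random.sample(pairs, 6))
T = set(random.sample(pairs, 6))

# Associativity: R ; (S ; T) = (R ; S) ; T
assert compose(R, compose(S, T)) == compose(compose(R, S), T)
# Converse of a composition: (R ; S)^T = S^T ; R^T
assert converse(compose(R, S)) == compose(converse(S), converse(R))
```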

Composition in terms of matrices
Finite binary relations are represented by logical matrices. The entries of these matrices are either zero or one, depending on whether the relation represented is false or true for the row and column corresponding to compared objects. Working with such matrices involves the Boolean arithmetic with $$1 + 1 = 1$$ and $$1 \times 1 = 1.$$ An entry in the matrix product of two logical matrices will be 1, then, only if the row and column multiplied have a corresponding 1. Thus the logical matrix of a composition of relations can be found by computing the matrix product of the matrices representing the factors of the composition. "Matrices constitute a method for computing the conclusions traditionally drawn by means of hypothetical syllogisms and sorites."
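A minimal sketch of this Boolean matrix product (the function name is assumed for illustration):

```python
def bool_mat_mult(M, N):
    # Entry (i, k) of the product is 1 exactly when some j has
    # M[i][j] = 1 and N[j][k] = 1, using Boolean arithmetic
    # where 1 + 1 = 1 and 1 * 1 = 1.
    return [[int(any(M[i][j] and N[j][k] for j in range(len(N))))
             for k in range(len(N[0]))]
            for i in range(len(M))]

M = [[1, 0],
     [1, 1]]
N = [[0, 1],
     [1, 0]]
# bool_mat_mult(M, N) gives [[0, 1], [1, 1]]
```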

Heterogeneous relations
Consider a heterogeneous relation $$R \subseteq A \times B;$$ that is, where $$A$$ and $$B$$ are distinct sets. Then using composition of relation $$R$$ with its converse $$R^\textsf{T},$$ there are homogeneous relations $$R R^\textsf{T}$$ (on $$A$$) and $$R^\textsf{T} R$$ (on $$B$$).

If for all $$x \in A$$ there exists some $$y \in B,$$ such that $$x R y$$ (that is, $$R$$ is a (left-)total relation), then for all $$x, x R R^\textsf{T} x$$ so that $$R R^\textsf{T}$$ is a reflexive relation or $$I \subseteq R R^\textsf{T}$$ where I is the identity relation $$\{(x,x) : x \in A\}.$$ Similarly, if $$R$$ is a surjective relation then $$R^\textsf{T} R \supseteq I = \{(x,x) : x \in B\}.$$ In this case $$R \subseteq R R^\textsf{T} R.$$ The opposite inclusion occurs for a difunctional relation.
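These inclusions can be verified on a small example (helper names are illustrative):

```python
def compose(R, S):
    return {(x, z) for (x, y) in R for (y2, z) in S if y == y2}

def converse(R):
    return {(y, x) for (x, y) in R}

A = {1, 2, 3}
B = {"a", "b"}
R = {(1, "a"), (2, "a"), (3, "b")}  # total: every element of A relates to something

RRT = compose(R, converse(R))
# Totality of R gives reflexivity of R R^T on A:
assert {(x, x) for x in A} <= RRT
# R is also surjective onto B here, so R^T R contains the identity on B:
assert {(b, b) for b in B} <= compose(converse(R), R)
# and consequently R is contained in R R^T R:
assert R <= compose(RRT, R)
```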

The composition $$\bar{R}^\textsf{T} R $$ is used to distinguish relations of Ferrers type, which satisfy $$R \bar{R}^\textsf{T} R = R.$$

Example
Let $$A = $$ { France, Germany, Italy, Switzerland } and $$B = $$ { French, German, Italian } with the relation $$R$$ given by $$a R b$$ when $$b$$ is a national language of $$a.$$ Since both $$A$$ and $$B$$ are finite, $$R$$ can be represented by a logical matrix, assuming rows (top to bottom) and columns (left to right) are ordered alphabetically: $$\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\  0 & 0 & 1 \\  1 & 1 & 1 \end{pmatrix}.$$

The converse relation $$R^\textsf{T}$$ corresponds to the transposed matrix, and the relation composition $$R^\textsf{T}; R$$ corresponds to the matrix product $$R^\textsf{T} R$$ when summation is implemented by logical disjunction. It turns out that the $$3 \times 3$$ matrix $$R^\textsf{T} R$$ contains a 1 at every position, while the reversed matrix product computes as: $$R R^\textsf{T} = \begin{pmatrix} 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 1 \\  0 & 0 & 1 & 1 \\  1 & 1 & 1 & 1 \end{pmatrix}.$$ This matrix is symmetric, and represents a homogeneous relation on $$A.$$

Correspondingly, $$R^\textsf{T} \, ; R$$ is the universal relation on $$B,$$ hence any two languages share a nation where they both are spoken (in fact: Switzerland). Vice versa, the question whether two given nations share a language can be answered using $$R \, ; R^\textsf{T}.$$
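The whole example can be reproduced with Boolean matrix arithmetic (a self-contained sketch; function names are chosen for illustration):

```python
def bool_mat_mult(M, N):
    # Boolean matrix product: OR over j of (M[i][j] AND N[j][k]).
    return [[int(any(M[i][j] and N[j][k] for j in range(len(N))))
             for k in range(len(N[0]))]
            for i in range(len(M))]

def transpose(M):
    return [list(row) for row in zip(*M)]

# Rows: France, Germany, Italy, Switzerland; columns: French, German, Italian.
R = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, 1],
     [1, 1, 1]]
RT = transpose(R)

# R^T R is the universal relation on the three languages:
assert bool_mat_mult(RT, R) == [[1, 1, 1], [1, 1, 1], [1, 1, 1]]

# R R^T records which nations share a language:
assert bool_mat_mult(R, RT) == [[1, 0, 0, 1],
                                [0, 1, 0, 1],
                                [0, 0, 1, 1],
                                [1, 1, 1, 1]]
```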

Schröder rules
For a given set $$V,$$ the collection of all binary relations on $$V$$ forms a Boolean lattice ordered by inclusion $$(\subseteq).$$ Recall that complementation reverses inclusion: $$A \subseteq B \text{ implies } B^{\complement} \subseteq A^{\complement}.$$ In the calculus of relations it is common to represent the complement of a set by an overbar: $$\bar{A} = A^{\complement}.$$

If $$S$$ is a binary relation, let $$S^\textsf{T}$$ represent the converse relation, also called the transpose. Then the Schröder rules are $$Q R \subseteq S \quad \text{ is equivalent to } \quad Q^\textsf{T} \bar{S} \subseteq \bar{R} \quad \text{ is equivalent to } \quad \bar{S} R^\textsf{T} \subseteq \bar{Q}.$$ Verbally, one equivalence can be obtained from another: select the first or second factor and transpose it; then complement the other two relations and permute them.

Though this transformation of an inclusion of a composition of relations was detailed by Ernst Schröder, in fact Augustus De Morgan first articulated the transformation as Theorem K in 1860. He wrote $$L M \subseteq N \text{ implies } \bar{N} M^\textsf{T} \subseteq \bar{L}.$$
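The three-way equivalence of the Schröder rules can be checked empirically on random relations over a small universe (a sketch with illustrative helper names):

```python
from itertools import product
import random

def compose(R, S):
    return {(x, z) for (x, y) in R for (y2, z) in S if y == y2}

def converse(R):
    return {(y, x) for (x, y) in R}

V = range(3)
full = set(product(V, V))

def complement(R):
    return full - R

random.seed(0)
for _ in range(200):
    Q = {p for p in full if random.random() < 0.4}
    R = {p for p in full if random.random() < 0.4}
    S = {p for p in full if random.random() < 0.4}
    a = compose(Q, R) <= S                                    # Q R ⊆ S
    b = compose(converse(Q), complement(S)) <= complement(R)  # Q^T S̄ ⊆ R̄
    c = compose(complement(S), converse(R)) <= complement(Q)  # S̄ R^T ⊆ Q̄
    assert a == b == c
```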

With Schröder rules and complementation one can solve for an unknown relation $$X$$ in relation inclusions such as $$R X \subseteq S \quad \text{and} \quad XR \subseteq S.$$ For instance, by Schröder rule $$R X \subseteq S \text{ implies } R^\textsf{T} \bar{S} \subseteq \bar{X},$$ and complementation gives $$X \subseteq \overline{R^\textsf{T} \bar{S}},$$ which is called the left residual of $$S$$ by $$R$$.

Quotients
Just as composition of relations is a type of multiplication resulting in a product, so some operations compare to division and produce quotients. Three quotients are exhibited here: left residual, right residual, and symmetric quotient. The left residual of two relations is defined presuming that they have the same domain (source), and the right residual presumes the same codomain (range, target). The symmetric quotient presumes two relations share a domain and a codomain.

Definitions:
 * Left residual: $$A\backslash B \mathrel{:=} \overline{A^\textsf{T} \bar{B} }$$
 * Right residual: $$D/C \mathrel{:=} \overline{\bar{D} C^\textsf{T}}$$
 * Symmetric quotient: $$\operatorname{syq} (E, F) \mathrel{:=} \overline{E^\textsf{T} \bar{F} } \cap \overline{\bar{E}^\textsf{T} F}$$

Using Schröder's rules, $$A X \subseteq B$$ is equivalent to $$X \subseteq A \backslash B.$$ Thus the left residual is the greatest relation satisfying $$A X \subseteq B.$$ Similarly, the inclusion $$Y C \subseteq D$$ is equivalent to $$Y \subseteq D / C,$$ and the right residual is the greatest relation satisfying $$Y C \subseteq D.$$
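On a finite universe the left residual can be computed directly from the formula above, and its maximality verified exhaustively (a sketch; helper names are assumptions):

```python
from itertools import product

def compose(R, S):
    return {(x, z) for (x, y) in R for (y2, z) in S if y == y2}

def converse(R):
    return {(y, x) for (x, y) in R}

V = range(3)
full = set(product(V, V))

def complement(R):
    return full - R

def left_residual(A, B):
    # A \ B = complement(A^T ; complement(B))
    return complement(compose(converse(A), complement(B)))

A = {(0, 0), (1, 0), (1, 1)}
B = {(0, 2), (1, 2)}
X = left_residual(A, B)

# X satisfies A ; X ⊆ B ...
assert compose(A, X) <= B
# ... and no strictly larger relation does:
for extra in full - X:
    assert not (compose(A, X | {extra}) <= B)
```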

One can practice the logic of residuals with Sudoku.

Join: another form of composition
A fork operator $$(<)$$ has been introduced to fuse two relations $$c : H \to A$$ and $$d : H \to B$$ into $$c \,(<)\, d : H \to A \times B.$$ The construction depends on projections $$a : A \times B \to A$$ and $$b : A \times B \to B,$$ understood as relations, meaning that there are converse relations $$a^{\textsf{T}}$$ and $$b^{\textsf{T}}.$$ Then the fork of $$c$$ and $$d$$ is given by $$c\,(<)\,d ~\mathrel{:=}~ c ;a^\textsf{T} \cap\ d ;b^\textsf{T}.$$

Another form of composition of relations, which applies to general $$n$$-place relations for $$n \geq 2,$$ is the join operation of relational algebra. The usual composition of two binary relations as defined here can be obtained by taking their join, leading to a ternary relation, followed by a projection that removes the middle component. For example, the query language SQL provides this operation as its join.