Bird–Meertens formalism

The Bird–Meertens formalism (BMF) is a calculus for deriving programs from program specifications (in a functional programming setting) by a process of equational reasoning. It was devised by Richard Bird and Lambert Meertens as part of their work within IFIP Working Group 2.1.

It is sometimes referred to in publications as BMF, as a nod to Backus–Naur form. Facetiously it is also referred to as Squiggol, as a nod to ALGOL, which was also in the remit of WG 2.1, and because of the "squiggly" symbols it uses. A less-used variant name, but actually the first one suggested, is SQUIGOL. Martin and Nipkow provided automated support for Squiggol development proofs, using the Larch Prover.

Basic examples and notations
Map is a well-known second-order function that applies a given function to every element of a list; in BMF, it is written $$*$$:


 * $$f*[e_1,\dots,e_n] = [f\ e_1,\dots,f\ e_n].$$
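As a concrete illustration, the postfix `*` corresponds exactly to the standard `map` function of a language like Haskell; the name `star` below is ours, chosen only to mirror the BMF notation:

```haskell
-- BMF's f* : apply f to every element of a list (the standard map).
star :: (a -> b) -> [a] -> [b]
star _ []       = []
star f (x : xs) = f x : star f xs
```

For example, `star (+1) [1,2,3]` evaluates to `[2,3,4]`.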

Likewise, reduce is a function that collapses a list into a single value by repeated application of a binary operator. It is written / in BMF. Taking $$\oplus$$ as a suitable binary operator with neutral element e, we have


 * $$\oplus / [e_1,\dots,e_n] = e \oplus e_1 \oplus \dots \oplus e_n.$$
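In Haskell terms, `⊕/` is a fold over the list with the operator and its neutral element; the name `reduce` below is ours. When `op` is associative with neutral element `e`, folding from the left or from the right yields the same result:

```haskell
-- BMF's oplus/ : collapse a list with a binary operator op
-- whose neutral element is e.
reduce :: (a -> a -> a) -> a -> [a] -> a
reduce _  e []       = e
reduce op e (x : xs) = x `op` reduce op e xs
```

For example, `reduce (+) 0 [1,2,3,4]` evaluates to `10`.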

Using those two operators and the primitives $$+$$ (as the usual addition), and $$+\!\!\!+$$ (for list concatenation), we can easily express the sum of all elements of a list, and the flatten function, as $${\rm sum} = + /$$ and $${\rm flatten} = +\!\!\!+ /$$, in point-free style. We have:


 * $${\rm sum}\ [e_1,\dots,e_n] = + / [e_1,\dots,e_n] = 0 +  e_1 + \dots+ e_n = \sum_k e_k.$$
 * $${\rm flatten}\ [l_1,\dots,l_n] =+\!\!\!+ / [l_1,\dots,l_n] =  [\,] +\!\!\!+\;  l_1 +\!\!\!+ \dots+\!\!\!+\; l_n = \text{ the concatenation of all lists } l_k.$$
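The two point-free definitions translate directly into Haskell, with `foldr` playing the role of `/` (the primed name `sum'` merely avoids clashing with the Prelude's `sum`):

```haskell
-- sum = +/ : reduce with addition, neutral element 0.
sum' :: Num a => [a] -> a
sum' = foldr (+) 0

-- flatten = ++/ : reduce with concatenation, neutral element [].
flatten :: [[a]] -> [a]
flatten = foldr (++) []
```

For example, `flatten [[1],[2,3],[]]` evaluates to `[1,2,3]`.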

Similarly, writing $$\cdot$$ for functional composition and $$\land$$ for conjunction, it is easy to write a function testing that all elements of a list satisfy a predicate p, simply as $${\rm all}\ p = (\land /)\cdot(p*)$$:



$$\begin{align} {\rm all}\ p\ [e_1,\dots,e_n] &= (\land /)\cdot(p*)\ [e_1,\dots,e_n] \\&= \land /(p* [e_1,\dots,e_n]) \\&= \land /[p\ e_1,\dots,p\ e_n] \\&= p\ e_1\land \dots \land p\ e_n \\&= \forall k.\ p\ e_k. \end{align}$$
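The same composition can be written down in Haskell; `allP` is our name for BMF's `all p`, and it behaves like the Prelude's `all`:

```haskell
-- all p = (and/) . (p*) : test whether every element satisfies p.
allP :: (a -> Bool) -> [a] -> Bool
allP p = foldr (&&) True . map p
```

For example, `allP even [2,4,6]` evaluates to `True`, while `allP even [2,3]` evaluates to `False`.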

Bird (1989) transforms inefficient easy-to-understand expressions ("specifications") into efficient involved expressions ("programs") by algebraic manipulation. For example, the specification "$$\mathrm{max} \cdot \mathrm{map} \; \mathrm{sum} \cdot \mathrm{segs}$$" is an almost literal translation of the maximum segment sum problem, but running that functional program on a list of size $$n$$ will take time $$\mathcal{O}(n^3)$$ in general. From this, Bird computes an equivalent functional program that runs in time $$\mathcal{O}(n)$$, and is in fact a functional version of Kadane's algorithm.
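Both endpoints of Bird's derivation can be sketched in Haskell. The specification below enumerates all contiguous segments (taking `segs` as all prefixes of all suffixes, so the empty segment with sum 0 is included); the derived program is the linear-time Kadane form. The names `mssSpec` and `mssFast` are ours:

```haskell
import Data.List (inits, tails)

-- Specification: maximum over the sums of all contiguous segments.
-- Direct execution takes O(n^3) time.
mssSpec :: [Int] -> Int
mssSpec = maximum . map sum . segs
  where segs = concatMap inits . tails

-- Derived program: Kadane's algorithm, O(n) time.
-- scanl carries the best sum of a segment ending at the current position.
mssFast :: [Int] -> Int
mssFast = maximum . scanl step 0
  where step acc x = max 0 (acc + x)
```

Both functions agree on every input; for instance, on `[31,-41,59,26,-53,58,97,-93,-23,84]` both return `187`.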

The derivation is shown in the picture, with computational complexities given in blue and law applications indicated in red. The example instances of the laws use lists of integer numbers, addition, minus, and multiplication. The notation in Bird's paper differs from that used above: $$\mathrm{map}$$, $$\mathrm{concat}$$, and $$\mathrm{foldl}$$ correspond to $$*$$, $$\mathrm{flatten}$$, and a generalized version of $$/$$ above, respectively, while $$\mathrm{inits}$$ and $$\mathrm{tails}$$ compute the list of all prefixes and suffixes of their argument, respectively. As above, function composition is denoted by "$$\cdot$$", which has lowest binding precedence. In the example instances, lists are colored by nesting depth; in some cases, new operations are defined ad hoc (grey boxes).

The homomorphism lemma and its applications to parallel implementations
A function h on lists is called a list homomorphism if there exists an associative binary operator $$\oplus$$ and neutral element $$e$$ such that the following holds:



$$\begin{align} &h\ [\,] &&=\ e \\ &h\ (l +\!\!\!+\; m) &&=\ h\ l \oplus h\ m. \end{align}$$

The homomorphism lemma states that h is a homomorphism if and only if there exists an operator $$\oplus$$ and a function f such that $$h = (\oplus/)\cdot(f*)$$.
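A simple instance of the lemma is the length function: it is a homomorphism with $$\oplus = +$$ and $$e = 0$$, since the length of a concatenation is the sum of the lengths, so by the lemma it can be written as a reduce after a map. In Haskell (the name `lengthHom` is ours):

```haskell
-- length is a list homomorphism:
--   length []         = 0
--   length (l ++ m)   = length l + length m
-- so by the homomorphism lemma, length = (+/) . (const 1 *):
lengthHom :: [a] -> Int
lengthHom = foldr (+) 0 . map (const 1)
```

For example, `lengthHom "abcd"` evaluates to `4`.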

A point of great interest for this lemma is its application to the derivation of highly parallel implementations of computations. Indeed, it is easy to see that $$f*$$ has a highly parallel implementation, and so does $$\oplus/$$, most obviously as a binary tree. Thus for any list homomorphism h, there exists a parallel implementation: the list is cut into chunks, which are assigned to different computers, and each computes the result on its own chunk. Only those partial results travel over the network, to be finally combined into one. In any application where the list is enormous and the result is of a very simple type, say an integer, the benefits of parallelisation are considerable. This is the basis of the map-reduce approach.
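The scheme can be sketched as follows. This is a sequential stand-in for the distributed setting: in a real deployment each chunk would be shipped to a separate worker, and only the per-chunk results would be combined. Correctness rests on `op` being associative with neutral element `e`; the names `chunksOf` and `mapReduce` are ours:

```haskell
-- Split a list into chunks of at most n elements (n > 0).
chunksOf :: Int -> [a] -> [[a]]
chunksOf _ [] = []
chunksOf n xs = take n xs : chunksOf n (drop n xs)

-- Chunked evaluation of the homomorphism (op/) . (f*):
-- each chunk is mapped and reduced independently (conceptually, on its
-- own machine), then the partial results are reduced once more.
mapReduce :: (b -> b -> b) -> b -> (a -> b) -> Int -> [a] -> b
mapReduce op e f k xs =
  foldr op e [ foldr op e (map f chunk) | chunk <- chunksOf k xs ]
```

Because `op` is associative with neutral element `e`, `mapReduce op e f k xs` equals the direct `foldr op e (map f xs)` for any chunk size `k > 0`; for example, `mapReduce (+) 0 (*2) 3 [1..10]` evaluates to `110`.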