It was devised by Richard Bird and Lambert Meertens as part of their work within IFIP Working Group 2.1.
It is sometimes referred to in publications as BMF, as a nod to Backus–Naur form.
Facetiously it is also referred to as Squiggol, as a nod to ALGOL, which was also in the remit of WG 2.1, and because of the "squiggly" symbols it uses.
A less-used variant name, but actually the first one suggested, is SQUIGOL.
Martin and Nipkow provided automated support for Squiggol development proofs, using the Larch Prover.[1]

Map is a well-known second-order function that applies a given function to every element of a list; in BMF, it is written ∗:

f ∗ [e₁, e₂, …, eₙ] = [f e₁, f e₂, …, f eₙ]

Likewise, reduce is a function that collapses a list into a single value by repeated application of a binary operator ⊕; it is written /:

⊕ / [e₁, e₂, …, eₙ] = e₁ ⊕ e₂ ⊕ ⋯ ⊕ eₙ

Using those two operators, together with the primitives + (ordinary addition) and ++ (list concatenation), we can easily express the sum of all elements of a list and the flatten function, in point-free style, as

sum = + /
flatten = ++ /

Writing ∘ for functional composition and ∧ for conjunction, it is just as easy to write a function testing that all elements of a list satisfy a predicate p, simply as

all p = (∧ /) ∘ (p ∗)
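These operators correspond directly to familiar functional-programming combinators. A minimal Haskell sketch (modelling BMF's reduce with foldr and an explicit unit element for the empty list, which is an added assumption; the primed names are our own):

```haskell
-- BMF's map (written f *) is Haskell's map; BMF's reduce (written ⊕ /)
-- is modelled here with foldr, given an explicit unit for the empty list.
reduce :: (a -> a -> a) -> a -> [a] -> a
reduce = foldr

-- sum = + /
sum' :: [Int] -> Int
sum' = reduce (+) 0

-- flatten = ++ /
flatten :: [[a]] -> [a]
flatten = reduce (++) []

-- all p = (∧ /) ∘ (p ∗)
allTrue :: (a -> Bool) -> [a] -> Bool
allTrue p = reduce (&&) True . map p
```

Note how the last definition is point-free: the predicate is mapped over the list, and the resulting booleans are reduced with conjunction.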
Bird (1989) transforms inefficient, easy-to-understand expressions ("specifications") into efficient, involved expressions ("programs") by algebraic manipulation.
The expression

max ∘ map sum ∘ segs

is an almost literal translation of the maximum segment sum problem,[6] but running that functional program on a list of size n takes time O(n³). From this, Bird computes an equivalent functional program that runs in time O(n); it is in fact a functional version of Kadane's algorithm.
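Both ends of the derivation can be rendered as executable Haskell. In the sketch below, segs is defined via the standard inits and tails functions, and the names mssSpec and mssFast are our own:

```haskell
import Data.List (inits, tails)

-- Specification: enumerate every contiguous segment, sum each,
-- take the maximum -- O(n^3) on a list of length n.
segs :: [Int] -> [[Int]]
segs = concatMap inits . tails

mssSpec :: [Int] -> Int
mssSpec = maximum . map sum . segs

-- Derived program (Kadane's algorithm): one left-to-right pass, O(n).
mssFast :: [Int] -> Int
mssFast = maximum . scanl (\acc x -> max 0 (acc + x)) 0
```

Both versions admit the empty segment (segs includes [], whose sum is 0), so the result is never negative.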
The derivation is shown in the picture, with computational complexities[7] given in blue, and law applications indicated in red.
The example instances of the laws use lists of integers, with addition, subtraction, and multiplication.
The notation in Bird's paper differs from that used above; for example, the functions inits and tails compute the list of all prefixes and the list of all suffixes of their argument, respectively.
In the example instances, lists are colored by nesting depth; in some cases, new operations are defined ad hoc (grey boxes).
A function h on lists is called a list homomorphism if there exists an associative binary operator ⊕ and a neutral element e such that

h [ ] = e
h (l ++ m) = h l ⊕ h m

The homomorphism lemma states that h is a homomorphism if and only if there exist an operator ⊕ and a function f such that h = (⊕ /) ∘ (f ∗).
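For instance, sum is a list homomorphism, with (+) as the associative operator and 0 as the neutral element. A quick Haskell check (the helper name is illustrative):

```haskell
-- 'sum' is a list homomorphism: splitting the list at any point and
-- combining the two partial sums with (+) preserves the result.
respectsSplit :: [Int] -> [Int] -> Bool
respectsSplit l m = sum (l ++ m) == sum l + sum m
```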
A point of great interest for this lemma is its application to the derivation of highly parallel implementations of computations.
Thus for any list homomorphism h, there exists a parallel implementation.
That implementation cuts the list into chunks, which are assigned to different computers; each computes the result on its own chunk.
Only those partial results travel over the network, and they are finally combined into a single value.
In any application where the list is enormous and the result is a very simple type – say an integer – the benefits of parallelisation are considerable.
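The chunked scheme can be simulated sequentially. The sketch below (with sum standing in for the homomorphism and chunksOf a hand-rolled helper, both our assumptions) shows that the per-chunk results combine to the same answer as a single sequential pass:

```haskell
-- Split a list into chunks of size n; each chunk models the portion
-- of the list assigned to one machine.
chunksOf :: Int -> [a] -> [[a]]
chunksOf _ [] = []
chunksOf n xs = take n xs : chunksOf n (drop n xs)

-- Each "machine" computes sum on its own chunk; only those partial
-- results are then combined with the associative operator (+).
chunkedSum :: Int -> [Int] -> Int
chunkedSum n = foldr (+) 0 . map sum . chunksOf n
```

Associativity of the operator is what guarantees that chunkedSum n xs equals sum xs for any chunk size n, so the chunk size can be chosen freely to balance load across machines.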