m*n ~ {m;n}
That is, there is a correspondence between composition considered as a binary operator and composition considered as a container.
There are so many issues to untangle even in this simple statement. For example, in mathematics, logic and computer science it is common to take on board an ontological stance that there is a category of entities called "syntax" and a category of entities called "semantics". Actually, even this summary is only a caricature, as these categories are really roles, and it is common practice in mathematics, logic and computing to use entities that were clearly in the role of semantics -- in one account of some phenomena -- in the role of syntax in some other account of some other phenomena.
In some sense we are merely looking at two different "syntaxes" for the same "semantics" in considering the correspondence noted above. We have some notion of "combining" (written above as either '*' or '{ -; - }') "things" (written above as 'm' and 'n') and how we write this down is of less importance than our meaning, that is, our notion of combination and the combinations we intend to make with it. i confess i have been hypnotized by this idea on many occasions throughout my life, and am likely to fall under its sway again.
Of course philosophers, even modern ones, have called this ontological outlook into question. i recall being told in a cognitive science course in my undergraduate studies that Stephen Stich put forward the idea that "syntax is semantics". Who knows what he meant, but i think that we can say that in certain kinds of communication acts, especially those that involve computation (and therefore proof) or proof (and therefore computation), our only access to those entities enjoying the role of semantics is through those entities enjoying the role of syntax. We have no access, for example, to Numbers; instead we have access to numerals. Many, many things can play the role of numeral for us: Arabic numerals, Roman numerals, von Neumann numerals, Church numerals, Conway games, ... . To the extent that we can put these numeral systems into correspondence, we can point at something deeper.
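To make the numeral/number distinction concrete, here is a tiny Haskell sketch (the names, and the choice of Haskell, are mine) that treats the Church numerals as one "syntax" for numbers and puts it into correspondence with another, the built-in integers:

{-# LANGUAGE RankNTypes #-}

-- A Church numeral encodes the number n as the operation of
-- applying a function n times: one "syntax" for a number.
type Church = forall a. (a -> a) -> a -> a

zeroC :: Church
zeroC _ z = z

succC :: Church -> Church
succC n s z = s (n s z)

-- Putting two numeral systems into correspondence: Church -> Int.
toInt :: Church -> Int
toInt n = n (+ 1) 0

-- toInt (succC (succC zeroC)) == 2

The correspondence toInt is precisely the "something deeper" being pointed at: it is what survives the change of notation.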
In this sense, i think of syntax as a kind of interface. It's like what people who use iPhones see (and manipulate) when they are using them -- as opposed to what people who manufacture iPhones see when they are building them -- as opposed to what people who design smart telecommunications devices see when they are designing them. Note, however, that this situation is very, very rich and subtle. Designers certainly have users and manufacturers in mind when they "calculate" their designs. Moreover, interfaces can be "layered". One interface can be used to manipulate operations provided by another interface. Likewise, syntaxes can be layered -- which is, in part, what enables this behavior of using "semantic" entities as "syntactic" entities mentioned above. And, of course, since "syntaxes" can be layered, and they are the only means of access to "semantics" -- at least in this limited form of discourse -- it might be "syntaxes all the way down," to coin a phrase.
Be that as it may, i think we can recognize intuitively, but directly, that certain syntaxes give "approach vectors" to the semantics to which they give us access. The Church numerals enable certain ways of thinking about numbers, and make certain other ways of thinking about them more obscure, much in the way that the iPhone's interface makes certain uses of that telecommunication device readily apparent, and others less so. Recognizing this is especially crucial in modern mathematical communications, which make heavy use of overloading conventions to indicate and denote correspondences between different notions.
That's part of what's operating in the correspondence above. The notation m*n is common infix notation for some abstract notion of combination: it could be arithmetic addition or arithmetic multiplication, or it could be the operation of a symmetry group. This notation has traditionally been associated with operations where the combination loses information about the components, as do addition and multiplication. After combining 15 and 4 using arithmetic multiplication, we have 60. There are lots of ways to factor 60 into two components. In point of fact, what we have really lost information about is history. The number denoted by the numeral '60' has a unique prime factorization
60 = 2*2*3*5
but because this operation is associative we have no way of knowing how we arrived at the final combination starting from those factors. To belabor the point,
((2*2)*3)*5 = (2*2)*(3*5) = 2*(2*(3*5)) = 2*((2*3)*5) = (2*(2*3))*5
Each of these different groupings represents a different possible path to -- or a different history for -- arriving at 60 from its prime factors -- without even taking advantage of the fact that multiplication is also commutative. In this sense, '*' is playing the role of a separator in a flat list (erasing history is equivalent to erasing nested groupings). In other words, we could just as easily and readily have written
{2;2;3;5}
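To see in code that erasing history is equivalent to erasing nested groupings, here is a small Haskell sketch (entirely my own framing): evaluating a combination tree with an associative operator and flattening it to a list forget exactly the same information.

-- A binary combination tree records the history of combining.
data Tree = Leaf Integer | Comb Tree Tree

-- Evaluating with an associative operator erases that history...
eval :: Tree -> Integer
eval (Leaf n)   = n
eval (Comb l r) = eval l * eval r

-- ...and so does flattening: '*' as a separator in a flat list.
flatten :: Tree -> [Integer]
flatten (Leaf n)   = [n]
flatten (Comb l r) = flatten l ++ flatten r

-- Two of the five groupings of 2*2*3*5 displayed above:
t1, t2 :: Tree
t1 = Comb (Comb (Comb (Leaf 2) (Leaf 2)) (Leaf 3)) (Leaf 5)
t2 = Comb (Leaf 2) (Comb (Leaf 2) (Comb (Leaf 3) (Leaf 5)))

-- eval t1 == eval t2 == 60 and flatten t1 == flatten t2 == [2,2,3,5]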
Note well, in common mathematical language, if i had used the notation
{2,2,3,5}
this would have overlapped with common notation for sets -- which would have evoked further erasures/equivalences, namely
{2,2,3,5} = {2,3,5} = {2,5,3} = {3,2,5} = {3,5,2} = {5,2,3} = {5,3,2}
The first step would definitely have lost contact with our notion of combination as arithmetic multiplication, while the others would have been in closer correspondence, as multiplication is commutative. Again, trained practitioners of this sort of discourse would remain cautious about invoking those equivalences, but their very training would certainly bring them to mind -- suggest "things to try" in the way that the iPhone's interface suggests things to try so well that a six-year-old can operate it without a manual.
The notation {m;n}, on the other hand, evokes another set of notions and things to try for the computer scientist. Specifically, it is usually associated with a certain kind of combination of programs -- namely, sequential composition: do m, then do n. It also evokes thoughts about scope and binding, but more on that later. It does not carry with it the connotations of losing information about components in quite the same way that the infix notation does. This is in part because (in the monoid of the natural numbers under multiplication) there are two ways to write 60:
- 60
- 2*2*3*5
In the braced notation, by contrast, even when we flatten away the grouping, as in
{m1;m2;m3} = {{m1;m2};m3} = {m1;{m2;m3}}
we don't lose information about the components. Of course, we didn't in arithmetic multiplication, either, but there is an ambient set of connotations suggesting that we have.
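As an aside -- my observation, not part of the correspondence above -- Haskell makes the "do m, then do n" reading of the braced notation quite literal: with explicit layout, do-notation accepts braces and semicolons, and desugars them to an infix compositor.

-- {m; n} as sequential composition, written with braces...
both :: IO ()
both = do { putStrLn "m" ; putStrLn "n" }

-- ...and the same combination written with an infix compositor.
both' :: IO ()
both' = putStrLn "m" >> putStrLn "n"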
What is clear at this stage is that -- despite all the shadowy suggestions of things to try lurking in the "sub" of our consciousness -- both notations really, really denote a kind of container. Composition -- at least the sort of composition intended by these syntaxes -- puts things into containers. From this perspective, both notations have their advantages and disadvantages as one tries to extend their scope of application to different kinds of containers.
Let me come clean and say that having spent some considerable time with Scheme and Lisp and Set Theory, i have a slight personal inclination for the braced notation. i think, however, that there are other, more objective, factors that give it a little bit of an edge over the infix notation for combination.
Chief among these is the power of the notion of monad. The notion of monad is essentially a parametric notion of combination. It provides just the right machinery to vary the notion of composition over a very wide class of "things". To get this powerful sort of information compression, the notion demands that we supply certain things. To simplify discussion i will focus on two of them.
- unit
- mult
It is in the context of mult that our discussion of the abstract(ly) flattened list takes on a deeper significance. Further, we can see a real departure between the infix and the braced notation. The infix notation clearly makes a commitment to a "pre-flattened" expression. One must add some sort of braces to recover a history of term formation. The braced notation makes no such commitment. It is possible to restrict oneself to writing only flat expressions; but one can also write nested expressions. Thus, the notation begs for a clear statement of the relationship between flat and nested expressions. That fits very well with the monad's demand for a specification of the mult procedure -- it is precisely a specification of a canonical way to get from nested expressions to flat ones.
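For concreteness, here is what unit and mult look like for the humble list monad in Haskell (a standard example; the framing in terms of the discussion above is mine). mult is exactly the promised canonical passage from nested expressions to flat ones.

import Control.Monad (join)

-- unit wraps a "thing" in a minimal container...
unit :: a -> [a]
unit x = [x]

-- ...and mult canonically flattens a nested container.
mult :: [[a]] -> [a]
mult = join            -- mult [[2,2],[3,5]] == [2,2,3,5]

-- The monad laws make flattening associative: on a triply nested
-- list, flattening inner-first and outer-first agree:
--   mult (map mult xsss) == mult (mult xsss)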
Both notations support populations of different compositors. We have the very familiar '+' and '*', as well as '&' and '||', in the infix case -- and, of course, if i were more adventurous with my fonts we could easily extend this with all manner of dots and circles and boxes. Personally, i think of these as colors of some basic infix mark, say '*'. Likewise we have at least 4 different colors of braces in the ASCII character set: (,); [,]; {,}; ',' (5 if you consider the double quote separately). The infix notation puts the color where the action is, while a braced notation that keeps the separator constant and varies only the color of the braces pushes the color to the boundaries of the expression. For example, using two colors of braces we can state a distribution law:
[{ a; b };c] = {[a;c];[b;c]}
As we can see, distribution laws are slightly more cumbersome in the braced notation, but not overly so. The braced notation, however, has a major advantage when we begin to reify the notion of color. Here is a makeshift version in EBNF:
e ::= {c| [e[;e]*] |d}
c,d ::= ...
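One way to read that grammar -- a sketch under my own choice of representation -- is as an algebraic datatype in which colors are reified as ordinary data decorating the two braces:

-- Colors become first-class values rather than choices of font.
data Color = Round | Square | Curly | Quote
  deriving (Eq, Show)

-- e ::= {c| [e[;e]*] |d} : a possibly empty, semicolon-separated
-- list of sub-expressions embraced by a pair of reified colors.
data Expr
  = Atom String                 -- leaves, so examples have content
  | Embrace Color [Expr] Color
  deriving Show

-- The left-hand side of the distribution law, [{a;b};c]:
lhs :: Expr
lhs = Embrace Square [Embrace Curly [Atom "a", Atom "b"] Curly, Atom "c"] Square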
Again, being conscious of the suggested menu of things to try lurking in the background of every notation, i suggest that the syntactic overlap with Dirac's bra-ket notation is exactly the sort of thing we want to encourage.
The primary take-away of this post is that composition implies a notion of container. The secondary take-away is that notation supports and suggests different kinds of thinking. This relates to the previous post in the following way. If you look carefully at the structures identified there and then you read this paper by Vincent Blondel, you will see that the structures we independently derive are iso; but the equations interpreting the operations are not. Specifically, i assume the operators to be defined independently of the "syntax" and demand that we have up ($) and down (@) operations between the operators.
It turns out that i read Blondel's paper 10 years ago. i even wrote to him asking him how his ideas related to Conway's ideas. i haven't really thought about it much since then. It was only after i started thinking about the associativity problem implicit in the formulation given in the previous post that i recalled Blondel's results (and even then incorrectly), and only after some considerable googling that i found them. (It turns out that no phrase including 'Knuth', 'binary tree' and 'up arrow notation' yielded a hit. However, 'exponentiation' and 'binary tree' did finally lead me to Blondel's paper.)
More semantically, we can abstractly think of quotation -- the intuition that guided me to the construction in the previous post -- as a kind of 'embrace'; but there are things to try on the quotation menu that are not on the binary tree menu. When considered in context this is really noteworthy, imho. The order of events in the development of my thinking was this:
- π-calculus demands an infinitary supply of atoms -- this is computationally unacceptable;
- the theory of names as quoted terms solves this (sketched in code after this list);
- the π-calculus' 'new' operator is a very compact notation for certain kinds of functor category machinery, which can be repurposed to many other situations; the free monoid on infinitely many generators is only one;
- however, once we repurpose this operator, we can employ the quotation trick to eliminate the 'new' operator;
- lastly, the theory of quotation comes with a beautiful analog of Galois extension that allows us to go considerably beyond the capabilities of the 'new' operator.
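To give these bullets some shape, here is a toy Haskell rendering of names as quoted terms (constructor names mine; a sketch of the idea as described above, not a faithful transcription of any published calculus).

-- A pi-like process calculus in which a name IS a quoted process,
-- so no separate infinitary supply of atomic names is required.
data Proc
  = Null                   -- the inert process
  | Input  Name Name Proc  -- receive on a name, binding a name
  | Output Name Proc       -- send a (quotable) process on a name
  | Par    Proc Proc       -- parallel composition
  | Run    Name            -- "unquote": run the process a name quotes
  deriving (Eq, Show)

-- Quotation as 'embrace': a name is a quoted process.
newtype Name = Quote Proc
  deriving (Eq, Show)

-- Names can be manufactured from terms already at hand -- the move
-- that opens the door to eliminating the 'new' operator.
nameOf :: Proc -> Name
nameOf = Quote

Of course, nothing in this skeleton shows why quoting yields genuinely fresh names; that is the substance of the theory rather than of the datatype.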