Backus on Von Neumann at the Turing Award

Can Programming Be Liberated from the von Neumann Style?

In a previous post, we talked about Von Neumann's Turing-complete model for a computer: a CPU, some Memory, and I/O. It was beautiful, but it had its disadvantages, and an unfortunate legacy. Today, John Backus will lead our journey through these matters.

Enter Sandman: Backus

32 years later, a thorough critique was delivered, ironically, by none other than the pioneer John Backus himself. Backus, the guy who developed the formal notation for context-free grammars known as the Backus-Naur Form, the guy who led the development of the first compiler in history, that of Fortran, a language deeply rooted in the Von Neumann model, for Von Neumann machines. During his renowned Turing Award lecture, a prize given for those very merits, he ironically slashed that same model.

The paper, later published as "Can Programming Be Liberated from the von Neumann Style?", is a must-read for any programmer wannabe. It's so brilliant that it's hard for me to say anything about it other than: go read it yourself. Seriously, read it.

OK, but I'll give it a try nonetheless. As an abbreviation, {p.} will hereby denote "paraphrasing Backus again", and italics denote literal quotation. I guess you get it.

The Obesity

Let me ask you: which language do you program in, or feel most confident with? Java? C++? JS? Do you think you know everything about that language? Would you be able to build a parser for it? Answer the most intricate questions about it? Have you even read the standard documentation? Do you know how big those standards are?

Let me give you some examples:

  • Fortran 66: 39 pages!
  • C++: the first official standard, C++98, weighs in at 776 pages. The big revision, C++11: 1368 pages.
  • Java 6: 684 pages. Java 8: 778 pages.

The first thing that Backus notes is how languages have been following this terrible trend of obesity, growing ever more enormous, but not stronger.

The Bottleneck

Why is this happening? {p.}[§3] When we consider that a Von Neumann machine consists of shuttling words back and forth between the CPU and Memory, and that a large part of this traffic is not data but names of data, it's no wonder we can call this communication channel a bottleneck. Backus argues [§4] that conventional languages are basically "high level, complex versions of the Von Neumann computer". This is not surprising at all when you think about it: "V.N. languages use variables to imitate the computer's storage cells, control statements (conditionals and loops) elaborate its jump and test instructions, and assignment statements imitate its fetching, storing, and arithmetic."

And the assignment statement is the bottleneck of these languages: [cci]a = b + 2[/cci] means: fetch [cci]b[/cci] from memory, load the constant 2 from the instruction, add them, and store the result back to memory in [cci]a[/cci]'s place. A typical program essentially consists, {p.} [§4, p.616], of a series of assignments to be executed sequentially, and the main concern of a program is handling this flow of words through the bottleneck.
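To make that traffic concrete, here is a minimal sketch of what the single assignment conceptually costs; the [cci]memory[/cci] array and the addresses [cci]A[/cci] and [cci]B[/cci] are my illustration of storage cells, not real machine code:

[cc_cpp]
// a = b + 2, spelled out as word-at-a-time traffic through the bottleneck.
int memory[1024];           // illustrative stand-in for the machine's storage
const int A = 0, B = 1;     // illustrative addresses for a and b

void assign() {
    int word = memory[B];   // fetch b's word through the bottleneck
    word = word + 2;        // the constant 2 travels inside the instruction itself
    memory[A] = word;       // push the result back through the bottleneck into a's cell
}
[/cc_cpp]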

The Disorderly World

Assignments have many disadvantages. Giving names to values means names must be carefully scoped and hidden to avoid namespace cluttering, name collisions, aliasing, and whatnot. It also {p.} splits programming into two worlds. To the right of the assignment, an orderly world of algebraic properties; to the left, a disorderly world of statements. Let's take Backus's example:

A comparison of an inner product

An inner product in a C-like language:
[cc_cpp]
// assumes arrays a[n] and b[n] of length n are in scope
int c = 0;
for (int i = 0; i < n; i++) {
    c = c + a[i] * b[i];
}
[/cc_cpp]

Or in Backus's pseudo-language, with modern notation:

[cc]
innerProduct = (fold +) . (map [fn (x,y) -> x * y]) . zip
[/cc]

The Von Neumann version is the one we all know. It's repetitive, and one must mentally execute it to understand it. It is concerned with how the product is calculated; the functional version, instead, describes what an inner product is, by composing other functional pieces. 1

Some big remarks are in order here, so following Backus's style, I'll enumerate them:

  1. C operates on an invisible state, pulling and pumping words through the Von Neumann bottleneck one word at a time, while F operates exclusively on whole conceptual units, its arguments.
  2. C does not construct complex entities out of simpler ones, and it names its arguments: apart from the hurdles mentioned before, becoming generic requires a procedural declaration; F is obviously a generic composition of smaller pieces (see the sketch after this list).
  3. C is not pure: the state is merely a glorified global variable, and side effects might be noticed from other threads, even during intermediate steps; F is pure, with no complex transition rules and no side effects: data has no names or addresses that could be leaked.
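For what it's worth, mainstream languages have since absorbed part of this lesson. Here is a minimal sketch of F's style in modern C++, assuming C++17; [cci]std::transform_reduce[/cci] plays the role of the fold-map-zip composition, and the name [cci]inner_product_fp[/cci] is mine:

[cc_cpp]
#include <numeric>
#include <vector>

// Inner product as one composed reduction, with no visible loop or state:
// transform_reduce pairs a and b element-wise with *, then folds with +.
double inner_product_fp(const std::vector<double>& a,
                        const std::vector<double>& b) {
    return std::transform_reduce(a.begin(), a.end(), b.begin(), 0.0);
}
[/cc_cpp]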

The Problem

And it gets worse when we realise that {p.} this process of obesity has made language development much less exciting, turning it into a world of big manuals rather than one of innovative ideas, and diverting all creativity to the fattening of established languages rather than to the exploration of new ideas. Hardware is developed for these languages because they have the biggest market share; languages are developed for this hardware for the very same reason.

These languages don't even have any useful mathematical properties to reason with. Aliasing and impurity open the door to incredibly complex specifications and optimizations I'll talk about in the future; and then again they require Byzantine specifications where the semantics are limited to operational ones: it's all about specifying a transition function for any given term. Programs can't be proven correct if there's no algebraic relation or logical deduction to lean on.

The Hope

Backus, as a good researcher, actually spends most of his paper drafting an alternative. His critiques and propositions are among the biggest advertisements functional programming ever got, and the topic has evolved a lot since then; that's why I'm not spending time on the alternatives in proportion to the problem: I will indeed talk about the whole topic from a modern perspective quite a lot. But I won't go away without at least outlining Backus's core ideas.

In [§11–14], Backus presents his new language, called FP. The presentation is divided into four pieces: an informal functional system, an algebra of functional programs, a formal system, and an applicative state transition system. The functional system is made of objects (data), functions (mappings from objects to objects), functional forms (mappings from functions to functions), a notion of function application, and a set of predefined, pre-named functions (projections of a tuple, head and tail of a list, identity, length, fold, Boolean operators, append) that anybody acquainted with functional programming will recognize. Some standard functional forms are composition, tupling, map, fold, zip, condition (possibly undefining), and a while construct (but no mention of a fixed-point operator! 2). This model has semantics decoupled from the state: evaluation, much closer to the $latex \lambda$-calculus model, consists of symbol replacement and equational expansions and reductions, and Backus proves in [§11.4.3] that any other system containing this one would be just as powerful, hence proving that FP is complete.
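To give a taste of what a functional form is, here is a rough sketch of two of them, composition and construction, as plain C++ lambdas; the names [cci]compose[/cci] and [cci]construct[/cci] are mine, not FP's syntax:

[cc_cpp]
#include <utility>

// A functional form maps functions to functions: it takes f and g and
// returns a brand new function, with no state anywhere in sight.
auto compose = [](auto f, auto g) {
    return [=](auto x) { return f(g(x)); };                     // (f . g):x = f:(g:x)
};
auto construct = [](auto f, auto g) {
    return [=](auto x) { return std::make_pair(f(x), g(x)); }; // [f,g]:x = <f:x, g:x>
};
[/cc_cpp]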

Backus then goes and proves, much in the style of how modern languages would prove it using constructs from Category Theory, but without knowing anything about Category Theory, certain algebraic properties of this language: a distribution law relating tupling and composition ($latex [f, g] \circ h \equiv [f \circ h, g \circ h]$), properties of the predefined functions, and more complex theorems of recursion and expansion, which Backus proves quite formally. In [§12.10], he remarks on some open questions for research in this topic.
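As a taste of how these proofs go, the distribution law above can be checked by applying both sides to an arbitrary object $latex x$, writing application with Backus's colon:

$latex ([f, g] \circ h):x = [f, g]:(h:x) = \langle f:(h:x), g:(h:x) \rangle = \langle (f \circ h):x, (g \circ h):x \rangle = [f \circ h, g \circ h]:x$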

At last, an Applicative State Transition system is described to model history-sensitivity: programs need to have state nevertheless, and this state needs to be remembered, that is, visible in a subsequent execution of the program. But Backus argues semantics need not model small-step state transitions: rather, a state can be computed with big-step semantics: "A new state is computed along with output and replaces the old state when output is issued".
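In other words, the whole program becomes one pure function from (old state, input) to (new state, output). A minimal sketch, with hypothetical [cci]State[/cci], [cci]Output[/cci] and [cci]step[/cci] names of my own:

[cc_cpp]
#include <string>
#include <utility>

struct State  { int counter; };         // everything the program must remember
struct Output { std::string message; };

// One big-step transition: consume the whole old state and the input,
// produce the whole new state and the output. The old state is replaced
// only when the output is issued; no intermediate state is ever visible.
std::pair<State, Output> step(State old, int input) {
    State next{old.counter + input};
    return {next, Output{"count is now " + std::to_string(next.counter)}};
}
[/cc_cpp]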

The Summary

At the expense of not going into much detail on the answers Backus gives, I want to highlight the questions he opens in this exceptional paper. For anybody working anywhere close to computer science, this paper is an eye-opener, a good reminder of how important it is to question everything, and not to take the establishment for granted.

To me, it opens a whole new question: the relationship between hardware and languages. I had been deeply interested in languages for a long time, but it was this paper that motivated me to write about computers publicly, when I realised that, in the end, languages are often modelled on top of an abstract machine, and rarely decoupled from it, as they should be.

Bear with me as I explore these ideas of abstract machines and hardware development, not to mention Programming Language Theory along with mathematical constructs such as Type Theory and Category Theory.


  1. In my revision of Backus's version, zip is a function that takes two lists and returns a list of pairs (the element-wise pairs of elements of the input lists); map is a function that takes a function and a list, and returns the new list made of applying the function to each element of the input list; and fold is a function that takes a function and a list, and returns the result of applying the function to the elements of the list serially, accumulating a value.
  2. I'll go through the fixed-point operator someday in the future.
