Hardware Modeled (Von Neumann) Computer Languages and Function-Level Languages

By Xah Lee. Date:

Ok, here are some things I learned from today's Wikipedia reading.

One is the term: “Von Neumann programming languages”.

Effectively, that means C, C++, Java, sh, Perl, and the rest of that endless pile of garbage.

In short, “Von Neumann programming languages” are those modeled on computer hardware, namely storage and control flow. (Thus the bunch of garbage such as memory addresses, pointers, file handles, and lisp's cons.)
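To see what “modeled on hardware” means concretely, here is a minimal sketch of the two traits, a storage cell plus step-by-step control flow, written in Haskell only because one language can then serve for all the examples in this article (my own illustration, not from any source quoted here):

import Data.IORef (newIORef, modifyIORef', readIORef)
import Control.Monad (forM_)

-- Von Neumann style: an explicit storage cell (the IORef) is mutated
-- step by step under a control-flow loop.
sumImperative :: [Int] -> IO Int
sumImperative xs = do
  acc <- newIORef 0                        -- allocate a storage cell
  forM_ xs $ \x -> modifyIORef' acc (+ x)  -- loop: update the cell in sequence
  readIORef acc                            -- read the final contents

main :: IO ()
main = print =<< sumImperative [1 .. 10]   -- prints 55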

I find the discovery of this term very illuminating. I used to use pages of words to describe the low-level-ness these stupid langs exhibit. The term “Von Neumann programming language” gives a concise, quotable insight into why it is so. (For an example where I used pages of words to describe the problems of this type of lang, see: Jargons and High Level Languages.)

Another term similar to “Von Neumann programming language” is “imperative programming”, though it captures the stupidity of these langs from a different perspective. “Von Neumann programming language” is the better concept because it hints at the origin of the stupidities in this class of langs.

The opposite of Von Neumann programming languages would be “Mathematics modeled languages”. Lisp, Haskell, Mathematica, APL, Prolog, etc. are in this class (in general called declarative programming languages, in contrast to imperative programming).
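For contrast, here is the same sum from the earlier sketch in the declarative style: the result is stated as an equation over the whole list, with no storage cells and no update order (again just my own Haskell sketch, Haskell being one language of this class):

-- Declarative style: the result is described, not computed step by step.
sumDeclarative :: [Int] -> Int
sumDeclarative = foldr (+) 0

main :: IO ()
main = print (sumDeclarative [1 .. 10])  -- prints 55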

A more communicative term for “Von Neumann programming language” would be “computer hardware modeled language”, but of course that loses the buzzword quality.

Let's start throwing the jargon “von Neumann programming language” around in C, C++, Java, Perl, etc. forums. Seriously. It'll benefit the programming community on the education front by raising awareness.

Function-Level Programming Languages

I also learned today about what's called Function-level programming.

It's hard to summarize in one sentence… but basically it's like functional programming, with one characteristic formalism that sets it apart, namely: the creation of functions is limited to a particular set of higher-order functions, and you cannot arbitrarily birth functions (contrast, for example, the moronicity of lisp's macros).
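A rough taste of the style, sketched in Haskell (Haskell does not actually enforce the restriction, so take this only as an imitation of Backus' FP, not the real thing): the new function is assembled entirely from existing functions and a few higher-order forms, and no lambda ever names an argument.

-- Function-level (point-free) sketch: built only by combining existing
-- functions with higher-order forms ((.), map, sum); no explicit arguments.
sumOfSquares :: [Integer] -> Integer
sumOfSquares = sum . map (^ 2)

main :: IO ()
main = print (sumOfSquares [1, 2, 3, 4])  -- prints 30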

The force of this particular formalism is that it makes the language more subject to mathematical analysis (and thus more powerful and flexible), similar to, for example, the clear separation of features between second order logic and first order logic. Wikipedia said it best, quote:

This restriction means that functions in FP are a module (generated by the built-in functions) over the algebra of functional forms, and are thus algebraically tractable. For instance, the general question of equality of two functions is equivalent to the halting problem, and is undecidable, but equality of two functions in FP is just equality in the algebra, and thus (Backus imagines) easier.

Even today, many users of lambda style languages often misinterpret Backus' function-level approach as a restrictive variant of the lambda style, which is a de facto value-level style. In fact, Backus would not have disagreed with the 'restrictive' accusation: he argued that it was precisely due to such restrictions that a well-formed mathematical space could arise, in a manner analogous to the way structured programming limits programming to a restricted version of all the control-flow possibilities available in plain, unrestricted unstructured programs.
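To make the “algebra of programs” a bit concrete, here is one well-known law, map fusion, sketched in Haskell (my own illustration, not from the quote; the spot check below is of course not a proof): two functions are shown equal by algebraic rewriting, map f . map g = map (f . g), rather than by reasoning about execution steps.

-- Map fusion: map f . map g = map (f . g). Equality argued in the algebra,
-- then spot-checked on one input.
lhs, rhs :: [Int] -> [Int]
lhs = map (+ 1) . map (* 2)
rhs = map ((+ 1) . (* 2))

main :: IO ()
main = print (lhs [1, 2, 3] == rhs [1, 2, 3])  -- prints True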