There is a proliferation of computer languages today like never before. On this page, I list some of them: languages that were created after 2000, or became very active after 2000.
Lisp family or similar:
ML/OCaml derived Proof systems in industrial use:
〔☛ State of Theorem Proving Systems 2008〕
Modern Functional languages:
Perl family or derivative:
On Java Virtual Machine:
2D graphics related:
Following are some random comments on these languages.
In the above, I tried not to list implementations. (⁖ the huge number of Scheme implementations on the JVM, with fluffs here and there; also ⁖ JPython, JRuby, and quite a lot more.) I also tried to avoid minor derivatives or variations, and languages that are one man's fancy with little following.
In the above, I tried to list only “new” languages that were born, or have seen high activity or awareness, after 2000. Without this criterion, there are quite a few staples that still have a significant user base. ⁖ APL, Fortran, Cobol, Forth, Logo (many variants), Pascal (Ada, Modula, Delphi). And others that are among today's top 10 most popular languages: C++, Visual Basic.
The user bases of these languages differ by orders of magnitude. Some, such as PHP and C#, are among the top 10 most popular languages by active users. Others are niche but still have a sizable user base, such as LSL, Erlang, Mathematica. Others are niche but robust and industrial (counting academia), such as Coq (a proof system), Processing, PLT Scheme, AutoLISP. A few are mostly academic, followed by a handful of researchers or experimenters; Qi, Arc, Mercury, Q, and Concurrent Clean are probably examples.
For those of you who are developers of Java, Perl, or Python, for example, it would be fruitful to spend an hour or two looking at the Wikipedia articles about these languages, or their home pages. Wikipedia has several pages that list computer languages, where you can read about perhaps over two hundred of them.
I was prompted to scan these new languages because recently I wrote an article titled Fundamental Problems of Lisp, which mentioned my impression of a proliferation of languages (and all sorts of computing tools and applications). Quote:
In general, creating a language is relatively easy in comparison to equivalent-sized programming tasks in the industry (such as, for example, writing a robust signal processing lib, a web server (⁖ video web server), a web app framework, a game engine, etc.). Computing tasks typically have a goal, where all sorts of complexities and nitty-gritty details arise in the coding process. Creating a language is often simply based on an individual's creativity, without many fixed constraints, much as in painting or sculpting. Many languages that have become popular in fact arose this way. Popularly known examples include Perl, Python, Ruby, Perl6, Arc. Creating a language does require the skill of writing a compiler, which isn't trivial, but today with the mega proliferation of tools, even the need for compiler-writing skill is reduced. (⁖ Arc, various languages on the JVM. (Even 10 years ago, writing a parser was mostly not required, due to existing tools such as lex/yacc.))
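To give a feel for how little code a working toy language needs, here is a minimal sketch in Python: a prefix-notation arithmetic evaluator. This is purely illustrative and not any of the languages discussed above.

```python
# A toy "language": a prefix-notation arithmetic evaluator,
# e.g. "(+ 1 (* 2 3))" evaluates to 7.
# Sketch for illustration only; not any real language mentioned here.

def tokenize(src):
    """Split source text into parentheses and atoms."""
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    """Recursive-descent parse of one expression from the token list."""
    tok = tokens.pop(0)
    if tok == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(parse(tokens))
        tokens.pop(0)  # drop the closing ")"
        return expr
    try:
        return int(tok)
    except ValueError:
        return tok  # an operator symbol

OPS = {"+": lambda a, b: a + b,
       "-": lambda a, b: a - b,
       "*": lambda a, b: a * b}

def evaluate(expr):
    """Evaluate a parsed expression tree."""
    if isinstance(expr, int):
        return expr
    op, left, right = expr
    return OPS[op](evaluate(left), evaluate(right))

print(evaluate(parse(tokenize("(+ 1 (* 2 3))"))))  # prints 7
```

About 30 lines gives a tokenizer, parser, and evaluator; scaling this toy into a robust language with error reporting, a standard library, and tooling is where the real work lies.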
Some languages are created to solve an immediate problem or need. Mathematica, Adobe Flash's ActionScript, Emacs Lisp, and LSL would be good examples. Some are created as computer science research byproducts, usually using or resulting in a new computing model. Lisp, Prolog, SmallTalk, Haskell, Qi, Concurrent Clean are of this type.
Looking at some tens of languages, one might think there is some unifying factor, some unifying theory or model, that limits the potential creations to a small set of types, classes, or models. With influence from Stephen Wolfram's book “A New Kind of Science” 〔☛ Notes on A New Kind of Science〕, I think this is not so. That is to say, different languages are potentially endless, and each can become quite useful, important, or gain a sizable user base. In other words, I think there is no theoretical basis that governs which languages will be popular due to their technical/mathematical properties. Perhaps another way to phrase this imprecise thought is that languages will keep proliferating: even if we don't count languages created on one man's fancy, there will probably be a forever birth of languages, and they will all be useful or solve some niche problem, because there is no theoretical or technical reason that sometime in the future there would be one language that can fittingly solve all computing problems.
Also, the possibilities of a language's syntax are basically unlimited, even with the constraint that it be practical and human readable. So any joe can potentially create a new syntax. The syntaxes of existing languages, compared against the number of all potentially possible (human readable) syntaxes, are probably a very small fraction. That is to say, even with so many existing languages today, with their wildly differing syntaxes, we are probably just seeing a few pixels on a computer screen.
Also note that all languages mentioned here are plain-text, linear ones. Spreadsheets and visual programming languages would be examples of 2D syntax… but I haven't thought about how they can be classified as syntax. (Nor do I fully understand the ontology of syntax.)
Just some extempore thoughts.
New programming languages.
Discovered a new programming language: Factor. See: Point Free Programing.
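Factor is a concatenative language, where point-free (tacit) style is the norm: programs are built by composing words, without ever naming the data being passed along. Here is a rough sketch of the idea in Python; the `compose` helper and the Factor snippet in the comment are my own illustration, not Factor's actual library.

```python
from functools import reduce

# Point-free (tacit) style: build new functions by composing existing ones,
# never naming the data they operate on. A rough Python approximation of
# the style used in concatenative languages like Factor.

def compose(*fns):
    """Compose functions left to right, pipeline style."""
    return lambda x: reduce(lambda acc, f: f(acc), fns, x)

# In Factor one might write something like:  [ sq ] map sum
# Here is the equivalent pipeline, with no explicit argument threading:
square_each = lambda xs: [x * x for x in xs]
sum_of_squares = compose(square_each, sum)

print(sum_of_squares([1, 2, 3]))  # prints 14
```

The payoff of the style is that programs read as a left-to-right pipeline of transformations, which is exactly how data flows on Factor's stack.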
There are endless languages. Just discovered Fantom.
Also, there's this scathing review of Scala by someone who seems to be a Java platform expert.
Scala feels like EJB 2, and other thoughts By Stephen Colebourne. @ blog.joda.org…
I don't know the Java platform well, not having worked in the industry for many years, but I tend to agree with his views.
Google created Google Dart. Microsoft created TypeScript. See: Microsoft's TypeScript Will Kill CoffeeScript and Dart!.
Then, in ≈, asm.js was created. Home page at http://asmjs.org/.
Here are some blog posts about it:
Just learned there's a new lang: Rust, developed by Mozilla Labs. Started in 2010. Quote from Wikipedia:
Rust is an experimental, concurrent, multi-paradigm, compiled programming language developed by Mozilla Labs. It is designed to be practical, supporting pure-functional, concurrent-actor, imperative-procedural, and object-oriented styles.
The lead developer is Graydon Hoare, who began work on the system in 2006; Mozilla became involved in 2009, and officially unveiled the language for the first time at Mozilla Summit 2010. In 2010, work shifted from the initial compiler, written in OCaml, to the self-hosted compiler written in Rust itself. It successfully compiled itself the following year. The self-hosted compiler uses LLVM as its backend.
Julia is a new lang for scientific computing. New around 2011: http://julialang.org/
It's a competitor to number crunchers R, MATLAB, and others.
Interestingly, the Julia site shows benchmarks where it is an order of magnitude faster than similar scripting languages. Julia's speed seems close to optimized C code, which is a stated goal of the language design.
A lang based on Parsing Expression Grammar (PEG), called OMeta, by Alessandro Warth.
OMeta: an Object-Oriented Language for Pattern Matching By Alessandro Warth. @ tinlizzie.org…
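To give a feel for what a parsing expression grammar is, here is a minimal sketch of PEG-style combinators in Python. The helper names (`lit`, `seq`, `choice`) are invented for illustration; this is not OMeta's API.

```python
# A minimal sketch of parsing-expression-grammar combinators.
# Helper names are invented for illustration; this is not OMeta's API.
# A parser takes (text, pos) and returns the new pos, or None on failure.

def lit(s):
    """Match a literal string at the current position."""
    def p(text, pos):
        return pos + len(s) if text.startswith(s, pos) else None
    return p

def seq(*parsers):
    """Ordered sequence: every part must match in turn."""
    def p(text, pos):
        for parser in parsers:
            pos = parser(text, pos)
            if pos is None:
                return None
        return pos
    return p

def choice(*parsers):
    """Prioritized choice: the first alternative that matches wins."""
    def p(text, pos):
        for parser in parsers:
            r = parser(text, pos)
            if r is not None:
                return r
        return None
    return p

# Grammar:  greeting <- ("hi" / "hello") " world"
greeting = seq(choice(lit("hi"), lit("hello")), lit(" world"))

print(greeting("hello world", 0))  # matched through position 11
print(greeting("hey world", 0))    # no match: None
```

The distinguishing feature of PEGs over context-free grammars is that choice is prioritized (ordered), so a PEG is never ambiguous: the first alternative that matches is committed to.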
Tom Novelli provided the following insights and links:
Yeah, that seems promising. There's actually a long history behind it; see for example 〔Pragmatic Parsing in Common Lisp By Henry G Baker, Nimble Computer Corporation. @ home.pipeline.com…〕 I found a copy of Val Schorre's META-II paper from 50 years ago… and some followup work throughout the 1960s (⁖ TREE-META)… these guys were in Douglas Engelbart's group. Then apparently the military took it over and made it classified, probably ruined it with bureaucracy rather than developing it into awesome top-secret technology, heh ☺
Learned about Elm (programming language). http://elm-lang.org/