One of my co-workers forwarded me the following allegory:
THE THREE CORPORATE LESSONS
LESSON NUMBER ONE
A crow was sitting on a tree, doing nothing all day. A small rabbit saw the crow and asked him, “Can I also sit like you and do nothing all day long?” The crow answered: “Sure, why not.” So, the rabbit sat on the ground below the crow, and rested. All of a sudden, a fox appeared, jumped on the rabbit and ate it.
Moral of the story: To be sitting and doing nothing, you must be sitting very, very high up.
LESSON NUMBER TWO
A turkey was chatting with a bull. “I would love to be able to get to the top of that tree,” sighed the turkey, “but I haven't got the energy.” “Well, why don't you nibble on some of my droppings?” replied the bull. “They're packed with nutrients.” The turkey pecked at a lump of dung and found that it actually gave him enough strength to reach the first branch of the tree. The next day, after eating some more dung, he reached the second branch. Finally, after a fortnight, there he was, proudly perched at the top of the tree. But he was promptly spotted by a farmer, who shot the turkey out of the tree.
Moral of the story: Bullshit might get you to the top, but it won't keep you there.
LESSON NUMBER THREE
A little bird was flying south for the winter. It was so cold, the bird froze and fell to the ground in a large field. While it was lying there, a cow came by and dropped some dung on it. As the frozen bird lay there in the pile of cow dung, it began to realize how warm it was. The dung was actually thawing him out! He lay there all warm and happy, and soon began singing for joy.
A passing cat heard the bird singing and came to investigate. Following the sound, the cat discovered the bird under the pile of cow dung, and promptly dug him out and ate him!
The morals of this story are:
- ① Not everyone who drops shit on you is your enemy.
- ② Not everyone who gets you out of shit is your friend.
- ③ And when you're in deep shit, keep your mouth shut.
I haven't read such a fat allegory for years. A little search on google.com showed that it is apparently circulating on the net.
Allegory is a powerful device for nailing a point. Like simile, like analogy, like figures of speech, it makes you see something that is otherwise hard to see, or that you refuse to see. It's kinda like a trap. You start to read with amusement about animals and their affairs, but by the end of the story some moral you don't want to hear dawns on you and seizes you by force.
However, just like analogies, there is a problem with them: they have absolutely nothing to do with facts or truths. Even though their palpability pushes your buttons, they actually prove nothing. Like, you don't see math proofs littered with analogies.
Looking at the above parable, we could ask: “Can cows and bulls and shit really prove something about the modern corporate environment?” Of course, you won't seriously consider asking that if you are not a turkey.
In Erik Naggum's last message, he relied on an analogy to unix shells and DOS to propound his belief that letting a single name have multiple meanings in a computer language is advantageous. By analogy, I'm using an allegory to illustrate the vacuity of his method of persuasion.
By the way, the unix shell's system of environment variables and command lookup is quite a f*cked up one. It is amazing to see its stupidities alluded to as an advance in some language design argument. The whole morbid prospect of placing an executable script under any program name in any path, with the f*cked up way of searching for programs to execute and the f*cked up way of determining whether a file is a program by the f*cked up permission-bits system, is one giant unpurgeable shit pile arisen from the ad hoc hacks of unixism. 〔➤ Unix Pipe as Functional Language〕
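For readers unfamiliar with the mechanics being cursed at here, the two rules that make a unix file eligible as a command can be seen directly. A minimal sketch, assuming a POSIX-ish shell with a writable temp directory; the script name `hello` is invented for illustration:

```shell
#!/bin/sh
# Unix deems a file eligible as a command only if (a) its execute
# permission bit is set and (b) its directory is on the PATH search list.
dir=$(mktemp -d)
printf '#!/bin/sh\necho hello\n' > "$dir/hello"

# (a) no execute bit yet: invoking the file fails
"$dir/hello" 2>/dev/null || echo "no execute bit: not runnable"
chmod +x "$dir/hello"         # set the permission bit
"$dir/hello"                  # prints: hello

# (b) its directory is not on PATH yet: the bare name is not a command
hello 2>/dev/null || echo "not in PATH: not found"
PATH="$dir:$PATH"             # add its directory to the search path
hello                         # now found by PATH lookup; prints: hello

rm -rf "$dir"
```

Whether one finds this scheme elegant or a shit pile, these are the two checks (permission bit, PATH search) that Naggum's argument below leans on.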
In defense of Common Lisp's meaning-space problems, Erik Naggum has a favorite analogy: that people have no problem dealing with English words that are both noun and verb.
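For what it's worth, the two-namespace behavior Naggum appeals to can be reproduced inside the shell itself, since the shell keeps functions (commands) and variables in separate namespaces. A minimal sketch, assuming bash or any POSIX shell; the name `status` is made up for illustration:

```shell
#!/bin/sh
# A variable and a function may share one name without clashing:
# the first word of a command line is resolved in the function/command
# namespace, while $name is resolved in the variable namespace.
status="idle"                 # the VARIABLE named "status"
status() { echo "running"; }  # the FUNCTION named "status"

echo "$status"   # variable lookup -> prints: idle
status           # command lookup  -> prints: running
```

This is the shell-level analogue of a Lisp-2 such as Common Lisp, where one symbol can simultaneously name a function and a variable, and position in the form decides which meaning applies.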
This is similar to Larry Wall's habit of citing de facto human languages to defend the status quo as design merit; contriving that Perl is such-and-such finely “designed” because English is this and that. (Kent Pitman falls in the same pit.)
In the last 100 years or so, we have made tremendous advances in AI-related sciences: logic, computer science, language theories, cognitive psychology, neuroscience, unimaginable mountains of discrete mathematics. Only in the last 60 years or so were we human beings _able_ to conceive and _build_ constructed languages like loglan. We do not yet begin to have much info on how a specially constructed language like loglan/lojban can affect human thinking as a native language. 〔➤ Xah's lojban Tutorial〕 The Larry Wall type of moron seems to have already decided that status quo natural languages like English are superior, or else has no faculty for imagination.
This line of moronicity is typical of sightless visionaries. They see the present, and they deduce that it is the best of all possible worlds, and they pout and cry that the present state of things is the best state of things and must be pervasively maintained and guarded. They take an active stance to smite down those mathematicians who cater for tomorrow, and who brought them today's common sense yesterday.
Open the book of history, and we shall see that when irrational numbers were discovered and introduced, there were insistent naysayers. When the Arabic numeral system was introduced, these naysayers we encounter again. When new calendars were introduced, again these morons. When machinery was introduced, we had the Luddites. When contraceptives were introduced, we had the Christians. When negative numbers were introduced, when “imaginary numbers” were introduced, when set theory was introduced, when non-Euclidean geometry was introduced, when typewriters were introduced, when computational mathematics was introduced, when functional programming was introduced… these f*cking naysaying morons are the fighters against progress, fighting to keep the world at a standstill in their complacency or ignorance.
The fact is, unless the world is filled with these morons in totality, scientific advances and new concepts and technologies are inevitable, only a matter of time. Concepts such as Scheme's single namespace, or pure and non-strict functional languages, or other advanced ideas with a superior mathematical basis, will mature and prevail. First-generation legacies and fads like C, Common Lisp, and Perl will die. Like a force of nature: inevitable, only a matter of time.
The key to intellectual progress is science. The fuel to all sciences is mathematics.
There is a difference between science and pseudo-science. Alonzo Church's stuff, for example, is the former. Larry Wall's stuff is the latter. Larry Wall's crime is that he trumpets his pseudo-science as science, using humor as his mask.
From: Erik Naggum
Organization: Naggum Software, Oslo, Norway
Newsgroups: comp.lang.lisp
Date: 09 Mar 2001 08:05:55 +0000
Subject: Re: Separate namespaces [was: Re: please tell me the design faults]

* Bruce Hoult 〔email@example.com〕
│ As someone coming to the discussion from a background of using C++ and
│ Java and other similar languages I'm just trying to find out what the
│ big advantage of having two namespaces is. The only languages I've
│ previously used with multiple namespaces are C (for structs, and C++ has
│ backtracked from that) and Perl (which has about half a dozen
│ namespaces).

Do you use an environment where you can give commands to a shell? Have you noticed that the first word of a command is treated differently than all the other words? It is looked for as internal commands to the shell, and it is searched for in directories in a PATH variable of some kind, in case you are unfamiliar with it. In the MS-DOS world, the name of a command is searched for with a particular extension (type). In the Unix world, the file so named would have to have the execute bit set and . would have to be in the search path for the file to be eligible as a command, but normally, neither of these conditions are met. In both cases, this means that you can name a file in your local directory the same as the command, and there will be no confusion about which is command and which is local file. I hope this is so simple you can understand that you are already using, and accepting, an environment with two namespaces.

That you can name your files anything you want and not affect the execution of any scripts or other programs that may invoke other programs that may be called the same by accident is a big win. That you can change to a different directory and not be suprised by trojan horses there just because a file is named the same as a command is a big win. (One prank pulled on ignorant students at the time people thought . in $PATH was convenient _and_ safe, was to place an executable file named "ls" in directories that others would likely snoop in.)

Now, can you _imagine_ why anyone would want to name files in a local directory accidentally the same as some command someplace in the search list and blithely expect the command to work and the file to be seen as a simple file? Perhaps the fact that you don't have full control over the growth of the command namespace can be a clue. You _don't_ want a file you have had lying around for years to inhibit you from using a new program. Perhaps just the freedom to name files as you like is enough of a value for people that it would be an undue burden to make certain that you don't make a command unavailable.

I suppose I'm wasting my time, again, being as you are so dense that you don't see anything that looks like clues to see why a namespace for functions different from variables makes sense, but it is a result of the desire for scalability at all levels. In particular, in Common Lisp we don't want to change the meaning of a function in some _other_ package just because its symbol is accessible in our package by using it as a variable. We even ensure that we don't step on other packages' symbols by using *foo* for global variables and foo for functions, so they split the one _symbol_ namespace amongst them, as well. All of this is very carefully thought out and the practice of Common Lisp is very different from languages where scalability is an after-thought.

I suppose you'll dismiss this as irrelevant, again, being as you are so amazingly stupid to believe in omniscience and people knowing _exactly_ this and _exactly_ that, but maybe, just _maybe_, there's a remnant of working brain that might make you realize that you have been using a system with just this separation of functions from variables all along, and the reason it is like that is that it scales better than any other approach, and it gives you freedom from worry that you nuke commands by naming your files what you think is best.

│ So in the end it doesn't appear to provide any compelling advantage or
│ disadvantage. People who use CL seem happy enough with it. People who
│ use other languages don't seem to feel any burning desire to have it.

You're mistaken about the last part, and the first part is simply a statement of your staggering desire to remain ignorant, nothing else.

│ There being no significant difference between the two, I'd also choose
│ the simpler concept -- a single namespace and evaluation rule -- for any
│ new language I happened to be involved with designing.

Then implement this in your shell or other command processor and let us know how comfortable you are with it after a while. Search the current working directory first, and don't exclude files without an execute bit under Unix and look for files regardless of file type under MS-DOS. If we can take your above paragraph as an indication, you would actually design a shell or command processor that made no distinction between files at all and would happily try to execute non-executable files. Now, can you _imagine_ why Unix has execute bits and MS-DOS .EXE and the like? Does it make _any_ sense to you to try to distinguish functions from variables in the file system? Do you _really_ think it's that different from programming languages that _nothing_ can be learned from the need for scalability and convenience and a shot at _security_ in shells?

If I had to deal with a computer that did not allow me to call a file "cat" because there was a command by that name, I'd consider it broken as designed, and that's exactly what I feel about Scheme and Dylan and other retarded languages that conflate the function namespace with the variable namespace. Sometimes, I think you one-namespace guys are just plain idiots, but it's probably a cultural thing -- you don't know any better because you never saw any better. That would be OK, but when you're as stupid as to _refuse_ to _listen_ to people who know a better way, it's no longer a cultural thing, it's stupidity by choice.

And yes, I actually _do_ think of the Lisps I use as shells. I live _in_ Emacs and _in_ Allegro CL as well as _in_ the Unix shell (bash). All of them enforce a separation of functions/commands from variables/data files.

Oh, I come from a Unix/C background. The first time I was annoyed by the conflated namespaces in C was when I couldn't call a variable "time" when I wanted to call "time" in the same function. That was in 1980, mere weeks after I first used a Unix system. It was intuitively evident then and it has remained so, that functions and variables are different. That I hadn't run into the problem before is sheer luck, plus I had used a few systems where it was naturally a difference so I wouldn't have noticed it if I had "exploited" the difference. My experience leads me to believe that one-namespace-ness is a learned thing, an acquired braindamage.

#:Erik
--
"Hope is contagious" -- American Cancer Society
"Despair is more contagious" -- British Farmers Society
The above is originally posted to comp.lang.lisp.
Subject: Re: Separate namespaces
From: Xah Lee
Newsgroups: comp.lang.lisp
Date: Mon, 19 Mar 2001 13:47:33 GMT
https://groups.google.com/group/comp.lang.lisp/msg/7453b340d08fce71