eyes black and white

(Lots of ((Irritating, Spurious) (Parentheses)))

Derisive comments are often made about the syntax of Lisp, as witnessed by some reproaches to my previous blog entry. Thus the half-joking, half-serious backronym of Lots of (Insipid | Irritating | Infuriating | Idiotic | ...) and (Spurious | Stubborn | Superfluous | Silly | ...) Parentheses, and accusations that Lisp syntax would make code incomprehensible to read and error-prone to write. I will take exception to this general kind of comment, and I will argue in defense of the Lisp syntax.

Firstly, the nested parenthesized syntax is actually not bad at all, as far as understanding goes. It is actually so simple that anyone can grasp it and master it in 20 minutes. Parentheses might be tricky to balance by hand, but since the 1970s at least, interactive development environments match parentheses visually for you, so you never have to worry about counting them. Moreover, properly indenting code, which may be done automatically by editors, makes code quite readable. Using Lisp with improper user interfaces meant for other languages may be tedious, but this is hardly a technical claim against Lisp.

By comparison, it took me many weeks to initially master the intricacies of the C syntax: the operator precedence, the semi-colon as terminator not separator, the confusion between assignment and comparison, the overloading of the parenthesis and comma syntax (precedence overriding, function calling, or sequencing of expressions?), the misleading (and in C++ ambiguous) similarity between declarations and statements, the trap of nested braceless if-else constructs, the trickiness of declaring type of pointers to functions, the weird limitations of the preprocessor and its macros, etc.

And then, I haven't even started talking about the limitations to the semantics of C, that force you to go through the pains of syntactically and semantically ugly work-arounds. Limitations on scoping, on returning of results, on static or dynamic initialization and finalization, on error handling, etc., induce a lot of error-prone inefficient pointer manipulations and gotos. Not only are multiple results and dynamic bindings/windings simpler for humans to read and write, but with them, compiled Lisp code may often win over compiled C or C++ code thanks to better calling conventions. And yes, this is related to the question of syntax, in as much as the regular nestable syntax of Lisp makes for a natural nestable recursive semantics, whereas the ad-hoc syntax of C makes for non-nestable semantics with an ad-hoc hierarchy of levels.

In the same comment to my blog entry, Lisp syntax is accused of inducing endless levels of nesting. I dispute this notion. I find that typically, the depth of nested structures in Lisp programs is not higher than the depth of a parse tree in typical Perl or C programs; however, the shape of the source trees is indeed different -- and that's the very purpose of having a different syntax: it's not just changing the surface of the syntax, but also its contents. And the contents are such that with the proper macrology, you can get more done for the same amount of manageable complexity: program structure naturally grows until it hits the barrier of human understanding; what changes from one language to another is not this barrier, built into humans, but the amount of useful things you can express within this barrier; and as argued in my previous post, Lisp syntax is a key part of what enables programmers to express more through readily-available metaprogramming practices.

One argument against the Lisp syntax is the alleged redundancy of parentheses and keywords, which makes for low code density per line. However, (1) the semantics of Lisp programs largely makes up for it by needing far fewer lines of code for the same functionality, making the overall code density of Lisp higher than in any other language; (2) Lisp syntax and semantics can be extended and are casually extended with reader macros and macros (if you don't know what they are, you've pretty much missed the whole point of Lisp), so that you can match exactly the code density and readability you want for any specific domain where you type lots of code; and (3) despite the long readable symbol names, completion allows for fast typing, and local bindings allow for short code. If the argument were really about high code density per line, one should not advocate any of the mainstream programming languages; one should much rather advocate APL, TECO, Mathematica, and other such languages that make colloquial programs look much more like line-noise to non-fluent readers than proverbial Perl programs ever will. Actually, APL and its current successors J and K are languages worthy of much respect.

Another argument against Lisp syntax is that its parentheses do not help distinguish between important semantic distinctions within program code. Well, the claim is true, but it's not a valid argument, it's a non-sequitur. Indeed, in Lisp, parentheses indicate code nesting, but not semantic distinctions; that doesn't mean these distinctions cannot be made easily in Lisp; it just means that in Lisp, these distinctions are carried by other means, namely the head symbol of each program form. This is arguably clearer than what happens in the often ambiguous syntactic mishmash of some other languages, where a prefix, postfix or infix operator may completely change the meaning of an expression, so that code must be read carefully and holistically before the precise semantic distinctions can be assessed. In any case, this is another case of wrong-headed arguments about Lisp based on projecting expectations that are only valid within the context of using other languages.

As for what makes parentheses necessary, it isn't, as many people ignorantly claim, a matter of prefix syntax as opposed to infix syntax. It is a matter of fixed arity versus variable arity. Prefix syntax can go wholly without parentheses, when the arity of each operator is known. Actually, the first systematic prefix syntax, the famous Polish Notation of Jan Łukasiewicz, was precisely devised as a way to get rid of parentheses altogether. The postfix variant of this notation, known as Reverse Polish Notation, has found a lot of practical uses, in FORTH, POP-2, HP calculators, PostScript, and many virtual machines. What makes parentheses necessary is the fact that Lisp has a single uniform extensible syntax that needs to accommodate an arbitrary number of arguments in program forms. As I argued before, it wasn't even a conscious design, but a historical discovery, that survived because it has many beneficial aspects. Actually, some Lisp dialects have a definite infix flavor (consider PLASMA). But what makes parentheses necessary is the fact that the same syntax needs to be extensible to variadic functions and forms. The Lisp-derived language LOGO gets rid of parentheses exactly this way: by not having functions of variable arity. There could have been a paren-less sublanguage for simple operators, plus a generic parenthesized syntax for the general case, but in the early days of Lisp, such a disuniformity of syntax couldn't be afforded; and afterwards, it was found that the domains of application of Lisp were so diverse that you could never standardize a specific ad-hoc syntax for every single fixed-arity operator that happened to be in frequent use in one sub-domain or another. Instead, Lisp standardizes on a generic syntax that is immediately and unambiguously readable by any Lisper, even without a priori knowledge of the arity of every operator that appears in a given domain.
Lisp also allows for users to extend the language through reader-macros and macros should they want an ad-hoc syntax for their domain of choice. As for preferring a prefix syntax to a postfix syntax, macros have to be either prefix or parenthesized, and reader macros have to be prefix, so that even postfix languages such as FORTH or PostScript have special prefix syntax, and some equivalent of parentheses, although with semantic limitations (non-nestability, lack of proper scoping).
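The arity point can be made concrete. Below is a minimal sketch in Python (my own illustration, not from the original post): a Polish-notation reader that needs no parentheses, precisely because every operator's arity is fixed and known in advance. The operator table and function names here are assumptions for the example.

```python
# Polish (prefix) notation needs no parentheses when arities are fixed:
# the reader always knows exactly how many arguments to consume.
ARITY = {"+": 2, "-": 2, "*": 2, "neg": 1}

def read_polish(tokens):
    """Read one expression, consuming tokens from the front of the list."""
    tok = tokens.pop(0)
    if tok in ARITY:
        # Consume exactly ARITY[tok] sub-expressions; no delimiter needed.
        return [tok] + [read_polish(tokens) for _ in range(ARITY[tok])]
    return int(tok)

# "+ 1 * 2 3" parses unambiguously as (+ 1 (* 2 3)):
print(read_polish("+ 1 * 2 3".split()))  # ['+', 1, ['*', 2, 3]]
```

A variadic `+`, as in Lisp's `(+ 1 2 3 4)`, has no fixed entry in such a table; some delimiter -- parentheses -- must then mark where the argument list ends.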

I speculate that much of the prejudice against parentheses can be traced back to the ambiguity of parentheses in C and similar syntaxes, and their general use as a marker for unusually complex pieces of code in most languages as well as in mathematical usage. People fluent in such languages who face Lisp code once in a rare while may project such acquired emotions onto what they read, despite these emotions being totally inappropriate in the context of Lisp syntax, where parentheses are a regular piece of structuring syntax, much like braces and semi-colons in C, or whitespace in Python. However, fluency with Lisp syntax is much easier and faster to acquire than fluency with C and C-like syntax. And then the emotion of parentheses in the context of Lisp becomes very similar to the emotion of whitespace in the context of Python.

Now, the same people who criticize the Lisp syntax accept without objection the syntax of the C programming language and similar or worse syntaxes such as those of C++, Java, Perl, etc. These syntaxes are presented as something one just has to learn and get used to, to be part of the elite of real programmers. Any criticism of these syntaxes gets the condescending reply to get over it and to accept the possibly admitted quirks the relevance of which is dismissed. Double standard? Certainly. The reproach concerning the syntax of Lisp is usually made by people who have seen it but never tried to use Lisp for anything meaningful. Those who have actually tried Lisp never criticize its syntax; if they try hard enough, they will soon have many valid reproaches to make to Lisp, but these reproaches all concern its semantics.

In the end, when people attack the syntax of Lisp, it is most usually but a rationalization without any technical merit regarding the syntax itself. This rationalization serves to cover up a defense mechanism against a foreign culture. It's a protective reflex against the cost of having to actually learn something new and different. The problem about Lisp here is not directly related to technical factors, but only to the fact that Lisp culture is not mainstream.

I do not want to deny that there are some technical factors that do influence the way that cultures gain, keep or lose influence. My point is that technical factors play no direct role in the measured reaction of aversion to Lisp syntax. It's just like people used to an alphabet, be it roman, cyrillic, arabic, hebrew, thai, or whatever, who will feel ill at ease when reading texts in a different alphabet that they are not fluent with (and then there are non-alphabetic writing systems such as Chinese characters). This unease doesn't imply any great truth about the relative superiority or inferiority of any writing system and the languages that use it, as compared to other alternatives; nor does it preclude any such superiority or inferiority, in any of infinitely many possible distinct relevant comparison scales.

As for the non-technical factors that enter into play, and that may overwhelm any technical factor in the choice of a programming language (or of anything), I may invoke economic, social and political considerations. These considerations often have their own validity (information costs, transaction costs, network effects, established conventions, etc.), but they often may not. I claim that technically inferior solutions are often promoted based on wrong-headed considerations such as superstitions, obsolete traditions, irrational trends, ungrounded marketing, package-dealing, monopoly pressure, improper education, misevaluation of costs, political games, responsibility-avoidance, rush for what appears as an immediate solution, mindless copying, etc. (Will anyone step up to defend COBOL, FORTRAN or PL/I on intrinsic technical merit?) Does that mean that we must sit down and whine about the sorry state of the world? No. For in a free society, possessing information that others do not possess is an economic advantage, a market opportunity. Now, of course, our societies may suffer a lot from not being free. But in as much as that's a problem worth acting upon, it is an altogether different problem that requires an action of its own; and I contend that we are free enough to leverage the advantage of superior technological knowledge, as far as programming languages are concerned.

Comments

In the end, when people attack the syntax of Lisp, it is most usually but a rationalization without any technical merit regarding the syntax itself. This rationalization serves to cover up a defense mechanism against a foreign culture.

You are making two assumptions: 1) that this is a foreign culture, and 2) that this is all a matter of psychological hang-ups of the "mainstream" programmers.

Well, 1) is basically an accusation of being uneducated, or simply unable to comprehend the obvious superiority of your preferred approach. You are making very unwarranted assumptions about your opponents. I am a specialist in programming languages (among other things), wrote some compilers and interpreters, and consider myself rather familiar with LISP and its numerous progeny (Planner, Scheme, etc). I would reiterate that any person with a university CS education was exposed to LISP, and had a fair chance to learn how to use it. In any case, I do not think that this is an argument which is suitable for a civilized discussion.

The point 2) is contradictory: you are saying that people have psychological problems, after insisting that languages should be compared on technical merits. The problem with LISP is not technical. It is psychological. For the vast majority of practical programmers, LISP is decidedly inconvenient. The inability to come up with some truly useful M-syntax is a part of it. Another part is that human cognition does not deal well with recursion or semantics-based recognition. This is a well-established fact, known to any student of psychology. That's why people prefer "flattened", less compositionally powerful environments, with rich decorative "syntax" - a textbook example is using colorful beer keg handles on the control levers of a nuclear plant. Reduces mistakes and speeds up recognition, you see.

The thrill of coming up with a clever solution for a particular algorithmic need (and LISP is great for expressing novel solutions) is quite familiar to any good programmer. However, when I code something deliverable I, like any seasoned professional, try to avoid doing anything clever, and prefer to use stereotyped, well-understood moves, which are made into reflexes by the years of practice. That's how I'm able to write good-quality code fast enough to meet the typically impossible timelines. Any "cleverness" is a recipe for disaster, and I had quite a few occasions to curse myself for trying to be clever at the expense of immediate understandability and clarity.

So, any psychological crutch (like rich syntax) helps. Coming up with good syntax is artistry and psychology way more than CS (that may explain why most languages are so horrible). It is only too easy to get carried away in any direction, from total austerity (LISP) to profligacy and noise-like compactness (APL, hmm, and to a lesser extent, Perl) or inane verbosity of COBOL. Note that even the purported citadel of syntactic purity was corrupted quickly with not strictly necessary things like quotes, square brackets, and, well, quite a few styles of macros :)

Logically, the infix notation is definitely inferior to postfix or prefix notations; but this is the conventional notation of mathematics. That's what people are trained to use from their mid-school years. Mentally translating from one notation to another is an unnecessary step, which is better done by the software, leaving people with a smaller gap between their mental models and the code they need to write or comprehend. Of course, anyone is free to design a language ignoring that training, and to insist that people should rather master thinking in reverse polish notation or something - but what happens is that most people will simply move on to a language more convenient to them. To date, no language insisting on non-infix notation (or ignoring conventional operator precedence rules) has gained more than niche acceptance (PostScript may be an exception, but very few people actually write in it).

Of course, this can be changed. Just rewrite math textbooks, convince educators and everyone dealing with formulae to change their ways. And, yeah, while you're at it, it'd be a good idea to switch to octal from decimal.

(Anonymous)

Natural Languages?

Of course, this can be changed. Just rewrite math textbooks, convince educators and everyone dealing with formulae to change their ways.
Even that only evades the issue. Because why have hundreds of years of mathematics evolved a sub-optimal method of expression? In fact, we might well ask why (all? most? many?) natural languages are complex context-sensitive beasts. Are we merely at the mercy of the very first grammarian, or is there something in our primitive brain which is wired to be more receptive to complex grammars?

Re: Natural Languages?

(1) The usual mathematical notation with lots of symbols in an infix way makes a lot of sense when you write on a 2D piece of paper or blackboard. But it's just not adapted to text-based programming.

(2) Grammars are not complex, but big. That is, each rule is simple, but there are lots of rules. A massively parallel brain can cope with that. That is, as long as it has to deal with similar stuff.

(3) Programming is different from the usual stuff. Semantics is part of programming, and it's part of what any programmer must master so as to write decent programs. Some languages make it easier to discuss semantics than others.

(4) Lisp doesn't prevent you from learning lots of rules. Actually, in Common Lisp (as opposed to, say, Scheme), there is a tremendous number of rules. But these rules are in the semantics, not in the syntax, so that the programmer can focus on what matters rather than be diverted by details.

1) The screen in front of me is a flat 2D thingie with somewhat worse image quality than printed paper. Did I miss something? And, yes, it is perfectly capable of displaying math notation, as evidenced by the snippet of a paper on LQG in the adjacent window :)

2) The human brain is massively parallel. And it has the benefit of some billions of years of evolution in dealing with large, detailed spatial environments. Which do not have any recursion to speak of, are not regular or repetitive, and cannot be combined in any meaningful way. The most important thing is to recognize them by a multitude of visual, aural, olfactory or tactile cues - and so not to get lost.

The grammar-capable brain regions are relatively recent, 100k years or so. They're small, too (check Broca's area on a brain anatomy map, for example, and compare it with the size of the visual cortex: the occipital areas V1-V4 and the inferotemporal lobe). No natural language has anything like deeply nested structures (interestingly, the most "recursive" natural language is Vietnamese :)

The simple truth is that people are very poor thinkers when faced with recursive structures. Gödel's results of 1929 could well have been accessible to Aristotle or Mohammed Al Khorezmi, but it took, well, millennia to come up with what, in retrospect, seems to be a fundamental but rather simple statement about the foundations and meaning of logic.

Most practical programming projects are, in fact, very similar. There's quite a limited set of movements, algorithms or tricks an "average" programmer has in his bag. Those are practically always sufficient to get the job done.

3) Programming is different from the usual stuff. Yes, it is. It requires people to perform unnatural tasks, all day long. That's why it is much harder than, say, driving. But, at the end of the day, both professions are about telling machines what to do.

The result is obvious - most people are extremely poor programmers, and even good programmers are apt to make frequent mistakes. A driver with comparable error rate would get himself killed on the very first trip.

Some languages make it easier to discuss semantics than others.

Programming is not about discussing - it is about translating intent into code. The descriptive power is often counterproductive - replacing mechanically learned movements with deliberations and reasoning is a sure way to sabotage the actual task - namely getting from "we want this and that to be done" to "computer: do this and that".

4) Syntax matters. The semantics of all useful languages is identical. Anything which can be written in LISP can be written in C or assembler, or FORTRAN, for that matter, and vice versa. Most likely in a roughly comparable number of tokens, on average. The only difference between languages is, by and large, the syntactic sugar, and the amount of dirty work the language compiler/interpreter and run-time do for the programmer.

LISP per se isn't a very semantics-rich language. You can write libraries which do a lot of things in it, but you can do that in any other language as well. If we're talking about semantic power, the "best" language is, of course, the Unix shell. It has an entire mature OS backing it. Still, it is only used as a glue, as it sucks in other aspects. (Funnily, at some time I used LISP as a shell, on a PDP-11, just for the heck value of it... but in the end I went back to csh - brevity is important in a command-line interface :)

There are other things besides syntax which matter - like restrictions. The protection of the programmer against his own mistakes is an important function of languages, but that, by necessity, limits his freedom of expression. Or take the inability to write self-modifying code. From the purely expressive-power point of view it is a shortcoming. From the point of view of writing debuggable, robust and reasonably secure software - it is a significant advantage. And the significance of these restrictions grows with the size and complexity of the programming project - nobody in his right mind would do anything as complicated as an OS, air traffic control, or (yes) loan processing in an average bank in any language which allows the programmer to change module interfaces dynamically.

Cognitively, you don't want a shared terrain to change features rapidly - it will make any kind of collaboration impossible. Of course, this can be achieved with some degree of self-discipline in any language, but it is much better to have discipline actually enforced. You can do that in LISP by insulating a programmer from the core language, but all you get this way is the same C++/Java/Ada, but with verbose syntax.

(Anonymous)

<you><are absolutely="right"><people><dont><like deeply="nested"><expressions></expressions></like></dont></people></are></you>
When was the last time you heard someone who liked XML?

XML was meant to be easy for a machine to parse, not for a human to read. (Humans were supposed to use tools if so necessary.) Further, it was meant for data, not code.

(Anonymous)

"XML documents should be human-legible and reasonably clear." - The design goals for XML (http://www.w3.org/TR/REC-xml/#sec-origin-goals)

XML programming languages...

Scary as it may seem, some people program in XML. I myself had to use XSLT once... ouch. Plenty of other such languages exist, including Water, MetaL, o:XML, XL, XS, etc.

(Anonymous)

See also: http://lemonodor.com/archives/001096.html

LISP in college?

I utterly disagree with you that people are exposed to modern lisps in college. Just about every CS graduate I've talked with expressed a distaste for lisp because: 1) it's only interpreted, 2) lists are the only data structure, and 3) it's only about recursion. All of these are demonstrably false, yet the myths persist because of the half-assed way it's presented.

Your "reflexes after years of practice" sounds like code that you should only have to write once. Lisp lets you do that.

Infix vs prefix notation for mathematics is a red herring. Any mathematical expression complicated enough to cause confusion in one notation will still be complicated when converted. In practice, it's a non-issue.

(Anonymous)

Re: LISP in college?

It really does say something (bad) about colleges that teach: 1) that it's interpreted (although the point of a Lisp interpreter in Lisp is not about interpreters); 2) that it's "only about recursion", when even chapter 1 of SICP explains the difference between recursive /procedures/ and recursive /processes/ vs iterative /processes/, a difference that, e.g., C blurs all too well, to the point of causing mental handicap and blindness.
Maybe some have difficulty in learning these concepts, but you can't blame Lisp for that...

(Anonymous)

Write it yourself

If you have such a problem with prefix math, then write an infix-to-prefix reader macro for Lisp. That's what I did as a little exercise playing with reader macros. It was buggy, but it worked. I could write the following:

(print {2 + 3 * 4})

and get 14 printed. So get to work, and learn something in the process: infix is a pain to implement.
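For what it's worth, the core of such an infix reader is small in any language. Here is a hedged sketch in Python (my own reconstruction of the idea, not the commenter's actual macro) of precedence climbing: it turns `2 + 3 * 4` into the prefix form `(+ 2 (* 3 4))` and evaluates it to 14. All names below are my own.

```python
# Toy infix reader: parse a flat token list into a prefix (Lisp-style) tree,
# honoring the usual precedence, then evaluate it. No parentheses handled.
PREC = {"+": 1, "-": 1, "*": 2, "/": 2}
OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
       "*": lambda a, b: a * b, "/": lambda a, b: a / b}

def parse_infix(tokens, min_prec=1):
    left = int(tokens.pop(0))
    while tokens and PREC.get(tokens[0], 0) >= min_prec:
        op = tokens.pop(0)
        # Parse the right operand, letting tighter-binding operators grab it.
        right = parse_infix(tokens, PREC[op] + 1)
        left = [op, left, right]
    return left

def evaluate(tree):
    if isinstance(tree, int):
        return tree
    op, a, b = tree
    return OPS[op](evaluate(a), evaluate(b))

tree = parse_infix("2 + 3 * 4".split())
print(tree)            # ['+', 2, ['*', 3, 4]]
print(evaluate(tree))  # 14
```

Even this toy version, which ignores parentheses, unary minus, and error handling, already has to carry a precedence table and a climbing loop -- which is the commenter's point about infix being a pain to implement.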

(Anonymous)

Re: Write it yourself

What kind of response is "write it yourself"? Why not use a programming language that is actually bundled with a non-trivial parser so you don't have to write your own?

True cleverness

You are making two assumptions: 1) that this is a foreign culture, and 2) that this is all a matter of psychological hang-ups of the "mainstream" programmers.

Well, 1) is basically an accusation of being uneducated, or simply unable to comprehend the obvious superiority of your preferred approach. You are making very unwarranted assumptions about your opponents. I am a specialist in programming languages (among other things), wrote some compilers and interpreters, and consider myself rather familiar with LISP and its numerous progeny (Planner, Scheme, etc). I would reiterate that any person with a university CS education was exposed to LISP, and had a fair chance to learn how to use it. In any case, I do not think that this is an argument which is suitable for a civilized discussion.


Just because you were exposed to something in university doesn't mean you know very much about it. I spent an entire semester doing Smalltalk and C++ as a graduate student, and was cited as the best Smalltalk programmer by the TA. It wasn't until years later that I realized that I knew nothing about doing good Smalltalk, and that my class was basically useless. (I have been doing Smalltalk now for 9 years.)


Of course, this can be changed. Just rewrite math textbooks, convince educators and everyone dealing with formulae to change their ways. And, yeah, while you're at it, it'd be a good idea to switch to octal from decimal.


Here, you are admitting that it *is* cultural. As a practical matter, it is important. However, this statement doesn't show much self awareness as a programmer.

Any "cleverness" is a recipe for disaster, and I had quite a few occasions to curse myself for trying to be clever at the expense of immediate understandability and clarity.


I call to question your "cleverness" then. True cleverness results in greater clarity while reducing the repetition of code.

It sounds like you repeat lots of code.

It's a protective reflex against the cost of having to actually learn something new and different. The problem about Lisp here is not directly related to technical factors, but only to the fact that Lisp culture is not mainstream.

Many of us have actually forgotten LISP (heh, I wrote some stuff in it for the BESM-6 when I played with language transforms in the university). Pascal and LOGO, too. It is not new, by any means. No doubt, C is a glorified assembler, C++ is ugly and sometimes infuriating, and Java is a fascist's wet dream, but somehow they're still the most practical tools for doing things like operating systems, object request brokers, databases, routing software or (oh, horror) business apps. I wish it could be better, as there is clearly a lot of room for improvement (my pet peeve is that Algol-68 was undeservedly forgotten - it had lots of interesting ideas in it, some of which found their way into a variety of languages; and, besides, I worked on an A-68 compiler project, which makes me impartial :) However, insisting on something which was created a long time ago, is well known, and was found lacking by the only people whose opinions should matter to the algorithmic language research community - namely, application programmers - does not strike me as a particularly constructive approach to improving the situation.

(Anonymous)

Programmable programming language?

For me, the problem with lisp syntax is not that it's unreadable and that parentheses are weird (in fact, I find them to be a very elegant idea) but I always think about the programmable programming language thing. It's true, Lisp is flexible, but at what cost? At the cost of having no syntax at all. The point is that lisp does not make user-made extensions (functions, macros) as powerful as all the other language constructs; it makes all the "default" language constructs as powerful as user ones. It makes you "customize" the language just because everything in the language is "like" user-made functions. Of course doing that has its advantages; no other programming language can be as metaprogrammable as lisp, so it's a matter of taste. There is Tcl, which is really close to lisp, but without parentheses and with a nice "trick" to eval infix syntax for mathematical expressions (and I wonder why lispers usually do not seem to care much about it). There are some C-like languages, for example, that have lisp-like macros, but they are even less flexible. There are some easily extensible compilers and interpreters that don't use macros at all (the extension is done at compiler level; it's not a language feature), like Lua... So it's only a matter of choice: you have to choose the tradeoff between a rich and nice syntax fitted only for one programming paradigm, and a language that can be adapted to various paradigms but that does not support any of them with a syntax suited just for them. I don't think that lispers are right when they say that lisp is a superior language with no flaws, even if we restrict the discussion only to its syntax and don't go into other matters (dynamic vs static, low vs high level, standard library etc...) - Angelo

(Anonymous)


(Anonymous)

fixing lisp

(deffun foo()
--(let ((a 1) (b (+a 1 2) (c 3))
----(if (= a 1)
-------(setq c 2)
-------(progn
----------(setq a 1)
----------(do ((i 1 (+ 1 i)))
-------------((> i 5))
-------------(print i)))))))

(unsure of correct amount of closing parentheses)

The above program could be written with the exact same syntax using python-like whitespace conventions as:

deffun foo()
--let a 1 b (+a 1 2) c 3
----if = a 1
------setq c 2
------progn
----------setq a 1
----------do i 1 (+ 1 i)
------------> i 5
------------print i

The if struct could be clearer with the python/ruby/vb conventions of using then or : and else keywords, but any S-expression can be written with minimal parentheses by using newline and indentation (for scope) conventions from python. The only way lisp is readable is with these conventions anyway, but it's mind-boggling why we have to type them in... when a pre-processor could do without them.

The other improvement to lisp parsers would let you write the if form as (if (exp) :then (doiftrue) :else (doifnot)) -- and there is no reason, since the terms must be ordered, not to be able to parse (if exp : doiftrue : doifnot). While if tends to be readable without the extra sugar, the do form is a nightmare; (do (varsetup) :untilcond (exp returnval) :body (...)) would clean it up. There is no more reason to work with pure s-expression formatted code than there is with xml. It's totally unacceptable to force the lisp absurdity on the user.

The world has not moved to python because it executes 10 times slower than lisp, or because it hasn't bothered to implement lisp macros yet; it went there because the language is sane, and the author has a basic respect for making the language usable.
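The preprocessor the commenter asks for is easy to prototype. Below is a deliberately naive sketch in Python, my own and assumption-laden: it splits lines on whitespace only, requires consistent indentation, and does not handle inline parentheses like (+ 1 i), so it is a toy rather than a real sweet-expression reader.

```python
def indent_to_sexp(text):
    """Turn indentation-structured lines into nested lists (s-expressions).
    Each line becomes a list of its tokens; a deeper-indented line becomes
    a child of the nearest shallower line above it."""
    root = []
    stack = [(-1, root)]                 # (indent depth, node) pairs
    for line in text.splitlines():
        if not line.strip():
            continue
        depth = len(line) - len(line.lstrip())
        node = line.split()
        while stack[-1][0] >= depth:     # pop back out to the parent level
            stack.pop()
        stack[-1][1].append(node)
        stack.append((depth, node))
    return root

def to_sexp_string(x):
    """Render the nested lists back as parenthesized Lisp text."""
    if isinstance(x, list):
        return "(" + " ".join(to_sexp_string(e) for e in x) + ")"
    return x

src = "print\n  + 1\n    * 2 3"
print(to_sexp_string(indent_to_sexp(src)[0]))  # (print (+ 1 (* 2 3)))
```

Even this toy shows the design issue such a preprocessor must settle: whether a single-token line is an atom or a one-element form, which is exactly the kind of ambiguity the uniform parenthesized syntax avoids.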

(Anonymous)

Re: fixing lisp

Try this:
http://www.dwheeler.com/readable/

(Anonymous)

Re: fixing lisp

That's still not quite right, since you can't use a variable bound in the same let to define another one within the same let-head; they are bound "in parallel":
(let ((a 1)
(b (+a 1 2)) <- This will fail
(c 3))
You could however use let* for that kind of thing.
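For readers without a Lisp at hand, the parallel-versus-sequential distinction has a rough analogue in Python (my own illustration, an analogy rather than a translation): tuple assignment evaluates every right-hand side before binding any name, like let, while successive assignments bind one at a time, like let*.

```python
def parallel_binding():
    # Like LET: both right-hand sides are evaluated before 'a' or 'b'
    # exists, so referring to 'a' here fails, much as (b (+ a 2)) fails
    # in a LET that also binds 'a'.
    a, b = 1, a + 2          # raises UnboundLocalError (a NameError)
    return a, b

def sequential_binding():
    # Like LET*: each binding may use the ones established before it.
    a = 1
    b = a + 2
    return a, b

print(sequential_binding())  # (1, 3)
try:
    parallel_binding()
except NameError as err:
    print("parallel binding failed:", err)
```

The analogy is loose -- a real LET evaluates init forms in the enclosing scope, so it only fails when no outer binding exists -- but it conveys why the commenter's (b (+a 1 2)) needs let* rather than let.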

Rationalization

In the end, when people attack the syntax of Lisp, it is most usually but a rationalization without any technical merit regarding the syntax itself. This rationalization serves to cover up a defense mechanism against a foreign culture.

I agree heartily, and said the exact same thing in my essay What I've Learned From Sales, Part I: Don't Feed the Trolls (http://weblog.raganwald.com/2007/01/what-ive-learned-from-sales-part-i.html).

As Greg House would say, "Everybody Lies," and when it comes to rationalizing ways to avoid doing anything that isn't mainstream, everybody tells the same lies, over and over again :-)