Extreme syntax

In his book Let Over Lambda, Doug Hoyte says

> Lisp is the result of taking syntax away, Perl is the result of taking syntax all the way.

Lisp practically has no syntax. It simply has parenthesized expressions. This makes it very easy to start using the language. And above all, it makes it easy to treat code as data. Lisp macros are very powerful, and these macros are made possible by the fact that the language is simple to parse.

Perl has complex syntax. Some people say it looks like line noise because of its liberal use of non-alphanumeric characters as operators. Perl is not easy to parse — there’s a saying that only Perl can parse Perl — nor is it easy to start using. But the language was designed for regular users, not beginners, because you spend more time using a language than learning it.

There are reasons I no longer use Perl, but I don’t object to the rich syntax. Saying Perl is hard to use because of its symbols is like saying Greek is hard to learn because it has a different alphabet. It takes years to master Greek, but you can learn the alphabet in a day. The alphabet is not the hard part.

Symbols can make text more expressive. If you’ve ever tried to read mathematics from the 18th or 19th century, you’ll see what I mean. Before the 20th century, math publications were very verbose. It might take a paragraph to say what would now be said in a single equation. In part this is because notation has developed and standardized over time. Also, it is now much easier to typeset the symbols someone would use in handwriting. Perl’s repertoire of symbols is parsimonious compared to mathematics.

I imagine that programming languages will gradually expand their range of symbols.

People joke about how unreadable Perl code is, but I think a page of well-written Perl is easier to read than a page of well-written Lisp.  At least the Perl is easier to scan: Lisp’s typographical monotony makes it hard to skim for landmarks. One might argue that a page of Lisp can accomplish more than a page of Perl, and that may be true, but that’s another topic.

* * *

Any discussion of symbols and programming languages must mention APL. This language introduced a large number of new symbols and never gained wide acceptance. I don’t know that much about APL, but I’ll give my impression of why I don’t think APL’s failure is proof that programmers won’t use more symbols.

APL required a special keyboard for input. That would no longer be necessary. APL also introduced a new programming model; the language would have been hard to adopt even without the special symbols. Finally, APL’s symbols were completely unfamiliar and introduced all at once, unlike math notation, which developed worldwide over centuries.

* * *

What if programming notation were more like music notation? Music notation is predominantly non-verbal, but people learn to read it fluently with a little training. And it expresses concurrency very easily. Or maybe programs could look more like choral music, a mixture of symbols and prose.

41 thoughts on “Extreme syntax”

  1. Benoit Hamelin

    Interesting post, given how these days I am reading a lot on APL and the family of “array languages” it has spawned. It is true that APL, as well as its modern incarnation J (http://jsoftware.com/), yields programs made mostly of symbols, which can be harder to read than to parse. Syntax-wise, J is a bit like Lisp, in that there is rather little of it. J’s complexity lies in the multiple semantics of its symbols, but that’s another story, as you say, John.

    Here’s a perspective on these complicated array languages from Keith Smillie, who apparently used them throughout his career: http://webdocs.cs.ualberta.ca/~smillie/Jpage/MyLife.pdf

  2. Benoit Hamelin

    Also, I completely agree with the music notation analogy. Even complicated languages can be wrangled, given time and training. This is like the more complicated (some read that as “idiosyncratic”) editors out there, like Emacs and Vi[M]. All these things can be made to sing, and the music one makes out of them informs and influences the further musical endeavors of the player.

  3. There’s a lot to your post and I’m not quite sure I follow how you got from the first paragraph to the last, but there is one point I’d like to make. You use the term “Lisp” in the same way as you use “Perl” — to reference an X vs. Y relationship between syntactic approaches. However, “Lisp” is a family of languages whereas Perl is a language. Various Lisps have tackled the syntax “problem” and have landed on various manifestations from Dylan to Common Lisp to Shen to Clojure. It’s unfair to say that Lisp has a “typographical monotony” because that assumes that all Lisp syntaxes are the same, which is simply not true.

  4. One other point if I may:

    > and these macros are made possible by the
    > fact that the language is simple to parse.

    That’s not entirely true. Lisp macros are facilitated by the fact that (most) Lisp programs are composed of the same data structures that the language itself deals with. Therefore, processing and transforming a Lisp program via macros is done with the exact same tools (e.g. car, cdr, cons, etc.) found in the language itself — nothing special is needed. Many Lisps prefer to minimize the conceptual distance between the syntax and the data structures used to represent the program, but that’s not to say that the syntax itself helps macros. It might help a programmer visualize the macro transformations more easily, but macros couldn’t “care” less about that.

  5. I don’t know much about Lisp, but from what I’ve seen of Common Lisp, Scheme, and Emacs Lisp, I’d say “typographical monotony” applies.

    Clojure is a little different. I liked what I heard Rich Hickey say somewhere about how traditional Lisp overloads symbols unnecessarily but that Clojure uses different symbols for different things. I imagine it would be easier to scan a page of Clojure than a page of Scheme, for example.

  6. I think with Lisp it depends on how you name your fns & APIs. When I’m using well developed Clojure libraries and name my fns in some readable way I can produce code that is almost sentence-like… if you kinda get used to all the parens around, you can then read the code like prose… (and I think this applies to CL too)
    With Perl, it’s just too maximalist for me :)

  7. While your primary argument is about syntax, there is another issue just below the surface of your post: a language’s level of abstraction. APL is designed to operate on arrays, while Perl, like just about every other mainstream language, deals with collections of data items one at a time.

    There have been set-at-a-time languages for many years, going back at least as far as APL, but they seem to have trouble with adoption. The major such language is, of course, SQL. This issue has been studied, a little, in “Human Factors Comparison of a Procedural and a Nonprocedural Query Language”, by Charles Welty and David W. Stemple, ACM TODS 6.4 (1981). (There may be later papers addressing this point, I’m not sure.)

    Lots of developers hate SQL, and this explains much of the momentum behind the NoSQL approach. Many developers can’t write SQL at all, or can use it in only the most rudimentary way. I think that there are two aspects of SQL that explain this. First, SQL is somewhat algebraic — queries operate on tables, and query results are themselves table-like. (The underlying relational algebra has this idea in a much clearer form.) Second, people don’t get joins. I see many programmers who are much more comfortable with the idea of iterating over the first table, and then using data from the current row to go and get more rows from the second table. Which is consistent with the Welty and Stemple paper.
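
    As a rough sketch of the two mindsets in Perl with DBI (the database file, tables, and columns here are hypothetical, chosen purely for illustration):

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:SQLite:dbname=shop.db', '', '',
                           { RaiseError => 1 });

    # Set-at-a-time: one declarative join; the planner does the iterating.
    my $joined = $dbh->selectall_arrayref(
        'SELECT c.name, o.total
           FROM customers c
           JOIN orders o ON o.customer_id = c.id');

    # Row-at-a-time: loop over the first table, then probe the second
    # for each row -- the style many programmers find more comfortable.
    for my $c (@{ $dbh->selectall_arrayref('SELECT id, name FROM customers') }) {
        my ($id, $name) = @$c;
        my $orders = $dbh->selectall_arrayref(
            'SELECT total FROM orders WHERE customer_id = ?', undef, $id);
        print "$name: $_->[0]\n" for @$orders;
    }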

    Another aspect of this discussion is languages vs. libraries. There is really no reason that set-at-a-time processing cannot be done, at least to some extent, in existing languages with collection libraries. Yet these libraries are primarily geared to iteration, not set-at-a-time operations.

  8. I thought about saying something about regular expressions in the post. Regular expressions are hard on the eyes, but I’d say three things in their defense.

    1. Regular expressions are dense. Equivalent procedural code would be less dense but possibly more work to comprehend.
    2. Regular expressions are much easier to understand when broken up with whitespace and commented, as with Perl’s /x option (see the sketch after this list).
    3. The biggest thing that makes regular expressions difficult is the small, overloaded set of symbols. Using a bigger palette of symbols and representing distinct ideas with distinct symbols would help.
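
    To make point 2 concrete, here is a minimal sketch; the date pattern is hypothetical, chosen only for illustration:

    use strict;
    use warnings;

    # Dense form of a (made-up) ISO date matcher:
    my $dense = qr/^(\d{4})-(\d{2})-(\d{2})$/;

    # The same pattern under /x: whitespace is ignored and
    # comments are allowed inside the regex itself.
    my $readable = qr/
        ^
        (\d{4})   # year
        -
        (\d{2})   # month
        -
        (\d{2})   # day
        $
    /x;

    print "year $1, month $2, day $3\n" if '2013-02-14' =~ $readable;
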
  9. I think music notation has a simpler job. Which notes, when, how long and what volume. Maybe assembly would be a closer analogy. Programming notation has to capture new concepts, and concepts on top of concepts.

  10. The music analogy is interesting. With music there is a consistent time against which the symbols are written and played. In imperative programming, statements and expressions are explicitly sequenced. In functional programming, the relationships are functional and the sequencing is implicit (with provisions to be explicit about sequence when necessary).

    In databases, the trend is toward time being a first class construct so that the state of the db at any point in time is immutable (see Datomic). Never overwrite. In music, a section may be repeated, usually with written or interpreted differences but time is a linear arrow in one direction. There may be a different version of a previous thing but it is later in time so it can’t change the first.

    Sean McGrath is doing interesting work along these lines with legislative documents and processes – http://blog.law.cornell.edu/voxpop/2012/11/16/digital-law-what-lawyers-need-to-learn-from-accountants/.

  11. > I don’t know much about Lisp

    Well, I don’t know much about Perl, so can I say “orgiastic typography?” ;-)

    Also in defense of Clojure (a Lisp):

    1. Clojure is dense. Equivalent procedural code would be less dense but possibly more work to comprehend.
    2. Clojure is much easier to understand when broken up with whitespace and commented.
    3. Clojure uses a bigger palette of symbols and literals for representing distinct ideas.

    See what I did there?

  12. In Perl6 there are at least a few (non-ASCII) Unicode operators specced out (« »), although they (so far) all have ASCII equivalents (<< >>).

    http://tablets.perl6.org/tablet-4-operators.html#hyperops

    Perl6 also allows you to easily add operators.

    sub postfix:<!> { [*] 1..$^n } # 1*2*3*4 … $^n

    say 5!;

    http://www.perlfoundation.org/perl6/index.cgi?amazing_perl_6

    sub prefix:<∑> { [+] @^n } # sum (Is my use of ∑ correct?)

    my @a = 1..5;

    say ∑ @a;

    say ∑ (@a,5);

    Actually that is how most of the language is written in the Rakudo Perl6 compiler.

  13. Ronan: Good point.

    I believe Perl 6 is much better in that regard. It has a language specification and multiple implementations. And I believe the first Perl 6 parser was written in Haskell.

  14. LISP’s parentheses seem to put many people off, but there’s no reason we can’t ‘ghost them out’ into the background and use indentation to show the structure (there are some pre-processors which allow us to throw away the parentheses entirely, but that’s not needed if we have an editor with auto-indenting and syntax highlighting).

    If we do this, we get a screen filled with symbols just like most other languages. In this setting LISP still has some parsing quirks for new (English-speaking) users; for example, prefix notation and higher-order functions often end up with code that’s ‘backwards’ or ‘inside-out’. An example of the former would be “x is less than y” being written as “(< x y)” rather than “(x < y)”; an example of the latter would be anything in continuation-passing style.

  15. What makes Perl syntax so difficult isn’t the symbols, it’s that you can’t even know what Perl will compile to without executing it: http://www.perlmonks.org/?node_id=663393

    So you lose the ability to reason about it at all, unless you’ve managed to restrict yourself to a subset of Perl, and that’s definitely not something people are willing to do.

    The symbols aren’t a problem at all, since the APL and J languages are very symbolic, yet they don’t require running code to understand.
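
    The linked PerlMonks node is, if I recall correctly, Jeffrey Kegler’s argument that parsing Perl is undecidable. A condensed sketch of the kind of ambiguity involved (a paraphrase, not the original code):

    use strict;

    sub whatever() { 2 }   # empty prototype: whatever() takes no arguments

    whatever / 25 ; # / ; die "unreachable";

    # With the empty prototype, the line above parses as division:
    #     whatever() / 25;
    # and everything after the '#' is a comment, so the program lives.
    # Declare `sub whatever { 2 }` (no prototype) instead, and the very
    # same line parses as a regex match passed as an argument:
    #     whatever(m/ 25 ; # /); die "unreachable";
    # so the program dies. Because the choice of prototype can depend on
    # run-time information (e.g. inside a BEGIN block), no static parse
    # is guaranteed to be correct.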

  16. Warbo: I agree that with well-indented Lisp, you can almost dispense with the parentheses. But I disagree that Lisp code minus parentheses looks like other programming languages. There’s less visual variety.

    Evan: In practice most Perl is not nearly as exotic as the language allows.

  17. I have used Perl a lot for the last 10 years. When I was a student (1990), I used a lot of functional languages (Scheme, Yafool, Caml). The evolution of functional languages has been toward the introduction of types and of syntactic sugar (like in Haskell or Scala).

    When programming, it is not very annoying to type in all the parentheses of Lisp (Scheme, CLisp, Elisp, Clojure, …), but when you need to read the code, they are real visual pollution.

    In Perl, when you read the code of someone else, you can feel his way of thinking. When you program, you have a wide choice of constructions to structure your solution.

  18. In python3/perl6 you can have Unicode for function names.

    But the implementation is just vomit.

    Supporting \sigma-style syntax (images?! for vars and functions) and postfix/infix notations in method overloading will take a long time to make a programming language a “notation for notations”.

    You can have prefix Unicode in Lisp but Lisp programmers will never accept it because they think indentation is more meaningful than notation.

  19. @John thank you for your thoughtful and well-considered response to my reply to your blog. I do respectfully disagree with your statement that “In practice most Perl is not nearly as exotic as the language allows”, and even if it were true it wouldn’t actually be relevant since we’re discussing the reasons Perl is difficult to read and parse (which is a mechanical form of reading), and not the common usage of Perl.

  20. Evan: It’s great that you “respectfully disagree.” When I started blogging, I was reluctant to allow comments. But I’ve been surprised at how civil and thoughtful the comments usually are. I really appreciate the feedback.

    That leads to my point of worst-case vs. average-case behavior. The worst-case behavior of comments is atrocious, but the average case is very good. Likewise most Perl code (or code in any language) stays well away from the limits of what the language permits. As the old saying goes, you can write Fortran in any language. :)

    I agree that worst-case behavior is important. Language implementers and tool designers have to consider what is permissible rather than just what is common. (Optimizers, however, can indeed look at what’s common. That’s the secret behind the tremendous speed-up of dynamic languages like JavaScript.)

    Another angle is mistakes. One problem with Perl is that even if I don’t intend to use some of its more exotic features, I might use one of them by accident.

  21. @John

    > One problem with Perl is that even if I don’t intend to use some of its more exotic features, I might use one of them by accident.

    *by default*

    http://markmail.org/message/h2spyi5za4qheuft
    (state of Modern Perl I tell you)

    For some reason, Perl programmers think it’s the syntax that drives people away from it.

    Yet I find *tons* of python programmers happily writing bash scripts.

    *Bash is predictable*

    The number of **quantum traps** that perl programmers put up with is *amazing*

    http://blogs.perl.org/users/rurban/2013/02/no-indirect-considered-harmful.html#comment-370624

  22. I remember APL on the Honeywell system I used in high school (it was at a local college). That was 1977, and when we went to the college we could see the “APL terminal”. It was a very expensive Tektronix system with a vector display and that wild APL keyboard; there were two for the entire college. Everyone else used either an ASR-33 or an LA36 DECwriter, of which there were well over a hundred. That there were only two APL terminals was a big part of its lack of use. You could get a strange ASCII version of your APL program, but between the APL symbols in the manuals and then the conversion to ASCII it just was too much. FORTRAN or BASIC was the way : )

    Checking the web, it most likely was a Tektronix 4015; they were almost 9 grand in 1974 dollars!

  23. I remember reading an interesting HBR article (unfortunately it seems now behind a pay-wall) regarding the Mumbai dabbawalla system.

    The tin cans used to carry the lunches had a couple of different symbols.

    The most frequently used symbol was a somewhat cryptic but very short alphanumeric code. (If you use it a lot you learn it quickly).

    The other (less frequently used) symbol was somewhat longer and less cryptic.

    The conclusions are fairly self-evident: Constructs that are used both widely and frequently can (should) be shorter (even at the expense of being more cryptic). Constructs that are used more infrequently should be less cryptic.

    Now that we have a considerable body of code available for analysis through repositories such as github, I wonder if we could do a simple analysis for the use of particular language constructs or APIs, with the goal of identifying constructs that could have been made shorter (or longer)?
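
    A back-of-the-envelope version of that analysis might look like the following Perl sketch (crude regex-based token counting over a checked-out repository; a real study would use a proper lexer, and the file extensions are just an example):

    use strict;
    use warnings;
    use File::Find;

    my %count;
    find(sub {
        return unless /\.p[lm]$/;   # only Perl sources, as an example
        open my $fh, '<', $_ or return;
        while (my $line = <$fh>) {
            # count short runs of symbolic characters as operator candidates
            $count{$_}++ for $line =~ m{([=!<>+\-*/%.~^&|?]{1,3})}g;
        }
    }, @ARGV ? @ARGV : '.');

    # the most frequent constructs are the ones that have earned brevity
    printf "%6d  %s\n", $count{$_}, $_
        for sort { $count{$b} <=> $count{$a} } keys %count;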

    I agree with the bit about typographical monotony – although my Lisp experience is virtually non-existent, I lean on spatial awareness quite heavily to navigate through large source files, and tend to format my code to leave landmarks that I can easily spot in Sublime Text’s outline view.

  24. Ah, but more frequently used by whom? A good fit is a language whose community does what you do, and makes your tasks convenient.

    I saw a presentation by Damian Conway last year where he argued that Perl usage has changed quite a bit over the years and that Perl 6 is designed to make things that people do now more succinct.

  25. “Constructs that are used both widely and frequently can (should) be shorter (even at the expense of being more cryptic). Constructs that are used more infrequently should be less cryptic.”

    Larry Wall (the king of Perl) frequently refers to this principle when deciding which symbols/tokens to use for particular operations, saying “Each symbol has to justify its existence according to Huffman coding.”

  26. I’ll just leave this here: http://www.neverworkintheory.org/?p=197

    There should be further study, as the sample size was low, but it concludes that a fixed random syntax isn’t worse for writing correct programs than Perl’s syntax, and that a completely new language designed for easy syntax is better than both random and Perl syntax.

  27. I agree that language designers have to consider what is permissible, but so do optimisers! A static compiler may spot many potential optimisations in a program but refuse to perform them because some language feature would make them semantically incorrect. JIT compilers suffer this too, but not so much, since they have more information to hand.

    For example, it might make perfect sense for my application to store its configuration settings as literal values, but underneath several layers of indirection. Now let’s say those values never change, so it’s valid for the optimiser to strip away the indirection and plonk the values verbatim into all the myriad places they’re referenced. For kicks, these would be spotted by subsequent passes and lead to the elimination of lots of dead branches (for debugging, platform-specific fixes and missing-dependency workarounds) and speed up all manner of things.

    However, the compiler refuses to perform this optimisation because it can’t prove that I never use reflection to re-route the indirection layers, even though I don’t.

    A lot of languages like to add features willy-nilly, assuming that because it’s new stuff it won’t lead to problems with existing code; however, this is simply not true. If a language adds a new feature, let’s say exceptions, then it can undermine many of the assumptions of existing code; in this case it would break code like “setup(); foo(); cleanup();”, since “foo” could throw an exception and lead to “cleanup” never being called.

  28. I’m not convinced that lisp is syntactically monotonous; it uses only parentheses for syntactic grouping, but most languages use only () and {}.

    As far as non-alphabetic characters go, it perhaps has a few fewer built in than C-like languages:

    http://www.lispworks.com/documentation/HyperSpec/Front/X_Alph_9.htm
    http://www.lispworks.com/documentation/HyperSpec/Body/02_dh.htm

    but it’s not uncommon for libraries to define other characters to have meaning appropriate to the domain at hand (which is perhaps closer to how mathematics works than perl).

  29. John: okay then, here’s an anecdote. I once spent a week trying to learn Perl (I knew BASIC, Scheme, and C/C++ at the time). I failed. A few days after failing at Perl, I picked up Python in an hour, and have been using it for 14 years.

    To me, even if many Lisp variants have annoying paren issues, I can spend time and suss out what is going on. I can’t guarantee equivalent success when staring at Perl.

  30. Re: APL. One of the biggest issues with APL adoption (other than the esoteric symbol set in an era where most terminals definitely couldn’t show them) was the fact that the language itself wasn’t very extensible unless you were an APL maintainer yourself.

    My experience with APL was just that it was a great language as long as you wanted to string together the pile of combinators it had, but you could hardly bring yourself to write any composable code in it.

    You could write functions in APL, but they didn’t feel like APL. They wound up with verbose non-symbolic names, and they were often quite jarring when juxtaposed with the base primitives supplied by the language.

    Guy Steele gave a seminal talk on “Growing a Language” back in 1998 — you can find a video of it on YouTube: http://www.youtube.com/watch?v=_ahvzDzKdB0 — and while I may disagree with some of the particulars of what constitutes “enough” functionality to permit user extensions to feel natural, APL definitely fell short of that mark.

    I’ve personally moved on from perl to haskell, which definitely has fewer ad hoc symbols in its syntax, replacing them by allowing library authors to define new symbols for their own combinators. Nicely this means the mechanism for looking up a sigil is basically the same as looking up any other function, and there is less magic and more regularity to the language.

    This breaks the APL ‘feel’ threshold and when combined with laziness and lambdas for binding constructs allows you to make pretty decent DSLs, but doesn’t quite go to the extremes of lisp/scheme macros in making the syntax be whatever you want — as long as what you want has a lot of ()’s in it. ;)

    That said, perl’s choice wasn’t bad. They chose to mark up user defined variables, which left them open to throw new syntax at the language and remain almost completely backwards compatible. One could envision an alternate universe where a perl was written where user variables remained unmarked but all the keywords picked up sigils instead, and it’d still maintain that desirable property.

  31. Lisp has a two-level syntax. S-Expressions are a data syntax. They encode symbols, numbers, lists, vectors, characters, strings, and a bunch of other data types.

    The second level of Lisp syntax is on top of s-expressions. This is the syntax of the programming language Lisp. Not every s-expression is a valid Lisp program.

    The syntax to define a function is:

    defun function-name lambda-list [[declaration* | documentation]] form*

    The syntax to define a class is:

    defclass class-name ({superclass-name}*) ({slot-specifier}*) [[class-option]]

    with a lot of subrules. Since Lisp has special forms and macros, these all implement syntax, ranging from the relatively trivial to the very complex, like the LOOP macro. But these are word-based syntaxes, not character-symbol-based ones.

    I have little doubt that one can learn to easily parse indented word-based programs. Otherwise we would not be able to read the huge amounts of text that we do. But whenever we read a long book, we train our reading. Once you have read a lot of Lisp, it looks like prose. It’s a bit like riding a bicycle: it looks difficult at first, but it is hard to unlearn later.

    As a Lisp programmer you need to learn to read that second level syntax.

    Generally this is not that difficult, since the human vision system and brain are very powerful. With training humans can read mathematical expressions, Chinese characters, Latin, and all kinds of other notations. We can even read through 1000+ page novels without many visual cues.

    After a while, Lisp code like http://www.norvig.com/paip/macsyma.lisp is not hard to read.

  32. Ahh, APL. I stumbled across this long ago, and am still fascinated by the combination of “fight” and “flight” responses it triggers:

    ‘Tis the dream of each programmer,
    Before his life is done,
    To write three lines of APL,
    And make the damn thing run.

  33. ========================

    Fogus: “Lisp” is a family of languages whereas Perl is a language.

    ———

    Perl 6 seeks to facilitate the large family nature that has been realized in Lisp.

    I’ll first cover some familial levels that don’t correspond to what you mean, but which I think are useful to note, then arrive at the Lisp level.

    First, ‘”Perl” is a family of languages’ is a quote from perl.org/about. I think this is mostly intended and interpreted as Perl 5 and Perl 6 being siblings but it still needs to be said.

    Second, Perl 6 is a spec and there are multiple implementations. Again, not what you meant.

    Third, in its leading implementation (Rakudo) Perl 6 has spawned a child language called NQP. NQP is basically a bundling of the object metamodel (CLOSish), a regex/grammar/parser engine, a small lang that is a subset of the full Perl 6 language, and some libraries. NQP is aimed at building compilers, most notably itself (NQP is written in NQP) and a Perl 6 compiler.

    Still not what you meant.

    Next, the Perl 6 language design is structured as a collection of “slangs” (think of either a “sublanguage” or a variant of a language). Slangs are composed, one for strings, another for regexes, etc., into an overall main language slang. (Slangs are built from grammars, which are classes in disguise, so they are inheritable, delegatable, etc.) This is still not what you meant, but maybe you can see where this is headed.

    Slangs and several other features are designed to enable Perl 6 to become Perl 7 or My Perl, and indeed to enable any lang, from DSLs to other computer languages, to be expressed in Perl 6 and inlined as desired.

    So, Perl 6 is not yet a family of langs at the level you meant, but (NQP and) Perl 6 is intended to facilitate the relatively rapid emergence of a large lang ecosystem if/when Perl 6 itself is sufficiently complete and performant.

    ========================

    Fogus: processing and transforming a Lisp program via macros is done with the exact same tools (e.g. car, cdr, cons, etc.) found in the language itself.

    ———

    Ditto for Perl 6.

    Unfortunately the spec is extremely terse and won’t make much sense if you don’t know Perl 6. Worse, both the Rakudo implementation and the informal user doc are incomplete/immature. Maybe this isn’t actually worse than nothing:
    https://gist.github.com/masak/abd360db840a0719da2f

    ========================

    Jack Orenstein: Perl, like just about every other mainstream language, deals with collections of data items one at a time.

    ———

    The Perl 6 spec has much to say about dealing with composite data structures at a high level; the implementations a lot less so.

    See http://rosettacode.org/wiki/Set#Perl_6 for some basic set operations that have been implemented.

    ========================

    John: The biggest thing that makes regular expressions difficult is the small, overloaded set of symbols. Using a bigger palette of symbols and representing distinct ideas with distinct symbols would help.

    ———

    Perl 6 reinvents regexes as Grammars. This is a fundamental change. Here is a regex (alternatively, a collection of regexes): https://github.com/moritz/json/blob/master/lib/JSON/Tiny/Grammar.pm

    ========================

    anonymous: the implementation [of Unicode in code] is just vomit.

    ———

    See http://rosettacode.org/wiki/Unicode_variable_names#Perl_6 for some examples.

    I have long been rooting for Larry to more carefully consider dropping the prefix ‘\’ that is required when introducing “sigilless” terms. See the third alternate here: http://rosettacode.org/wiki/Ackermann_function#Perl_6 for what I’m talking about.

    ========================

    Almost no one is using Perl 6 yet. Its leading implementation, Rakudo, is still immature. I am thinking this will begin to change next year or maybe 2015, some time after Larry Wall (creator of Perl) publishes his first Perl 6 book.

  34. There is a language with less syntax than Lisp. Forth literally has no syntax. You “parse” by reading tokens (including numbers) from left to right. It also has very interesting “macro” capabilities (called “defining words” in its lexicon).

  35. @Shimshon Indeed FORTH is a great starting point for building up domain-specific languages from pretty-much nothing.

    We can say that LISP has two syntactic devices: juxtaposition for building lists and a Dyck-like syntax for recursing. FORTH only uses juxtaposition for building lists; it achieves recursion using its stack semantics. This is similar to how we can implement recursive algorithms using loops: implement our own call stack using a list-like data structure.

    There’s an interesting language called Joy which closely resembles FORTH syntactically and semantically, but is actually a purely-functional language. It supports quotation using Dyck-like syntax, so a Joy parser would actually look more like a LISP parser than it would a FORTH parser.

  36. one thing few people ever mention is Mathematica.

    It has syntax that is more regular than lisp, yet its syntax variation is richer than perl’s in theory and practice.

    lisp syntax isn’t 100% regular, e.g. quote ‘(1 2 3), dot for cons (a . b), semicolon comments and #|comment|#, and evaluation/macro directives using backtick, comma, and “vector” as […].

    Mathematica is more regular, all of the form f[].

    yet it’s got a SYSTEMATIC transformation layer. For example, Plus[a,b] is the same as a+b, Map[a,b] is the same as a/@b, List[…] is the same as {…}, etc. The prefix operator is @, postfix is //, infix is ~Name~. So the syntax variation is actually more colorful than Perl’s, yet systematic.

  37. Ignoring the misnomer of “typographical monotony”, I believe it is a mistake to conflate that with a lack of tooling, as even the most cursory text editor supports syntax highlighting. (Rob Pike is an outlier in his criticism of syntax highlighting.) But the regular syntax, through ease of parsing, is precisely what enables the level of tooling that Lisp has. And using that, one could add as much differentiation through typography in the editor.
