I think (shameless plug) that my FSet functional collections library helps modernize CL quite a bit more than this does. I've had a couple of people tell me that FSet has changed the way they program. That's a high compliment.
I'd be the first to admit that FSet takes some getting used to, but if you're willing to put in the work to learn to think this way, there are substantial benefits. It greatly expands one's opportunities to write CL code in a functional style. (Those already familiar with the functional style will find it fairly natural.)
One of the annoyances of CL is the absolutely nonsensical function names. Just looking at these examples, I have no idea what princ (something to do with print, I assume), getf, or elt (element?) mean.
What does tar mean? ls? df? gunzip (something with guns?)?
Every language that gets old enough accumulates some naming problems. And if you look at newer languages, naming is still the same old problem there. Clojure: what does fnext do differently from nnext? Can you guess what rseq does?
With larger software systems, you'll find that you need to look up documentation quite often. On my old Lisp Machine there are 60000+ symbols naming tens of thousands of functions, so naming becomes important. When Lisp was small, there were a few functions that printed something; then variations were added, and people tried to keep the names short. After a while there were so many functions, and memory had grown large enough, that people started to use longer names. CALL-WITH-CURRENT-CONTINUATION (in Scheme) or SET-DISPATCH-MACRO-CHARACTER were the results. ;-) Now we have nice descriptive names, but we need completion during input.
There is no way around learning a certain base vocabulary. Since the printer is essential in Lisp, you'll need to learn functions like princ or print to read older code. In your own code you can just use WRITE, which is the general interface to the printer.
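For reference, a quick sketch of how those printer entry points relate. WRITE's keyword arguments subsume the behavior of the short-named functions:

```lisp
;; PRINC prints for humans (no escape characters); PRIN1 prints
;; READably; WRITE is the general interface whose keyword
;; arguments cover both behaviors.
(princ "hello")              ; prints: hello
(prin1 "hello")              ; prints: "hello"
(write "hello" :escape nil)  ; behaves like PRINC
(write "hello" :escape t)    ; behaves like PRIN1
```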
Fortunately enough, the documentation of these functions is just a keypress away in any good Lisp implementation.
>What does tar mean? ls? df? gunzip (something with guns?)?
How are MORE bad examples an argument against what he says? If anything they reinforce his statement.
Plus, the thing with "tar" and "ls" is that they've been tar and ls in all Unices for decades. If you learn "ls", and maybe "dir" if you want to use Windows, you're golden. Whereas every language seems to use its own names for "princ", "setf" etc, both before and after CL.
By that logic, all the other languages are wrong, since CL predates them by decades. Getf has been the same in CL since, what, 1952 (much older than Unix).
I find that mildly annoying as well, though since I came to Lisp after Unix/C, it didn't make a huge impression on me. It's possibly more grating in CL just because it doesn't fit the rest of the aesthetic: in user-written code it's idiomatic to write functions with names like sort-database, not like srtdb. While with Unix + C you don't really expect utilities or functions to have anything but cryptic names. I think some of it may relate to Lisp coalescing over many years: some of those old cryptically named functions were named in the 1950s or 1960s, when every language of the time, from ALGOL to FORTRAN to LISP (not to mention assemblers), used identifiers like that— in all caps, of course.
I assume that historically it's because they were trying to keep function names short to help save memory; don't forget Lisp in one form or another has been around for a loooong time.
A lot of the time there are more descriptive alternatives: car -> first, cdr -> rest, elt -> nth, etc... However, these are also complained about by a lot of people as unnecessarily bloating the language! (plus, elt/nth manage to invert their argument order, doh...)
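For anyone who hasn't hit it, the argument-order inversion looks like this:

```lisp
;; NTH takes the index first; ELT takes the sequence first.
(nth 1 '(a b c))   ; => B
(elt '(a b c) 1)   ; => B
;; ELT is also generic over sequences, while NTH is lists-only:
(elt "abc" 1)      ; => #\b
```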
So it's kind of a no-win situation really. But I like this attempt to provide a common substrate and hope that it will succeed where others have failed.
In your own code, you can use FIRST and REST for lists.
But if you want to program you need to deal with that. Stuff was there before you and has a history. Changing things has a benefit, but also a cost. For a small community like Lisp, constantly rewriting code because some names change is not such a good idea.
Given that we can remember thousands of words in natural languages, a few hundred core words of a programming language is not such a huge hurdle.
Another issue is old books. I recently got value out of PAIP and On Lisp, and I still see people recommend A Gentle Introduction to Symbolic Computation. Stepping beyond CL for a second, car and cdr have even deeper roots; for example if someone is introduced to the wider lisp-like world through SICP they will also have to become comfortable with car and cdr.
That car and cdr can be aliased easily by any Lisp programmer might lead a person to wonder why they survive and are even included in more modern Lisps such as Racket.
The reason is the expressive power of their extended versions - e.g. caadr or cdaar don't have easily derived equivalents from first and rest, and those that can be derived are at best equally bad or worse... ffirest and reffst, anyone?
Nothing, and Common Lisp includes second. But that's not caar, which returns the first element of the first element of a list - the extended forms of car and cdr are for nested lists, and nested lists can be used to implement many different data structures [and are also the data structure containing Lisp programs].
The extended forms of car and cdr provide a form of expression which is not obvious based on the construction of common non-Lisp languages even those that have very flexible lists and dynamic typing.
The problem is, it really is "rest" instead of "second", at least in the usual case. Yes, a cons cell can contain pretty much any two things, but the most-frequently-used case (or so I believe) is that of a list. In that case, "car" means "first element of the list", and "cdr" means "the entire rest of the list", not "the second element of the list".
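A small sketch of how the composed forms read against a nested list:

```lisp
(defparameter *tree* '((1 2) (3 4)))

(car *tree*)    ; => (1 2)   first element
(cdr *tree*)    ; => ((3 4)) the entire rest of the list
(caar *tree*)   ; => 1       (car (car ...))
(cadr *tree*)   ; => (3 4)   second element, i.e. (car (cdr ...))
(caadr *tree*)  ; => 3       (car (car (cdr ...)))
```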
Personally I have found it useful to keep in mind that it's the pair that's fundamental and not the list. IIRC it was pg that made the point somewhere that car and cdr are acceptable because there really aren't any slam-dunk general terms for the parts of a pair.
True, and if you're thinking of them as a pair, then "first" and "second" are appropriate. But if you're using them to implement a list, then "second" is misleading.
The thing to realize about Common Lisp is that it's not built to accommodate new users in the manner that Scheme was. Common Lisp is entirely intended to be a language for working professional programmers. Implicit in its specification is the idea that any feature a programmer does not like can be changed; renaming functions is largely a trivial exercise.
Although this project renames functions, that isn't all it does. Accommodating new users is very necessary for the adoption of any language, library or project, especially by having reasonable defaults. All working professional programmers were once beginners. Of course, whether this project succeeds at any of these is another argument.
I wasn't criticizing the project, just responding to comments.
And I appear to have said "new users" when I really meant "beginners".
Experienced programmers as new users will stick if the language is appropriately expressive. Beginners will stick for different reasons, and the only people who really seem to have a handle on beginner programmers from a professional standpoint are the PLT group of Racketeers. Pretty much everything else I see about beginning programmers is based on anecdote and personal opinion, not the quality of data Felleisen has collected.
In all seriousness, I hope you're not suggesting Common Lisp, never mind a non-standard version of it, as a beginning programming language in the typical case when much more suitable Lisps are available and under active development.
As a beginning language Scheme and especially Racket are better designed languages. There's nothing of significance in Common Lisp that's better for a beginner and much in Scheme and Racket that is.
If I was going to move away from Scheme and Racket for vocational reasons but stay within the Lisp family I'd give strong consideration to eLisp since it touches on toolmaking. Clojure would be a second choice because it touches on imperative programming and library use.
I learned Scheme first. I still find that experience great.
But I then moved to Common Lisp, and I think that learning Common Lisp from the beginning is much better. Common Lisp is much better suited to writing software than Scheme or even Racket. A good Common Lisp implementation has much better tools.
At the university we had a site license for Allegro CL from Franz, Inc. Every student had access to it via the SUN cluster. That was a revelation to me. Its tools were so much better, both to use and for learning.
Common Lisp is better than Scheme or Racket in the same sorts of ways that Clojure is better. It's a full-on professional tool. But it doesn't have Felleisen's Student Languages or the HtDP/universe teachpacks out of the box to facilitate teaching like Racket.
I've never explored Franz/Allegro because CCL and SBCL etc. carry less baggage because of their FOSS pedigree...it's a bias more against demo/evaluation versions than closed source.
I doubt that either Racket or Clojure is comparable. Racket is not a professional tool - it's an educational tool. Clojure is a more or less thin layer over Java.
Yeah, you're right. A lot of people are turned off by it, and a lot of the old guard just say "deal with it!!" After getting used to a lot of the symbols (`car`, `cadr`, etc.) you kind of stop being too annoyed and don't even notice them anymore. It's like learning another language; there's not always a 1:1 mapping of words. That said, it's much better to recognize a problem and try to fix it than to say "It has worked fine for 50 years!! Leave it!"
That definitely seems to be one of the goals of this project: to give CL a face that makes a bit more sense.
At least with CL there's a certain logic to it. getf => get form, setf => set form, setq => set quote, etcetera. All languages are guilty of this—I used to be bothered by Python's len() and Ruby's .uniq() and so on and so forth, but you get used to it after a while. Except for PHP. You never get used to PHP...
The functions are basically inconsistent in both name and argument order. I think that's one of the problems they're working on correcting (a generic elt/getf), and more obvious ways of creating and using non-list data structures.
I think the hash table syntax is a huge step in the right direction. Almost every language in modern use has hash table literal syntax. In Lisp, I have to build the table up by hand.
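The standard idiom being referred to is presumably along these lines, one SETF per entry:

```lisp
;; The verbose standard way to build a small hash table:
;; create it, then populate it one (SETF GETHASH) at a time.
(let ((h (make-hash-table :test 'equal)))
  (setf (gethash "name" h) "andrew")
  (setf (gethash "location" h) "sf")
  h)
```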
I think this comes down to some historical baggage around the original idea that people would just use association lists ("alists") by default, while hashtables would be an advanced feature used mainly in the optimization phase, when profiling indicated that alist lookups were a bottleneck. The alist literal is a lot friendlier:
'((name . "andrew") (location . "sf"))
(Incidentally, you probably don't really want 'name and 'location to be strings.)
It's curious that no quasi-standard reader macro for hash literals developed, though.
I think another factor is that while the primitives for modifying readtables made it into the standard, an interface for using them in a way that limits the changes to the code you own (and not, say, additional libraries you load) is something that the users had to come up with.
It's not a lot of code or too complex to do that, but it's not obvious either and some people got it wrong or do it differently and I think that made reader modifications less common than they otherwise might have been.
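To make that concrete, here is one common shape of the not-obvious setup: install the macro characters in a copy of the readtable, so the change stays out of code you don't own. The `{...}` literal and the helper names here are illustrative, not cl21's actual implementation:

```lisp
;; Reader function for a {key val ...} hash literal (illustrative).
(defun read-hash-literal (stream char)
  (declare (ignore char))
  (let ((pairs (read-delimited-list #\} stream t)))
    ;; Expand into code that builds the table at runtime.
    `(let ((h (make-hash-table :test 'equal)))
       (loop for (k v) on (list ,@pairs) by #'cddr
             do (setf (gethash k h) v))
       h)))

;; Install in a COPY of the readtable, not the global one,
;; so libraries loaded later see the standard syntax.
(defparameter *hash-readtable* (copy-readtable))
(set-macro-character #\{ #'read-hash-literal nil *hash-readtable*)
;; Make } a terminating character, reusing the standard ) reader.
(set-macro-character #\} (get-macro-character #\)) nil *hash-readtable*)
```

Code read while `*readtable*` is bound to `*hash-readtable*` can then write `{"name" "andrew" "location" "sf"}`; everything else is unaffected.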
I haven't looked too closely at the details but some of cl21's changes look like they might try to address this issue.
One could make the argument (I think so, at least) that if the number of associations you are dealing with is small enough to write out literally, then using a hash table is a mistake. It might be one of Common Lisp's strengths that it has ways to represent small tables efficiently and thus avoids the "hash addiction" that haunts languages where hash tables are used for everything.
But even if a more compact way of constructing hash tables is called for, why a literal syntax instead of a more compact constructor? E.g. you could write
(dict "name" "andrew" "location" "sf")
with a compiler macro that produces exactly the above code.
Which seems, to me, easier than having more special-case syntax to learn.
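A sketch of what that could look like. The DICT name and its EQUAL-test default are just this comment's example, not a standard function:

```lisp
;; A plain function, usable with APPLY and runtime argument lists:
(defun dict (&rest keys-and-values)
  (let ((h (make-hash-table :test 'equal)))
    (loop for (k v) on keys-and-values by #'cddr
          do (setf (gethash k h) v))
    h))

;; A compiler macro that open-codes literal call sites into direct
;; MAKE-HASH-TABLE/SETF forms, avoiding the &rest list at runtime:
(define-compiler-macro dict (&rest keys-and-values)
  (let ((h (gensym "HASH")))
    `(let ((,h (make-hash-table :test 'equal)))
       ,@(loop for (k v) on keys-and-values by #'cddr
               collect `(setf (gethash ,k ,h) ,v))
       ,h)))

;; Usage: (dict "name" "andrew" "location" "sf")
```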
But I suspect this may be one of the drawbacks of Lisp (indeed, it may turn out to be a problem with any powerful programming language). It's so absurdly easy to do things like that that incremental advances can be held back by the steps along the way not being shared. And then when someone comes along to use what you've written, without having been there for those steps, the base level of abstraction they have to build up from is too low.
Yep! It's necessary to modify slime (or Light Table? I've been interested in making a CL plugin for it) to take advantage of this syntax, but the absence of sane reader macro support in the spec has always driven me nuts.
I've always thought that CL needed a set of common libraries to be strongly recommended. The problem is that it has been tried before, but never really taken off.
However, given the recent rise of quicklisp for managing libraries, it's entirely possible that a project of this type will be much more successful. I hope so!
I still remember the first time I looked at the Alexandria library and realised I'd already implemented a good 1/4 of the functionality myself, just because it was glaringly missing from the CL core spec.
> I still remember the first time I looked at the Alexandria library and realised I'd already implemented a good 1/4 of the functionality myself
Yeah, same here. I have a bunch of utilities lying around from when I started that Alexandria would have completely obliterated.
I'm interested in cl21's use of symbol partitioning. I've always thought the `cl:` package was really, really bloated and somewhat confusing to newcomers. Something that divides everything into "this is for math" "this is for primitive data types" etc would make things a bit easier to manage.
I know /r/lisp didn't really like it for the most part, but since the spec is frozen it's nice to see people who are involved in lisp actively trying to make it better. The world is littered with successful projects that started out with people saying "You're wrong, don't bother doing this." I wish Mr. Fukamachi well and look forward to progress.
It's general purpose. It's a high-level language that has OS-level threading, can be functional/imperative, compiles to machine code, has powerful macros which completely cut down code repetition and allow syntax expansion, and can call out to C without compiling wrappers. About the only thing it's missing is coroutines, and even those can be mimicked by macros to some extent. Really, a better question is what can't Lisp do. It's probably not the best choice for an embedded device or anywhere you need tight control over memory/resources. I'd say it's an extremely decent complement to C in a lot of respects. If you can get past some of the oddly-named symbols, it's a great language that has made leaps and bounds in the past 10 years as far as third-party libraries and implementation features.
Well, it's amazing for rapid prototyping. The same could be said for Ruby/Perl etc but the addition of the REPL, CLOS, Macros, etc make it slightly better. As long as the libraries exist for CL upon which to build, which can be a sticking point.
To give the more canonical answer, though... CL is the programmable programming language, so it's not suited for a task per se; rather you lay down a base that's suited to your problem domain, and then you solve your problem in your newly-created "language". It's hard to explain but true if you get it right.
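A toy example of what "laying down a base" can mean in practice. The macro and its name are hypothetical, just to show the shape of the idea:

```lisp
;; Suppose the problem domain involves flaky operations that should
;; be retried. A "language" for that is one macro away:
(defmacro with-retries ((n) &body body)
  "Evaluate BODY up to N times, returning its value on the first
success; return NIL if every attempt signals an error."
  `(loop repeat ,n
         do (handler-case (return (progn ,@body))
              (error () nil))))

;; Usage (FRAGILE-NETWORK-CALL is a hypothetical domain function):
;; (with-retries (3) (fragile-network-call))
```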
Damn right, I love how clojure hosted semantics import all the great features of their host. Like how clojurescript has only floating point numbers and coerces string to numbers when using the + operator.
In all seriousness, I madly love Clojure, but it is a different language with different sensibilities/tradeoffs, not a "better" CL. For example, I imagine that interfacing with C code using cffi is way easier and simpler than using JNI and then exposing that to Clojure. Or having defined semantics for numerical computations.
http://www.reddit.com/r/lisp/comments/1vtueu/cl21_common_lis...