This old argument. I won't say a thing about Lisp. Let's talk about Ruby. For those of you who were using Ruby in 2005 when Rails was released: are there any of these arguments that you haven't heard at least once in the last eight years about Ruby? That it's too powerful, that programmers make their own idiosyncratic worlds, that programs are too complicated to understand, &c. &c.
Yet, somehow, there is tons of working Ruby code in production, and people have found a way to harness its power. People are awfully clever; I very much doubt that any serious language (i.e. a language not designed to confuse) can't be whipped into shape by a team of reasonable people who are motivated to succeed.
The "failure" of Lisp has a very simple explanation, so simple that people gloss over it. Every successful language began as the scripting language of some system that was itself exploding in popularity.
To be a hit, a language must arrive in the mass market's consciousness in tandem with its "platform." Objective-C failed until the iPhone was a hit. C was a hit with Unix and then Windows. Ruby was a hit when Rails was a hit.
Lisp was the scripting language of Lisp machines. When the "AI Winter" hit Lisp machines, it hit Lisp. It never became the scripting language of something else that was exploding in popularity.
Today JavaScript is the scripting language of browsers, so it is a hit, and it doesn't matter whether its first-class functions are elegant (they are), whether its semantics have stupid corner cases (they do), or whether its inheritance model is obscure (prototypes!?).
Just show me something exploding in popularity, and I'll show you a successful programming language.
UPDATE:
I see folks are arguing about what colour to paint the phrase "Scripting Language." Must a scripting language be interpreted? Is Java interpreted? What about JIT techniques for JavaScript? When we transpile CoffeeScript to JavaScript, what is going on? I have a CPU that executes microcode. Is ASM for this CPU really an interpreted language?
All good stuff. By the way, Lisp was originally interpreted, compilers didn't come along until later. So... Was it a scripting language until it was "compiled?"
Have fun, but please remember to leave room in the shed for me to park my tandem.
COBOL, Fortran, C++, Ada... were the scripting languages of what exactly?
If I were to stay within your argument, then Lisp was the language of symbolic AI: the language in which things like symbolic mathematics (Macsyma, Reduce, ...), planners, theorem provers, natural language understanding, reasoning systems, expert systems, and knowledge-based systems (Cyc, ...) were explored, and in which the first attempts at commercialization were made (mid 80s). Symbolic AI funding went away with the end of the Cold War...
Lisp was actually invented and designed for this. John McCarthy designed Lisp as a vehicle for (his) AI research.
There might still be a lot of Lisp users. Even among AI researchers (and related domains) there are many using Lisp tools which you have probably never heard of, like ACL2 (a theorem prover), PVS (a theorem prover used at NASA), and ACT-R (a cognitive architecture with more publications written about it than you could ever read). The usage of Lisp may even be constant in these domains. There was an explosion of programming in general; outside these domains, Lisp never really participated in this explosion (with a few exceptions like Lisp in Autocad, Lisp in Emacs, ...).
But should it? Lisp's almost unbounded flexibility is obviously not needed by the average programmer. It is actually something that makes a bad programmer worse.
'Lisp Machines' were just one development/deployment platform: 10,000 sold in a decade. Lisp was used before Lisp Machines and after Lisp Machines.
> It never became the scripting language of something else that was exploding in popularity.
Autocad sold millions of copies and is still widely used today. Several Autocad 'clones' support the use of Lisp for CAD scripting.
"COBOL, Fortran, C++, Ada... were the scripting languages of what exactly?"
Isn't Ada the scripting language of the U.S. Department of Defense? As far as I know, most real world Ada usage is in DoD funded projects.
C++ won the battle over Objective-C to become C's successor for many applications, so fills a similar ecological niche.
First sentence from History section of Fortran Wikipedia page:
"In late 1953, John W. Backus submitted a proposal to his superiors at IBM to develop a more practical alternative to assembly language for programming their IBM 704 mainframe computer."
So Fortran was the scripting language of the IBM 704.
COBOL's Wikipedia page indicates it was championed by a committee of all the major computer industry players of its day, making it the Java of its time. So COBOL and Java may be exceptions to this "scripting language of..." rule, and instances of the "successful language created by a committee" rule. (Common Lisp, of course, being an example of an "unsuccessful language created by a committee".)
I think you misunderstand scripting here. Scripting languages are interpreted, and can be quite a bit simpler to implement and code in than compiled languages.
The "scripting language of" argument is normally presented with the first example of C and UNIX - scripting is used here to describe the purpose it's being used for, not the nature of the language itself.
COBOL and Fortran exploded with the IBM mainframe and the AS/400 midrange computing platform. Both platforms still exist in modern versions under different names now (IBM has changed their names so many times it's ridiculous). LOTS of legacy COBOL still has our financial and insurance markets working smoothly.
> LOTS of legacy COBOL still has our financial and insurance markets working smoothly.
Is there any proof for such a statement? Has anyone seen business applications written in COBOL running on anything other than maybe a few mainframes in a few US agencies/corporations that haven't been replaced simply because they aren't broken (yet!)?
I know there's tons of Fortran and some PL/I code that will never be translated into any other language anytime soon because it's just good the way it is, but COBOL? I'm starting to think that this whole "COBOL will still be around in 100 years" is an urban myth...
If you spend two minutes doing a Google search for businesses still using COBOL, you'll find your answer. My statement is based on professional experience in the IBM midrange field, talking with people who still work on these systems. Use this link as a jumping-off point: http://www.guardian.co.uk/technology/2009/apr/09/cobol-inter... (not much has changed in this space in the last 4 years; this space moves extremely slowly when it comes to change at the systems level)
If you use a charge/credit card, are an electric customer, had a prescription filled, or had an insurance claim adjudicated, your transaction was processed through some COBOL code.
Yes, corporate system overlords would like to put these systems to rest, but that's a task easier said than done, with a great deal of business intelligence buried in the code.
I've always been curious about McCarthy's motivations and whether he was informed by Church's work on the lambda calculus. I heard he was schooled in functional analysis, and that symbolic differentiation and operator theory were what drove the invention of Lisp; AI came later. I'm sure I could research it, but I'm too lazy :)
McCarthy says in his 1960 paper about Lisp just in the first paragraph of the introduction:
> A programming system called LISP (for LISt Processor) has been developed for the IBM 704 computer by the Artificial Intelligence group at M.I.T. The system was designed to facilitate experiments with a proposed system called the Advice Taker, whereby a machine could be instructed to handle declarative as well as imperative sentences and could exhibit ``common sense'' in carrying out its instructions. The original proposal [1] for the Advice Taker was made in November 1958. The main requirement was a programming system for manipulating expressions representing formalized declarative and imperative sentences so that the Advice Taker system could make deductions.
So it was developed by an AI group for a piece of AI software (the Advice Taker).
well sure, McCarthy worked on that when at MIT, right? That AI group was also doing a computer algebra system, so symbolic differentiation was a current problem.
He is credited with the invention of Lisp, though, and had a lot to do with Algol. I'm just curious when the connections between Lisp and lambda calculus came about. Did it come later with the Scheme work and T at Yale, or was it there also in the earlier work on Lisp?
I asked a former colleague about this, Jim Griesmer (who worked on getting the 704 lisp running on the 360 machines), and it was he who mentioned McCarthy's background in analysis.
EDIT: ha!, a little googling reveals[1] this story, not very definitive but interesting nonetheless. I should add that Jim Griesmer also told me that Minsky wrote his thesis on turing tapes or some such thing, because his advisors thought AI wasn't yet a developed field.
"To use functions as arguments, one needs a notation for functions, and it seemed natural to use the -notation of Church (1941). I didn't understand the rest of his book, so I wasn't tempted to try to implement his more general mechanism for defining functions. Church used higher order functionals instead of using conditional expressions. Conditional expressions are much more readily implemented on computers."
- History of Lisp: http://goo.gl/EDuhl
I've always liked the fact that he did so much with a book he didn't understand the whole of. :-)
I'm sympathetic to raganwald's overall argument, but "scripting language" is pretty meaningless if you call Java a scripting language, let alone others in this thread calling Ada and Fortran scripting languages.
I follow Erik Meijer's view on programming language success [1]: in real life, programming language success is based on the perceived crisis divided by the perceived pain of adoption. Successful languages such as JavaScript, C, Java, or Ruby introduced solutions to real problems, and they have relatively low perceived pain of adoption. On the other hand, Lisp has a high perceived pain of adoption and there's nothing specific that it solves for most users today, so there's little popularity.
An unrelated thought experiment on the importance of languages vs libraries: for most programming problems, would you be better off with an old language and today's libraries (e.g. Fortran 77 somehow using the .NET or Python libraries), or a modern language without modern libraries? I argue that you could solve real-world problems fairly efficiently with the former, but would be pretty stuck with the latter.
And an unrelated thought on languages and Turing completeness: has anyone considered "OS completeness" - how close you could come to writing an OS in the language? Obviously you need a bit of chip-specific assembly language, but C gets you very very close. Something like Java needs a larger assembly language layer [see jnode].
JavaScript was the only cross-browser scripting language; that's what made it a success. It took almost a decade before Crockford turned up to say 'here's how you write JavaScript so it's not a complete PITA'. Would JavaScript shrivel and die if all the browsers released Python support tomorrow? Probably not any more, but 6 years ago, most definitely.
Same thing with Ruby, without Rails would it be anywhere now?
So half your examples actually show there's something very different going on when trying to explain why languages become popular.
What languages have truly emerged without a leg up? Python? PHP as the server-side language? C# emerging ahead of VB.Net? Scheme?
I think Ruby's success is an example that the era of "the language of the platform" is gone: there was no OS or app to use Ruby as its scripting language, but there was more than one language to choose from for building a web framework, and DHH happened to choose Ruby. Same with Python in the scientific computing field. Or (unfortunately) PHP in the web world.
We have multi-language VMs (the JVM obviously, but the CLR too...) and multi-platform languages galore. The era of the platform language is gone, and it's a free-for-all language war where features and libraries are all that matters. I'm sure Lisps will win more ground in one form or another, though something like Scala will probably "rule the world" :)
I like that explanation. It aligns well with my belief that language syntax/features/etc really don't matter that much in the grand scheme of things.
It's unfortunate, because I know people who have not chosen Lisp as their scripting language because they want their product to explode in popularity, and they fear Lisp would actually hinder that.
All that said, with Clojure I think it's obvious Lisp hasn't really failed at all. Maybe it's better as a language just outside of the mainstream.
Oh I completely agree; I'm glad you said that so others don't think it's true. Some people do think they care though, especially for tools targeted at developers.
I knew a game company in Montreal that was building a game development framework in Gambit Scheme. It was awesome, and their designers found the s-expression configuration format incredibly intuitive.
The core of the argument has nothing to do with popularity. You're right about everything you say (though you glossed over notable exceptions like Java or C++), but we're discussing "power", not popularity.
Unfortunately, power has a very positive connotation. As a programmer, wouldn't you want to use a more "powerful" tool? However, it's different when it comes to programming languages. Too much power can hurt you. Realising that there can be such a thing as "too much power" is new to me, or at the very least it's counter-intuitive enough to deserve being debated. That's the point of, or at least what I take away from, this page.
Arguably, the success of Java comes from its complete lack of Lisp-y power. Python may be "powerful", but it makes a big deal of enforcing the "one way to do it" culture and has always had a love/hate relationship with functional features for this exact reason. Speaking from experience, I find it easier to read and understand Java or Python than I do Lisp or Lua (another "powerful" language where every project will inevitably implement its own object model, basically rebuilding Python or Ruby).
So yes, I believe that there's a spectrum of "power" between programming languages, and more "power" isn't necessarily better. It's a characteristic, nothing more.
But more importantly, I wish we could start discussing and comparing programming languages without talking about something as meaningless and volatile as popularity. I don't care if my favorite language becomes the most popular. I don't want to unify all programming languages into one. I am happy when different languages solve the same problems differently, more or less elegantly, and I have so much fun learning all these approaches. I don't care what "succeeded" and what "worked".
I sincerely mean no offense, but I'm tired of having people cut the debate short by turning any meaningful comparison into a popularity contest.
Ok, I'll bite. How would we go about empirically measuring whether "too much power" is harmful?
Do we give 100 teams a task to accomplish in Lisp, but tell 50 of them that they can't write their own macros? Or tell 50 of them that they can't use continuations?
If continuations correlated negatively with success, how would we know whether continuations are "too powerful" or merely "poorly designed"? (There is a lot of talk these days about whether continuations are a "Turing Tar Pit" and whether a better mechanism needs to be invented.)
Or perhaps our 100 programmers are merely unfamiliar with them, and as a result don't use the features well?
My feeling when I consider thought experiments like this is that it is very difficult to define expressions like "too powerful" with precision. This imprecision leads to the effect where we twist our explanations to suit our biases.
So if we think Lisp is "too powerful" and that it failed in some sense, the lack of precision around "power" and "failure" allows us to draw whatever conclusion we like about the relationship between the terms.
You make very good points. It's obviously very difficult to quantify the differences between programming languages. And yes, I agree, it leads to imprecision which in turn leads to biased results.
I still think it shouldn't stop us from doing the exercise anyway. We might draw "whatever conclusion we like" about it, but it's still a way to move forward. The wiki page in the original link is not trying to convince the reader of anything (or at least the first few posts aren't). It's a user sharing their insight on their experience with Lisp. It resonated with mine. Reading this helps me put words on something I've been feeling for a while. We might both be wrong, but at least it helps me in some way.
What I'm trying to say is that while languages may be equivalent (Turing!), they're not exactly equal. Our inability to measure the differences with precision should not prevent us from comparing experiences. I am not interested in adopting the manager's point of view of "which language would be best for my project/team?"; I'd rather take the young pupil's point of view of "let's discover how these guys do it". After all, I'm younger (in programming years) than many people around here.
Your original comment seemed to imply that a language's worth is tied to the popularity of the platform running it. I personally find this aspect the least interesting in programming languages. Maybe because I'm not an entrepreneur and I don't necessarily feel like one.
Take languages like Ada. They were designed so that individual code pieces are understandable from reading them without further external context. The DoD had requirements for this.
On the opposite side is something like Lisp. You look at a simple expression inside some piece of code. Basically it can do arbitrary things you don't see from looking at the expression. You need to understand static and dynamic contexts. You can completely alter the meaning of every single expression on many levels (from macros to the Meta Object Protocol). In the Java world you would need a preprocessor for that; Lisp has this built in. Thus one can program in a language in a language in a language. Debugging and program maintenance become a nightmare without total self-discipline.
One can develop on a level without much of the self-modification, but there actually exists quite a lot of software which makes use of advanced Lisp features.
One interesting argument in favour of Haskell is exactly that it takes the right powers away from you: the powers that allow parts of your programme to screw with other parts, i.e. side-effects.
How should we measure "power", though? I mean, Turing completeness is Turing completeness. You can do in one what you can do in another. So is power conciseness, then? Is it clarity? Is it maintainability? All/None of the above?
One possibility is to use Felleisen's notion of expressiveness: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.51.4.... The gist is that Turing completeness concerns what you can do with whole programs, but expressiveness concerns what you can do in smaller scopes.
A scripting language is a language to extend applications. These applications tend to be written in languages like C, Pascal (think Delphi), C++, and others. Like a CAD system written in C or C++ with an extension language added on top. The interpretation is only an implementation detail. For example, PTC's modeller is written in C++ with several million lines of Common Lisp on top.
Lisp compilers have existed from day 2 of its history. The first self-compiling Lisp compiler was written in 1962, IIRC. Maclisp was mostly compiled, Lisp Machine Lisp was compiled, and Common Lisp was designed for compilation from day one.
You're conflating "success" with "popularity". Lisps weren't invented to become "popular", they were invented to get something done in a way that other languages couldn't easily do at the time. One only "fails" if one cannot accomplish its purpose. Different languages have different purposes, so "success" as a measure to compare them is inane... unless the purpose of all programming languages should be to be the most popular programming language?
I'm speaking to the wiki page. If this is a discussion about whether Lisp was too powerful to successfully implement Expert Systems, or Yahoo! Stores, or Reddit, well then, I agree that's a different conversation.
But this is what I thought the page was speaking to.
There are a lot of snobs in the Lisp community. They can't stand the thought of Lisp going mainstream, because their language is too beautiful and perfect. Every language has them, but Lisp seems to have the highest ratio of snob/regular person. I love the language, but the community is not as good as others.
edit:
I'm getting downvoted by the snobs themselves. Hilarious.
This may be true of the Common Lisp community but the Clojure community seems to be very welcoming and I think one of the more exciting communities around today.
As the famous "Brief, Incomplete, and Mostly Wrong History of Programming Languages" [1] essay puts it, Lisp's key techniques are "recursion and condescension".
As a certified smug lisper (you get to be one when another smug lisper calls you one :), I would say that there are many of us who just don't care whether or not Lisp gets more popular, but I don't think I've ever come across anyone who would intentionally drive away new users (except for obvious trolls, who come to #lisp and ask what it is used for, and get angry when we answer "Writing software"; one even called us cts :).
I don't really mind more popularity, but my definition of a mainstream language is "The non-technical boss prefers it, because of reasons completely unrelated to its technical qualities.", in which case I agree with the snobs, we don't need that at all.
But if by mainstream you mean "It moves up from its current position as the 33rd most popular language on github", sure, why not. If by mainstream you mean "more companies start using it for useful software that makes money": http://this-plt-life.tumblr.com/post/36425247242/when-i-hear...
I consider Common Lisp to be a tremendous success: some of the coolest software was written or prototyped in it, including many language implementations. Not to mention what a tremendous achievement it is for a modern Lisp system to be able to run four-decade-old code with little or no modification. Being mainstream was probably never a design goal, but the things listed above are, and that is probably why many lispers simply don't care about being mainstream; the language is not optimized for that at all. I don't think that makes lispers snobs; they've just self-selected for people who can work with less community support. It's like a low-econ game of starcraft :)
But I'm not sure this argument applies in the 21st century. Languages like Clojure and even Rust get significant attention just for being better, and they are not tied to any must-have tool.
> Languages like Clojure and even Rust get significant attention just for being better
They are not getting any attention beyond Hacker News and reddit. They are extremely marginal programming languages, and they both have a very high likelihood of being completely gone and forgotten in the next couple of years.
I wasn't clear - the point is that in the 21st century, you are working in a multi-language paradigm no matter what, tying together disparate services with neutral protocols like HTTP. Nobody knows or cares what language a server is running. So weirder languages are at least worthy of investigation.
EDIT: Also, pg's "Beating the Averages" essay, arguing that non-mainstream language choice can be the foundation of a startup's success, is more or less the foundational text around here.
It's more complicated than that. Languages don't become a hit overnight. It's because the authors carefully chose the features in a way that makes the language pleasant to use, or that makes it particularly well adapted to solving a set of problems. It's a complex ecosystem that also includes package managers, editor support, ...
When you read Ruby you won't find 10 different implementations of conditionals. They're implemented with keywords, and the language doesn't have macros, so unless you want to add more indirection you have to accept Matz's decisions. There is also a common understanding of what is and isn't idiomatic Ruby.
I'm not sure it's the same with Lisp. It seems way too easy to come up with your own smart constructs. It seems to me that this flexibility is what makes Lisp hard to settle as a language. Everyone gets their own say on small details, which prevents the bigger things from emerging.
JavaScript's success is particularly freakish. In the early days of research into prototype-based languages, I don't think anyone imagined that a language with a model so obscure and research-y would become the most ubiquitous programming language in the world.
I like how you put that. Explodingly popular languages owe their success to tandem platforms. It makes much more sense than other explanations.
I'd actually argue that Javascript's object model, given its current usages, is close to ideal - but not perfect. It would be better if the language provided a set of kernel functionalities for you to build whatever higher object model you might need. I have in mind something akin to this: http://www.vpri.org/pdf/tr2006003a_objmod.pdf
For example, ES6 is supposed to bring us some meta-stuff, but that wouldn't be necessary if the core model were more general.
Thank you for your very clear, succinct explanation of how computer language popularity works. There may be exceptions, but they are pretty few and far between.
I think this post should be bookmarked and referenced every time an argument is made using a computer language's "popularity" as a proxy for the quality of that language.
I consider Python to be the scripting language of "everything else". In the apps I use frequently, Python scripting support shows up in Blender, GEdit, and Linux Mint.
The original post is right: Lisp by itself is an incredibly powerful yet incredibly basic language. It's a functional Assembly for declarative DSLs. The most talked about Lisp nowadays is arguably Clojure, and this is probably so because Clojure provides much more built-ins and libraries out-of-the-box than other Lisps, where you'd have to build these things yourself. At the end of the day, we just want to call "solveMyProblems()", commit that 1 function call, and move on to our next project. The lower-level a language is, the further away from that goal it is, and the less reasons there are to use it.
I suspect LISP is too unstructured for its power. It's very hard to scale something to more than one programmer, or to a large size, in a completely free-form dynamic language. It's easy to blow off other people's feet, and totally dynamic languages with many contributors to a project often yield an unmaintainable rats' nest of mixed metaphors and hacks.
Java is one of the most successful languages for massive projects because it imposes a set of common idioms and semantics. Look at Eclipse... an absolute monster but a living, breathing, and pretty darn successful one.
I can't imagine Eclipse in Ruby or JavaScript, let alone LISP... not with the number of contributors it has. It would melt into a puddle of slag not unlike the "corium" that forms at the bottom of a melted-down nuclear reactor.
For those dismissing this as FUD and ignorance: LISP, despite its power, isn't really used for much. I think there has to be a reason for that, and I don't think "everyone is dumber than me" is a valid reason. There are tons of outstanding programmers, and the majority of them do not choose to code real-world stuff in LISP.
I suspect Ruby, which is fairly powerful but not as freeform or succinct as LISP, is more successful because it has more structure around its power. It seems to have enough structure to make large long-lived code-bases possible in the "real world." (But like I said I'm still skeptical that you could write Eclipse in Ruby.)
The question is: would it be possible to build enough structure or tooling around LISP to fix this problem? Could something like Light Table with deeper refactoring and visualization tools deliver a workable LISP environment for modern large applications? I know there were things done in the past during the "80s AI era" that might bear re-examination and resurrection.
You say it like it is essential that we scale to lots of programmers up front.
The problem is that there is an organizational barrier. It is easy to have teams of under 8 programmers. It is doable to scale teams of 20+ programmers. But if you're between those numbers, small team dynamics fall apart horribly, and large team organization leaves you with less productivity than the small team.
For a small company, the cost jump from 8 to 20 programmers is pretty significant. Therefore it is worth considering the strategy of getting the most out of your existing programmers and not trying to cross that chasm. If you succeed, you eventually will have to face the scaling problem. But success tends to take care of itself.
I think GP meant the number of people touching the software over its lifetime, not necessarily at the same time in one location like in a company. E.g. Eclipse must have hundreds if not thousands of contributors, loosely networked and hardly ever seeing each other, contributing over a long period of time. Yet they are able to understand each other's code and build on top of each other's work. The readability of a language is a must to foster this kind of cooperation. Java is a very readable language, and the knowledge in the code can be more easily shared among developers.
>(But like I said I'm still skeptical that you could write Eclipse in Ruby.)
Why would anyone ever want to write Eclipse in Ruby? I have no doubt that such a feat is possible, but... why? If you're using Ruby, a monstrous IDE like Eclipse is unnecessary. If Java didn't exist, there would be no reason for Eclipse to exist, either.
There's a feedback loop. You're pointing out that Java is popular because of massive software projects, which is true, but the converse is also true--massive software projects are popular because of Java.
Eclipse was just an example. There is going to be a niche for massive projects. There's also going to be a niche for projects that live for a very, very long time and thus are touched by a massive number of programmers... which presents many of the same problems as a massive project.
Code reuse is also problematic without structure. C++ has this problem to some extent, which is why the C++ library ecosystem is so fragmented. I typically avoid using most C++ libraries, preferring C++ as high-level glue with C underneath. STL is used because it's standardized, and maybe boost, but that's it. Every C++ library has a different style, so if you use them your code ends up looking like crap... mixed metaphors everywhere. C++ isn't powerful in the same way as LISP, but it is pretty darn powerful in its own way. It lets you do things umpty-different ways, which is bad sometimes.
Common styles and idioms are very useful when you want to connect things to each other.
Ruby and NodeJS have solved this problem by having a certain amount of community standardization around idioms-- convention over configuration basically. (Or in this case convention over language restriction.) Perhaps that's one thing that could help LISP-- an attitude of "yeah, you can do it that way, but that's not standard so nobody will use your code if you do." I know it would really help the C++ ecosystem.
>Eclipse was just an example. There is going to be a niche for massive projects.
That niche and Java deserve each other.
>There's also going to be a niche for projects that live for a very, very long time and thus are touched by a massive number of programmers
There's some overlap, but more than anything, this niche is just the one that really needs a solid unit test suite and documentation. There are excellent unit test framework packages for Ruby, and high unit test coverage is very much a part of Ruby programming culture. The documentation quality for open source Ruby projects also seems very high overall.
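To make that culture concrete, here is a minimal sketch of the kind of test suite Ruby projects are expected to ship, using Minitest, which is bundled with Ruby. The `WordCounter` class is a made-up stand-in for any code a long-lived project accumulates.

```ruby
require "minitest/autorun"

# Hypothetical class under test -- stands in for any long-lived project code.
class WordCounter
  def count(text)
    text.split.length
  end
end

class WordCounterTest < Minitest::Test
  def setup
    @counter = WordCounter.new
  end

  def test_counts_words_separated_by_whitespace
    assert_equal 3, @counter.count("lisp is powerful")
  end

  def test_empty_string_has_zero_words
    assert_equal 0, @counter.count("")
  end
end
```

A suite like this, run on every change, is what lets a massive number of programmers touch the same code over decades without it rotting.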
Community style/idiom standardization in Ruby and Node is probably a function of having a constellation of very active, popular, and often interdependent open source projects in each. When you're writing Rails gems that depend on other Rails gems, everyone is reading everyone's code, so agreeing on some style guidelines as a community makes everyone's life much easier.
I think that if someone wrote a very popular open source project (analogous to Rails) in a Lisp dialect, we might see the same thing start to evolve. And I think we're already starting to see it happening with Clojure. There's a Clojure Style Guide that's gone up on Github very recently: https://github.com/bbatsov/clojure-style-guide
Every language has flaws and using languages effectively means knowing what those flaws are and how they affect your program's processes. Does Lisp code produce unmaintainable masses of code? Perhaps. Ever read the IOCCC entries? Or pull apart exactly what a given Ruby script is actually doing? Yet people write good software using these languages all of the time and the world hasn't blown up yet.
Not really, it's just C2 being C2. You can often see a lot of conflicting opinions on those pages, but the arguments tend to be solid. Try reading these two:
Lisp is perhaps the most powerful language there is. But no language is 'too powerful' considering the software that is being built today. It has not enjoyed commercial success because it gained a reputation as an inefficient language, and because the average programmer cannot become proficient in it without a certain investment of discipline and learning. C and C++ are much easier to learn and are taught more in colleges. So the reason Lisp isn't popular is that people are not learning it. And even those who learn it have fewer opportunities to use it because of the C, C++ and Java legacy.
It seems that you are cherry-picking. Lots of "inefficient" languages that were never taught in schools have gained incredible ground -- consider python, ruby, perl, and PHP.
On the contrary, all of those are being taught in schools. Python is now being taught in introductory classes at MIT, Cal, and Stanford, to name a few. Ruby and PHP are taught too, at least in grad school. Don't forget the private tutoring and certification programs you can enroll in. I have not seen a single certification course that involves Lisp. Also, consider the ages of those languages. Lisp has been around for more than 50 years now. The others are not even half its age.
Lisp is the most powerful language in the world - as witnessed by its power to cloud the minds of otherwise intelligent and articulate engineers and scientists.
It sounds to me like arbitrarily taking a word with good connotations to mean something that is convenient for promoting what you're interested in, regardless of the literal meaning.
"But I have yet to see a good demonstration that Lisp significantly simplifies things beyond what other languages can do... It only slightly simplifies things."
Lisp is not a problem. Lisp is fine. So are Scala, Ocaml, and Haskell, under most use patterns. Python's not perfect, but I'll gladly use that. It works. Actually, I like Clojure better than CL, but that's irrelevant. I had one experience of being burned by "Lisp's power". I was in a company where I had to work with a programmer who used Lisp in ways the rest of us found to be irresponsible (poor documentation).
He actually wasn't a bad programmer. He was actually a pretty good one, but management had been riding him with unreasonable feature requests and ridiculous deadlines, so no one ever got to do anything right in that company. My code was better, not because I'm inherently better but because I fought back and simply padded the shit out of estimates so I knew I'd have the time to do things right.
These problems that get blamed on languages are usually the result of terrible management.
Even bad code is not pure evil. Bad code is just logic, poorly fashioned and presented, often because of shitty business practices rather than poor programmers. The evil is the manager who forces you to use bad code (instead of letting you replace it) because "we can't afford" doing things right, even though you're in a rich company that pays managers extremely well. That's the enemy. Let's stop warring about cosmetic differences among good languages and focus on that guy.
Lisp is too powerful only for inexperienced developers, because the way of programming in Lisp is totally different from all other languages.
Actually many modern languages (Python, Ruby, Java, even C++11) copy more and more features from Lisp since the language designers suddenly realized how useful these features are.
Lisp was never "too powerful" but simply far ahead.
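A hedged illustration of that point: the features being absorbed are largely first-class functions and closures, which Lisp had from the start. In Ruby, for instance:

```ruby
# First-class functions: a lambda is a value you can pass around,
# a staple of Lisp decades before mainstream languages adopted it.
square = ->(x) { x * x }

# Higher-order functions: map takes the lambda as an argument.
squares = [1, 2, 3, 4].map(&square)   # => [1, 4, 9, 16]

# Closures: the returned lambda captures `n` from its defining scope.
def adder(n)
  ->(x) { x + n }
end

add5 = adder(5)
puts add5.call(10)   # prints 15
```

Python's lambdas, Java 8's closures, and C++11's lambda expressions are all variations on the same ideas.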
Lisp makes it really easy to build infrastructure and multiplier-level (Level 2) contributions that affect the whole shop. When you have programmers at the 1.3-1.4 level (on the scale here: http://michaelochurch.wordpress.com/2012/01/26/the-trajector... ) who aren't ready to take on projects that will affect other people in such a major way, you can get bad results. That's a management problem, though. Inexperienced developers need to have opportunities to experiment, but if others are forced to sit downwind of their work, then management is behaving badly.
I think some comments (and responses to the comments) are merged with the original writing. This format, and the impossibility of distinguishing the comments, make it hard to understand.
All good stuff. By the way, Lisp was originally interpreted, compilers didn't come along until later. So... Was it a scripting language until it was "compiled?"
Have fun, but please remember to leave room in the shed for me to park my tandem.