The early LISP crowd just didn't get the concept of making a product. The idea that eventually you generate an executable and leave the development environment behind was totally alien. I used Berkeley's Franz LISP in my years at the aerospace company. That was a compiler which generated .o files. So they were close to being able to do this.
But you didn't link the .o files; you had to load them into the development environment to run them. Compiling was just an optimization. I asked the Franz LISP devs why they didn't provide a runtime you could just link in, yielding a releaseable executable program without all the baggage for breaking and debugging. This was a totally alien concept to them.
INTERLISP and Symbolics were even worse - you were always in the development environment and couldn't get anything out except a saved state dump.
Eventually the LISP crowd got it and started generating executables, but it was too late by then.
> ... Symbolics were even worse - you were always in the development environment and couldn't get anything out except a saved state dump
Symbolics had a delivery Lisp on DOS/Windows called CLOE.
On a Lisp Machine one was also not always in the full development environment. But Lisp was the operating system, and thus Lisp plus some dev tools was always there. But when one wanted to use the development environment to its full capability, one would need to load more stuff and create worlds with more debug information.
The delivery of Lisp applications wasn't a thing until after the mid-80s. Before that there were not that many Lisp applications and not that many machines to install them on. This changed with UNIX workstations, 386/486 PCs, and Macs with a 68030/40. Even in the early 90s there were questions about how to deliver Lisp software in static form to end users. The question was often answered with "rewrite it in C++". Sometimes: prototype it in Lisp and then rewrite it in C++. Sometimes: migrate to C++. Sometimes: use a Lisp compiler generating C code.
For some discussion of delivery problems in Lisp see the German Apply project from 1992:
Don't forget dedicated "delivery" 36xx series machines from Symbolics, which were lower priced specifically because they weren't supposed to be used with the development environment.
Minima was, AFAIK, a special real-time variant of the Genera system, with noticeable differences - it also ran different firmware, at least on Ivory (differences in the FEP<->OS interface from what I figured out spelunking in the code).
Minima was also used for some debug tooling for building the machines themselves, iirc.
> I used Berkeley's Franz LISP ... That was a compiler which generated .o files.
> ... But you didn't link the .o files; you had to load them into the development environment to run them. Compiling was just an optimization.
That sounds like how Python and Java work now. For Python, you have .py source code files or .pyc or .pyo compiled files, but you still have to load them into the Python interpreter.
> I asked the Franz LISP devs why they didn't provide a runtime you could just link in, yielding a releaseable executable program without all the baggage for breaking and debugging.
Even now, you typically need to have Python or Java installed on your computer to run Python or Java programs. There are ways to generate standalone Python executables but it's not the usual way of working.
I'm not asking for it as such, but the claim that Lisps now produce "executables" is sometimes still not quite what people have in mind. It's still a whole bundled Lisp runtime, not unlike using PyInstaller to bundle an entire Python runtime in an executable.
Most languages need some runtime. In the case of Lisp it starts with the basic memory management, since Lisp by default needs a garbage collector. This memory management is usually not provided by the operating system. It needs a basic interface to some OS stuff, threads, etc.
A Lisp executable then can be one file which includes a runtime and the (compiled) Lisp code. One could also have these as two separate files.
There are two basic ways to provide the Lisp code: it's either just compiled code or a memory image containing the code. SBCL uses the memory-image approach. ECL (Embeddable Common Lisp) uses the compiled-code approach: it compiles to C and then adds the compiled C code to its runtime.
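For a sense of what the memory-image route looks like in practice, here is a minimal sketch with SBCL (the file name, function name, and greeting are made up for illustration):

    ;; build.lisp (hypothetical) -- run with: sbcl --load build.lisp
    (defun main ()
      (format t "hello from a self-contained Lisp executable~%"))

    ;; Writes a single file containing the SBCL runtime plus a memory image
    ;; of the current session; :toplevel is the entry point when it is run.
    (sb-ext:save-lisp-and-die "hello" :executable t :toplevel #'main)

The resulting `hello` binary runs without a separate development environment, though it still carries the whole runtime and image inside it - which is the point made above about bundled runtimes.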
In the '80s and '90s, Common Lisp compilers and editors were insanely expensive. The hardware was too; I'm not even sure if those early implementations ran on x86... That's why they didn't take off. Easy to look back with rose-coloured lenses now that we've had free implementations for a while...
Meanwhile C/C++, Pascal and Java were all accessible and cheap if not free...
What impacted Smalltalk was its biggest backer (IBM) deciding to pivot into Java, porting their flagship IDE (Visual Age) into Eclipse, dropping OS/2 (where Smalltalk had a role similar to VB/Delphi and later .NET), and dropping Smalltalk consulting altogether.
The commercial versions for Windows and Mac were not all "insanely expensive". Apple's MCL was sold for under $1000. Later the price was reduced even further. That's expensive, but far from a commercial UNIX workstation Lisp, which would be $4000 or more. On the PC there were also mid-priced offerings like Allegro CL, Golden Common Lisp, ...
Still, memory and early PCs were expensive - like a good Compaq system or an Apple Mac IIfx / Quadra. A dev seat could easily cost $10k or much more with RAM, disk, and a large screen.
He certainly implies that he thinks Common Lisp (and maybe Dylan) would have been as popular as Java is today if the lottery had played out differently.
I wonder if anyone else feels this way. The programming world would be a different place.
I can’t help but fantasize about how incredible eMacs would be by now if that had been the case. (Not that it isn’t already incredible.)
I think S-expressions are fundamentally too difficult to work with, without good editor support, to make Lisp anything other than a niche power tool for the highly motivated. I love CL, Scheme, Clojure, Janet, Fennel, etc. but I don't blame anyone for finding them inaccessible.
I think Python, JavaScript, and Go have caught on for somewhat similar reasons, compared to Java and C#: they are easier to get started with, and you can still scale them up to fairly large applications.
On the other hand, I think S-expressions are a sadly underexplored format for data serialization and storage.
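To make that concrete, a small hedged sketch in Common Lisp (the data is made up): the same reader and printer that handle code handle data, so a round trip is essentially free:

    ;; A record as plain s-expression data -- roughly the role JSON plays elsewhere.
    (defparameter *record*
      '(:name "Ada" :languages ("Lisp" "Fortran") :active t))

    ;; Print to a string and read it back with the standard reader.
    (let ((text (prin1-to-string *record*)))
      (equal *record* (read-from-string text)))   ; => T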
I have never really understood people's aversion to parentheses. Many languages require them in ad-hoc ways to group expressions. For S-expressions, you give up some mild convenience and looks for extremely consistent semantics.
The syntactic consistency is at the same time its downfall: everything looks the same. (Functional languages like Haskell have a similar problem.) It's more accommodating to human cognition if at least some part of the complexity and diversity of conceptual constructs in programming is represented in the syntax, meaning a certain range of different syntactic constructs that humans can attach meaning to. There is a sweet spot in syntactic diversity that matches, for the middle of the bell curve of programmers, how human cognition absorbs code. And that sweet spot isn't on the more “spartan” side where Lisp resides. At the other extreme are the more “cryptic” languages like APL and its descendants. In a certain way, everything looks the same in those as well.
P.S.: The diversity in programming language design mirrors the diversity in how humans' minds work. Some people work better with dynamic than with static typing. Some people work better with Lisp, some with Haskell or ML, some better with Java and C#, some with APL and friends, etc. There is no one-size-fits-all, and one factor in the adoption of each language is how each programmer's cognition matches the respective kind of language, and also a little bit what they want to achieve practically with the language.
On the one hand it is important to have discussions on the benefits and drawbacks of different approaches in language design, and some people will change their minds based on that, but it's also an illusion that one single approach will ever suit everyone.
Human attributes are generally distributed on something like a bell curve. Programming languages with "extreme" characteristics in any direction will almost inevitably be stuck with niche popularity, because they will only be appealing to a small fraction of people. Whereas, even "extreme" people will generally be able to tolerate at least a couple of conventional languages.
At issue in this thread is the very definition of extreme vs conventional. The idea in this thread is that the languages now considered to be conventional could have been the niche syntax.
I can understand this argument. However, at the same time, S-expressions do allow for rather expressive function names not usually available in other languages: various arrows, question marks, and other punctuation can make for some nice function names.
I do agree to some degree with the sentiment that it all looks the same, though in an aesthetic sense there is something I like about that.
Being a fan of both ML and Scheme, I have actually been experimenting with a syntax that is a hybrid of the two. I’m still in the experimentation phase, but it’s an interesting exercise.
Yes, but those are effectively library conventions, not fixed properties of the language. Maybe more importantly, names aren’t a structural construct, like a for loop or a class definition. And, arguably, you can have similar syntax in other languages, like e.g. `!` in Rust or the type sigils ($, @) in Perl, except that here you can be sure of their semantics.
The naming is enabled by the way S-expressions work and isn't merely convention. Handling such syntax in non-parenthetic languages is difficult because of the overlap with the other meanings of those infix operators in the language. For example, in F#, if I have a function named `string->number`, then the parser doesn't know if I mean `(string) - (>number)`, `(string-) > (number)`, `(string) -> (number)`, or the function `string->number`.
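For contrast, a tiny Lisp sketch (the function and its name are invented for illustration): the arrow is just part of one symbol, so the reader never has to disambiguate it from infix operators:

    ;; Read as a single token; no clash with - or > as operators.
    (defun fahrenheit->celsius (f)
      (* (- f 32) 5/9))

    (fahrenheit->celsius 212)   ; => 100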
I totally understand. What I meant was that the meaning of such names, or the names chosen for a given meaning, is a convention of the library (or set of libraries) that defines the names, and not a semantics the language guarantees, like keywords and punctuation do in other languages. (In Lisp I don't know what `string->number` does any more than I know what `stringToNumber` does.) So S-expressions give you more freedom in the syntax of your function names, but they don't provide more kinds of constructs in the language.
> I have never really understood people's aversion to parentheses
A common refrain from Lisp family language enthusiasts. (Personally I prefer Smalltalk-style "conversational" syntax to Lisp-style or C-style syntax, yet it's similarly unpopular.)
For better or for worse, the vast majority of programmers seem to have voted with their feet for C-style syntax (C/C++/Java/JavaScript/etc.) and its variants (Python). In a C/C++/Java/JavaScript world, C syntax has a fair amount of leverage.
It's a shame that Dylan didn't continue, as it seems to have taken a decent crack at adding algebraic syntax to Common Lisp - arguably realizing McCarthy's original vision of infix M-expressions as a more programmer-friendly syntax. I wonder how hard it would be to add a Dylan-style syntax layer to modern CL?
Be that as it may, one can readily write Lisp/Scheme-style programs in JavaScript. See Crockford's "The Little JavaScripter."[1] The major omission/deficiency being macros for JavaScript syntax.
My favorite languages are MLs, particularly F#, which usually have some of the cleanest syntaxes available. It's not just Lisp-family enthusiasts. I think it's more for anyone that thinks rather than follows or assumes or whatever.
I actually learned about Lisp/Scheme after I had learned about several other languages.
What do people not like about parentheses? It's normally along the lines of “well, I just don't like them”, which can be read as “it's not what I'm used to”. Yet these same programmers will freely throw parentheses around expressions in an ad-hoc manner to communicate with the compiler in their given language. In some ways, one can derive Lisp/Scheme from popular languages by saying that instead of parentheses being used frequently but not always to clarify precedence, grouping, and function application, they are required to enforce these. Such a thing is not such a radical or unreasonable stance.
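To make that concrete with a throwaway example: where a C-family programmer might defensively write `(a + b) * (c - d)`, the prefix form simply makes that grouping mandatory and uniform:

    ;; Assuming a, b, c and d are bound; grouping and function
    ;; application are always written the same way.
    (* (+ a b) (- c d))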
McCarthy’s original M-expressions are somewhat irrelevant. He wasn’t trying to make a programming language for software development. He was investigating it as a tool for his research.
My primary complaint about popular languages like Python, C++, C#, etc. is their irregularities. There is a lot of syntax, semantics, and little quirks in those languages that make it very hard to just know the language and move on to solving problems. My favorite thing about Schemes is their regularity. The social popularity of languages can't be explained by their technical merits; it's due more to social and historical phenomena than to technical reasons.
The problem is precisely that too much regularity makes the code harder to scan visually. The irregularity of various delimiters and separators helps to distinguish certain language elements from others.
There is power and elegance in syntactic uniformity, but Lisp in particular is hard to read until you train your eye to see through the nest of parentheses, and it's hard to write without something like Paredit assisting you.
Meanwhile I don't think anyone finds Haskell/ML syntax hard to read. The problem is more that, with idiomatic Haskell in particular, too much abstraction, currying, and un-descriptive variable names can obfuscate what a piece of code actually does.
Lisp doesn't have programmer-unfriendly syntax. It's just tailored to a different use case: built-in, ever-present meta-programming. The language was designed to compute with symbolic expressions as data, and it was discovered early that Lisp programs themselves could be treated as symbolic expressions. Code as Lisp data. People found it easier to do it all in s-expressions, compared to the mixed syntax of the early design: m-expressions for programs and s-expressions for data. This was shown over and over.
In that space it has some local optimum of programmer friendliness. It's just that most programmers don't have that use case. Some get the ideas of list processing as a practical programming interface.
Alternatives were spawned, though: LOGO as a beginner's Lisp, ML as a functional programming language, Dylan as a Scheme+CLOS targeting the same audience as Swift does nowadays, ...
As a Guix developer (there's a lot of Scheme and Common Lisp in Guix :) ), every few years I check on Open Dylan but so far each time there was something missing that made me unable to package it for Guix (building it entirely from source).
I checked it a few minutes ago, and Open Dylan's ./configure says to download a bootstrap compiler from https://opendylan.org/download/index.html first -- but there's no such bootstrap compiler there.
The way you actually have to do it is trawling through git repos to find an old version of Gwydion, bashing at it for a while, and maybe getting it to compile on modern platforms. Look for 2.4, then rewind, then pass a config flag, and a few more steps.
I wouldn't call Dylan thriving, unless you're a Windows user that really doesn't care about bootstrapping your language.
It's alive though, in the same sense that Miles, the dog that Segall froze and then brought back to life in 1987, was alive. Brain damaged, but alive.
You can't bootstrap OpenDylan from source without a Dylan compiler. You can bootstrap Gwydion without one with substantial elbow work, which you can then pivot into bootstrapping OpenDylan.
I know what I was doing, and if you look elsewhere in the thread, I was right about what the GUIX maintainer wanted. Not to get egg on your face.
There's no special "bootstrap compiler"; you can download a binary release for your platform from the page you linked, and then use that to bootstrap a newer version from a source checkout. If it would help, we could provide a minimalistic build that was only useful for bootstrapping, but at present our builds are "batteries included" (with LLVM/Clang and the BDW garbage collector already provided in the tarball).
So for Dylan, we'd like to have a compiler for Open Dylan, not written in Dylan. (it can also require multiple steps to get up to Open Dylan--that would be fine)
It's not in the interest of our users to use binary blob compilers for bootstrapping.
We would also unbundle LLVM, clang and the bdw gc and use the ones from Guix.
I, for one, welcome my whitespace-surrounded operator overlords, but if you really don't want that you're welcome to use function-call syntax for them.
At a high level, yes; however, Dylan was designed as a systems programming language, so it also enjoys AOT compilation and low-level capabilities that Julia currently doesn't have.
I've been working with Lisp for over twenty-two years and still got tripped up recently by something that was
    (if (condition) ;; <- must be when, not if!
        (do-this)
        (do-that))
Phony complaints about problems with editing parentheses and getting them to match are just trolling nonsense, but Lispers should acknowledge this kind of problem: when your editor has already helped you ensure that the code has valid syntax, everything is beautifully indented and compiles without diagnostics, but you have a mistake like this: parentheses being closed in the wrong place, not including something, or the wrong operator like the above.
There are similar problems in other languages; no notation is perfect, and attempted cures for some of these problems can be worse than the diseases.
GCC's relatively recent "misleading indentation" warning is a good example of a cure that has no downside (that I can quickly think of). It can catch code like:
    if (condition)
        do_this();
        do_that(); // not part of the if statement, but indented deceptively
No language saves you from writing a program similar to the one you should be writing, but which is correct for a different set of requirements relative to what you want. Only verification and testing do.
    (if (condition) ;; <- must be when, not if!
        (do-this)
        (do-that))
I didn't find that to be a common problem when writing Emacs Lisp at least. The "misleading indentation" warning is relevant there. If you enter the above code into Emacs, auto-indenting as you go, it will be indented in a way that makes it obvious you meant `when`:
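Something like this, assuming the default Emacs Lisp indentation for `if`, where the else form ends up shallower than the preceding body form:

    (if (condition) ;; <- must be when, not if!
        (do-this)
      (do-that))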
It's not so much an aversion to parentheses as an aversion to nothing but parentheses. Other languages use other brackets to indicate different contexts, but s-expressions can be more difficult to read. The idiom of putting all of your end parens on a single line doesn't help.
And for most people, the mild convenience of readability is preferable to the consistent semantics.
What’s the problem with having the end parens on a single line? I never understood why languages with curly brace syntax don’t do the same. A line of code containing nothing but a curly brace seems like a waste of screen real estate to me. It contains absolutely no information (the end of the block can be seen by indentation anyway).
One could just as easily write s-expressions using whitespace, so why bother with parens at all?
Because the human brain is a pattern-matching organ and sometimes redundancy in a signal is useful. I've never understood the Lispers' insistence that saving "screen real estate" is a primary concern. Source code is meant to be read, and the more easily code is read, the more utility it has.
Parens allow an editor to create the proper indentation.
You can think of Lisp's paren syntax as the syntactic analogue of Lisp's very simple static type system. Getting macros to work with a conventional grammar's complexity is much more annoying.
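A hedged sketch of why: a macro receives its arguments as ordinary list structure, so rewriting code is just list manipulation (`my-unless` is an illustrative name; Common Lisp already has `unless`):

    ;; The macro sees (my-unless TEST BODY...) as plain lists.
    (defmacro my-unless (test &body body)
      `(if (not ,test)
           (progn ,@body)))

    (my-unless (> 3 4)
      (print "reached, because 3 is not greater than 4"))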
Source code is indeed meant to be read, and at least to me, low information density seriously hurts readability. I prefer both Python-style indentation based syntax and multiple parens at the end of the line over C-style syntax with closing braces alone on a line. If I write code in a language with curly brace syntax and I don’t have to share the code with others, I prefer to use multiple closing braces on the same line – like Lisp parens – because of improved readability.
The syntax is consistent, I don't know about the semantics. The trade-off is that it becomes more difficult to glance at a piece of code and visually distinguish what's what.
I've never understood it either. I love s-expressions!
But I suppose there must be something to the complaint.
If you ask me I'd say Python's space-as-syntax is far more horrible, but doesn't get nearly the same amount of vitriol.
I find this syntax argument a bit strange nowadays.
There's this extremely popular Kubernetes tool called Helm, which uses go-templated YAML to generate Kubernetes resources.
The template syntax is nothing short of horrible. An endless maze of {{ ... }}s, where it's just too easy to shoot oneself in the foot by using {{- or -}} in the wrong place, etc. And on top of that, you literally need to count spaces for nindent (mind you, you don't need to count parentheses while writing Lisp code).
I'd say Lisp syntax is clear as day when compared to your average Helm template.
But you always have good editor support. Lisp/Scheme has traditionally been coded using Emacs or its equivalent. With that support, the parens seem to disappear after you've been writing Lisp for a while.
Beginner lispy languages come with a suitable beginner IDE, e.g. Racket or Logo.
A typical lisp statement has a leading '(' and a trailing ')', which replaces the ';' in Algol-like languages. So, one additional character, which brings a lot of benefit when writing DSLs and macros.
(Python doesn't have the ';', but its whitespace is ambiguous, so that's not a fair comparison.)
I tried using Slime with Scheme when I was doing SICP. I’m sure it’s great when you know it well, but I kept pressing the wrong shortcut and moving things around in ways I didn’t intend and then it wouldn’t let me manually type or delete parentheses to fix things. I found it very frustrating and ended up turning it off.
Just another barrier and C-style languages seem much easier to write in comparison.
Slime did that? Slime doesn't take over the editor that much (it shouldn't prevent you from typing parentheses, I've used it for 16 years or so and never experienced anything like that). Was it paredit or another editor mode?
I guess my point is that syntax that works very well with a correctly configured editor used by someone who knows what they’re doing is going to have a higher barrier to entry than a language that can be easily edited in Notepad.
Lisp is no harder to edit in Notepad than any other language. It is just a text editor and Lisp source is just text. And paredit is a misfeature for most people; turn it off. That should have been the real takeaway. It is not part of a “correctly configured editor” and it isn't installed or activated by default anyway, so it is easy to avoid.
In case you didn’t know — DrRacket (IDE for the how to design programs book) has a language directive you can run that sets up DrRacket as if it were the environment the sicp authors expect the reader to have.
I only found out after going through SICP using chez scheme’s repl (which I had to compile myself).
I think the parent comment is alluding to the notion that the very entry level programmer enjoys being able to use notepad. That is to say the initial barrier of eMacs is just too much. Obviously worth it in the long run though.
My autocorrect has been doing that for some reason and I assumed there was logic to it. Your comment sent me on a dive that proved my intuition woefully incorrect.
Lisp syntax is an extreme impediment to uptake. Consider what happened to Lisp-Stat. It was a really slick package, but people immediately dropped it when R became available. Lisp defenders like to think that R won out because of better libraries, performance, or some such. The truth is that the stampede began while R was still in a rough state, people were so eager to get away from Lisp.
Is it clear people wanted to get away from Lisp rather than use a more-or-less clone of S? (One of R's originators did propose moving back to a Lisp base [1], though part of the justification was HPC, which is now at least partially covered [2].) However, much as I like Lisp, I do want a mixfix syntax for interactive use like R, but something CGOL-ish is easy enough; I don't remember xlisp-stat having something like that.
> Is it clear people wanted to get away from Lisp rather than use a more-or-less clone of S?
That's a good point. How many statisticians were familiar with S but were using Lisp-Stat just because it was free? I don't seem to have traveled in those circles. I wasn't paying a lot of attention, but I don't remember encountering that on campus.
Another slick Lisp package that got too little attention was Yann LeCun's Lush.
> The truth is that the stampede began while R was still in a rough state, people were so eager to get away from Lisp.
That's not a big surprise.
The syntax is/was an important reason. Most users of statistics/maths software prefer a more math-like notation. Even the bigger Lisp applications in the maths area always provide that: Macsyma/Maxima, Reduce, Axiom, Derive, ...
One could have written that on top of Lisp, but it was much more convenient to develop a specialized language/runtime using C. C + Lisp + R would have been more complex than C + R.
Also if the first Newton MessagePad hadn't had less than 1 MB of RAM. ;-) The first Newton had 640 KB of RAM and a 20 MHz ARM processor. It was the result of bringing down the cost. It wasn't well received, since the handwriting recognition was bad and the machine was underpowered. The 130 model fixed the latter. In later OS versions the handwriting became useful, eventually.
I later had the wonderful Newton MessagePad 2100 with 8 MB RAM and a much improved ARM processor - I used Common Lisp on a Mac with that amount of RAM. The 2000/2100 series would have been a perfect platform for Dylan; it even would have been good enough to run a nice garbage collector.
When they discontinued the platform, we knew that it would take a lot of time, years!, for Apple to come up with a new similar mobile platform, but it eventually happened with the iPod touch and the iPhone. But those were then programmed in an object-oriented C dialect and the operating system wasn't as cool as the Newton OS.
I agree. I bought a Newton after having lunch with Larry Tesler (John Kona also dined with us, a fun lunch!). Larry was pitching me on converting my first Common Lisp book to Dylan.
I ended up just tossing my Newton in the trash. Sad.
Sun Microsystems poured millions of dollars into marketing Java. During the dotcom bust, billboards, posters, magazine ads, and online ads were common.
This massively accelerated its adoption curve and built a valuable trademark, which would be part of the reason Oracle paid $7 billion to buy it.
This is all true, and it should be noted that at the time of its release in mid 90s, Java was an incredibly fun toy to play with and adoption was enthusiastic, and at least in my case, it was an immediate decision to say goodbye to C++ ("forever" /g) and adopt Java, which also turned out to be a good bet in terms of jobs.
Let's also note that to this day I (or any Java professional) can go and earn a living writing Java and not feel like I'm writing COBOL for a mainframe modulo the Spring framework (which I have avoided, to date.)
SMI did not merely pour money into marketing Java. A lot of loving care by very competent software engineers, some of the best, went into creating Java and its virtual machine. The sweet spot of this language is phenomenally large, imo as an s/e, and accessible to an equally large subset of the programming community (even if they hate it, they can do it), from the IT low end to investment banks and up to academic people doing super cool stuff like adding fibers to Java (Kilim). The same story holds for performance. Only on the GUI front did Java drop the ball.
Java is, entirely on its technical merits and utility record to date, one of the most practically effective languages created. Thank you Sun Microsystems.
Also thank you Oracle for buying Sun when no one else cared about it, for improving Java beyond version 6, bringing Maxine VM out of Sun Labs research into GraalVM research, integrating JRockit JIT caching and VM monitoring into OpenJDK, and caring about AOT compilation when Sun left it to commercial 3rd-party vendors.
Honestly it would have turned into something like Java if it had become popular in the industry. OOP and functional programming are really just different ways to encapsulate data and control dispatch. They don't change the fundamental nature of the work.
Ultimately what happened to Java was the same thing that would have happened to any other dominant language that the industry uses to build heavy duty enterprise software: stuff like ORM frameworks, data transfer objects, servlet containers, some kind of web integration like JSP, RPC frameworks, SOAP, REST, etc.
Would it be easier or better to do that stuff in Lisp? Maybe. But not by a huge amount. The beauty of the language would certainly end up being obscured by the boring, complex, practical work that we would all be doing with it. And most programmers would not be better than they are now -- instead they would force an imperative model on top of whatever substrate they are given, just like they do today.
The Common Lisp object system does not use the more imperative message-sending / virtual-method-calling style. It favors "generic functions of related methods" instead - thus it is kind of an integration of function-centric programming into OOP, in contrast to the traditional view where methods belong to classes.
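A small sketch of that style (the class and function names are made up): methods attach to a generic function and can specialize on any, or several, of their argument classes, rather than living inside one owning class:

    (defclass spaceship () ())
    (defclass asteroid () ())

    (defgeneric collide (a b))

    ;; Dispatch considers both arguments, not just a privileged receiver.
    (defmethod collide ((a asteroid) (b spaceship))
      (format t "asteroid hits spaceship~%"))

    (defmethod collide ((a spaceship) (b spaceship))
      (format t "two spaceships collide~%"))

    (collide (make-instance 'asteroid) (make-instance 'spaceship))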
Common Lisp is multi-paradigm and can be used to write functional code. It is the direct descendant of LISP, which was the first functional language, after all. Yes, Scheme and the like are more functional, but CL is still far more functional than most other languages.
Imperative relates to coding style - I can equally write imperative or recursive code in Java.
"Since then, my focus has been on AI research, and part of that involves the design of planning systems that (maybe) can eliminate the need for most programming. You tell them what you want, as you might describe it to a smart grad student, they maybe ask some clarification questions, and then they go do it. That only works if they know enough about the world and have enough "common sense" to avoid blunders."
I wonder what he's working on to enable that kind of environment. Thinking about how most of the programming I do (in high level languages) comes down to starting with a pretty rough idea and iterating on the idea while learning more about what's feasible / effective, typically refining what it is I'm building rather fluidly. I rarely know exactly what to build upfront, nor do the CEOs, product managers, designers or sales people I've worked with so far. I think this feed back loop goes far beyond "asking a few clarifying questions".
Maybe the kind of task a professor would give to a grad student is entirely different from that kind of typical industry work?
If you can formulate your problem as a classical planning problem, there are different sets of planners that can solve it. These methods are explainable. Some of them even give the optimal solution to your problem if you use optimal heuristics. This is what he means by that statement. The only issue is that right now classical planning is hyperfocused on a few problems, and most of the methods work on narrow domains like path finding, logistics, constraint-based problems, and so on.
Usually grad students get projects to investigate or prove a particular idea. It’s especially straightforward to do this when the advisor is iterating on a line of ideas, and you resume where the last grad student left off. There isn’t a point in building anything other than to test the idea, so you don’t want feedback to bias you in what direction to go.
So I guess we’re just not gonna talk about that teaser for that AI that says we don’t need to program anymore? Sounds like a symbolic local version of GPT-3
"What I came to understand, after years of work on Common Lisp and the death of Dylan, the ongoing popularity of the hideous C++, and the rise of Java, is that programming languages don't become mainstream based on their elegance or their deep utility. ... [T]he best languages very rarely take over, if ever. Some language starts being used because it has the backing of some big company or project, people doing similar things use it as well, positive feedback sets in, and soon it is the language everyone is using. Java is an example -- not nearly as good as Lisp on many dimensions, but it appeared at the right time for a language with some good properties for creating downloadable Internet apps. And it had the backing of Sun Microsystems, at a time when Sun was powerful. So Java became mainstream, while Common Lisp and Dylan faded away."
Java and C++ had capable, even enterprise quality free compilers at a time when Common Lisp environments were typically either hideously expensive or poorly supported or both.
You could grab Sun's JDK for free, or GCC, or (later) Visual Studio Express at a time when the best options for Common Lisp were thousands of dollars per seat, or barely working on x86 (CMUCL didn't have x86 support in any capacity until after 96, and by then Java was out).
Personally, I loved lisp in the 90s, but it was just so much easier to get my hands on C and Java, and libraries to go with them. Java, C and C++ were actually available on DOS and Windows. Even today, Windows support among lisps is terrible!
Edit: I miss Turbo C, but had there been a Turbo Lisp, priced affordably and on the shelf at Radio Shack? We'd all be programming in Lisp.
Similar forces hampered the adoption of Smalltalk in the 90s as well [0]. Charging per-seat licenses to use programming languages almost never works, introducing friction that counters needed network effects, especially when prospective users can download a perfectly viable (if not much worse) alternative for free. Sure, there are a few systems bitterly hanging on to this business model - LispWorks, Cincom, Mathematica - but it's clear they are surviving in spite of it.
As I wrote in another thread, IBM abandoning Smalltalk for Java was the biggest blow, in a way Smalltalk was IBM's .NET to put in more modern terms, used for their IDEs (Visual Age), OS/2 RAD development and 4GL like language across all their enterprise offerings.
Until the day Java came, and they pivoted everything into Java.
On the Lisp side this is an extra step and a source of problems. Traditionally the commercial Lisps REQUIRE one to ship an application without the full dev environment. So there is always a delivery step which generates an application or a shared library. Some other Lisp applications require a delivery Lisp compiler, usually to C because of some other limitations: space, memory management or the platform does not allow a full dev Lisp. Example: there is a process control engine written in Lisp, which is used in industrial control applications. This Lisp application is largish and gets compiled to C and runs without garbage collection.
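For flavour, a delivery script for a commercial Lisp tends to look roughly like this LispWorks-style sketch (written from memory, so treat the details as illustrative rather than authoritative; `main` and the output name are invented):

    ;; deliver.lisp (hypothetical), loaded into a fresh image,
    ;; e.g. with something like: lispworks -build deliver.lisp
    (in-package "CL-USER")
    (load-all-patches)
    (load "my-app.lisp")          ; hypothetical application code
    ;; Strips the development environment according to the delivery level
    ;; (0 = keep most things, 5 = most aggressive) and writes an executable.
    (deliver 'main "my-app" 5)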
I think that is just a facet of his point. It was Sun's corporate backing and muscle that enabled you to get their SDK for free. And this is ignoring the outreach into education markets to get Java as the 101-level course for an absurd number of students.
> It was Sun's corporate backing and muscle that enabled you to get their SDK for free.
You could get GCC for free before Java 1.0 was released. My personal introduction to GCC was Walnut Creek's Hobbes OS/2 CD-ROM from 1994, the EMX port. Paid for the CD-ROM, but we could have downloaded it for free if we had Internet access. EMX did DOS too, and DJGPP was also available at the same time (and ended up overtaking EMX under DOS).
That was indeed where many of my teenage hours went - hacking out game experiments in DJGPP and Allegro on our family’s 486. It was also my intro to Emacs which stayed with me for decades afterwards.
GCC being freely available didn't require massive corporate backing.
So lack of massive corporate backing is insufficient as an explanation of Common Lisp's lack of free availability (on then-mainstream platforms) circa 1995.
I think the real explanation is, at that point, the Common Lisp community mostly didn't get why it was important. And by the time they got it, it was too late.
> GCC being freely available didn't require massive corporate backing.
But it had. Most operating systems were written in C. Lots of supporting libraries were written in C. Stallman chose to implement a C compiler because of that. There is and was a huge ecosystem around C just because it is a low-level language used to implement the basics of many systems.
Lisp was only accidentally used for a few operating systems - the AI programmers needed capable workstations, there were none and thus they developed their own and wrote their OS in a language they knew best. Outside of that community C was (or became) the standard for implementing the low-level and application environments on top. Most Lisp implementations contain a runtime written in C (and assembler). Apple tried to write early OS stuff and applications in PASCAL. C and then Objective-C replaced that.
Stallman also chose to build Emacs and Emacs Lisp. He could have made an editor extendable in C only, but did not. Emacs' problem, then and now, was first non-existent and then terrible DOS and Windows support.
Really, the problem was that there was no DOS or Windows lisp available for cheap, or free. Even GCC was only available thanks to a small company, Delorie, who made djgpp.
I would have greatly preferred to program in lisp; but on DOS and Windows all I had was C, Pascal, and Basic; thanks to Borland, Delorie and Microsoft. Later I had Java, Python and Perl; again thanks to companies like ActiveState.
The focus on UNIX-only kept these other languages, free or not, from being useful to the broader public.
> Emacs' problem, then and now, was non-existent, then terrible DOS and Windows support.
I don't know when GNU Emacs first became available under DOS. If you look at the GNU Emacs 18.59 source distribution (October 1992), while it doesn't support DOS/Windows, its FAQ mentions a DJGPP-based port to 32-bit machines called Demacs (which wasn't pure GNU Emacs, it was actually based on the Nemacs fork which added improved Japanese language support).
The many limitations of MS-DOS (8.3 file names, no multitasking) meant that it was essentially impossible for GNU Emacs under MS-DOS to work as well as it did on more capable platforms. Windows 3.x/9x/Me laboured under many of the same limitations – 9x/Me didn't have a 32-bit command line environment, all it had was DOS boxes and COMMAND.COM, plus some horrible kludge by which 32-bit console applications would have their I/O routed through a DOS process (CONAGENT.EXE) via a VxD.
While NT-based Windows fixes many of those problems, it doesn't fix all of them – for example, until relatively recently (some Windows 10 build), Windows had no pseudoterminals (except for various flaky unofficial workarounds), which put big limitations on Emacs support for subprocesses compared to other platforms. Now at last it does, but I'm not sure if GNU Emacs has been updated to support them.
IIRC, Stallman's original mission was to write a 'free' operating system (-> GNU Hurd). For that he needed a C compiler, an editor, a Lisp, etc.
Integrating GNU Emacs into proprietary operating systems like Microsoft DOS / Windows or Apple's MacOS wasn't his priority.
There were a bunch of other Lisps on DOS/Windows: Xlisp, MuLisp, CLISP, EcoLisp, RefLisp, LinkLisp, Corman Lisp, Golden Common Lisp, Procyon Common Lisp, Allegro CL, Medley, LispWorks (later), NanoLisp, Software Engineer, Star Sapphire Common Lisp, ...
Maybe they were too late, didn't fit your requirements, etc. But it was not that there was none.
GCC being freely available is by far the exception, historically. Such that I don't really get your point. Do we know why/how GCC was able to pull off being free?
And, as I said in a sibling thread, my assertion is that it took massive corporate spending for Sun to get inroads with Java. In large part because they were competing with GCC.
> Do we know why/how GCC was able to pull off being free?
Because RMS is an ideological obsessive for whom software being free was more important than making money. Although he wrote a major Lisp implementation himself (Emacs Lisp), he saw C as the more pragmatic path to making free software widely available than Lisp. Also, as a former employee, MIT gave him a lot of (in-kind) support.
(Actually GCC was not the first free compiler. RMS started out planning to turn LLNL's free compiler for a Pascal derivative, Pastel, into a C compiler. But he abandoned that approach, because the Pastel compiler was a big resource hog by the standards of the time – it was initially developed to run on a supercomputer, it wouldn't work on the Unix workstation RMS was using.)
CMU Common Lisp was free, and CMU's research funding paid for it. But their focus was on Unix workstations, they never saw the value in porting it to x86 (let alone DOS/Windows). By the time the x86 port happened (done by volunteers, not paid for by CMU), Java had already taken off.
Hardly anyone cared about GCC in those days, most compilers were still commercial.
In fact, GCC only started to matter to UNIX folks, because Sun introduced the concept of user and developer UNIX workloads, and all UNIX vendors followed along, making the UNIX development tools into an additional package one had to buy.
It was popular with people who came from a Unix background and wanted to port Unix software to DOS/Windows (or even OS/2). While the API differences were still a challenge, GCC did better at code ported from Unix than Borland/Microsoft/etc did - especially since a lot of that software was already being compiled under GCC on Unix anyway
True that Borland/Microsoft was more popular among professional DOS/Windows developers, especially those for whom that was their native platform
> those that could not buy them, would get copies from a street bazar
Teenage me didn’t know of any “street bazaars” selling Microsoft/Borland developer tool warez. But I walked into the local computer shop and saw some modestly priced CD-ROMs, with GCC (among other things), and convinced my Dad to buy them for me.
I think you are talking past me, though. My claim is that it took massive corporate bucks for Java to get inroads. Specifically because it was competing with some free options.
Common Lisp had nothing like that. Nor did Ada. Or... really, any other language? Microsoft did a pretty heavy push with C# and the general .NET ecosystem. Though even they had to resort to destroying a lot of the VBScript world that had proliferated quite heavily by just being available on tools that folks otherwise had.
Java was free, too; and at the time, djgpp was effectively the only free c++ toolchain on DOS and Windows.
Common Lisp wasn't even available to DOS and Windows users, let alone free or corporately backed. Even now every single free common lisp has substandard or outright broken Windows support.
Common Lisp didn't bother to show up to the party. That's the first problem; it was priced terribly, when it did become usefully available.
(Free) LISPs on Linux today look a bit rough; is there any implementation that turns Linux into something that has the look and feel of a LISP machine?
Parentheses aside, don't expect people to pick up a language if all they can see is a REPL where the only edit operation that works is DEL(ete), at a time when all other languages have graphical IDEs with syntax coloring, context-sensitive help, single-step debugging etc. (and no, I don't mean Emacs).
No, there is not. There's no desktop environment for Linux that supports the runtime extensibility and inspection abilities of a Symbolics Lisp machine.
Most Lisp systems provide a naked version which runs in the terminal without assuming much. Development environments are loaded on top. Thus one would typically use SBCL with GNU Emacs and the SLIME environment for it.
Alternatively there is something like: McCLIM and SBCL. McCLIM provides a similar user interface management system to what Symbolics used.
If one uses something like LispWorks or Allegro CL, they have their IDEs integrated and one can typically start them directly into the IDE.
CMUCL was his baby and he was responsible for that. But it addressed UNIX systems. Those usually did not have x86 processors at that time, and Windows was not a very interesting platform then.
Windows (and Macs) were never a good place for free Lisps, since supporting those proprietary platforms was costly (in terms of know-how, time and money) and there was no vendor support. Users on those platforms usually expect support for the vendor libraries AND full GUI support. Microsoft and Apple always pushed their own development tools & languages (Visual Basic, C, C++ and Pascal; C/C++, Objective-C, Swift). The Lisp systems used on Windows and Mac were mostly commercial, since they had GUI and platform support: MCL on the Mac, and on Windows then Golden Common Lisp, Allegro CL, ...
CMUCL could have been ported to x86 earlier, I was running UNIX on a 386 in 1987 but only had AKCL for it. I was the first beta-tester for CMUCL on FreeBSD in 1996.
Ports usually don't happen just so. If there wasn't one, I would interpret it as there having been little demand at that time.
SBCL later was forked from CMUCL to have a simpler implementation and build system. Recently the port to M1 Macs went smooth. The person(s) doing the port did an excellent job.
Porting CMUCL to x86 required someone having access to a machine that it already ran on as well as a 386 running UNIX, plus knowing that the CMUCL sources were available and having internet access to get hold of them. I think these were the limiting factors rather than lack of demand.
I transferred from community college to a CSU in the early '00s. Our high school was using Pascal, the community college I went into was using C++, and the CSU I eventually transferred into recently transitioned to Java, and they were using C++ before then.
When we used C++, the education version of Visual Studio was about the price of a text book - so not very expensive considering you would use it across many classes.
I assume the schools followed industry rather than the other way around. But thinking about it, Java had made its way into at least one CSU, as the primary language, about 3-4 years after its 1.0 release date. That is amazingly fast.
I started college in 1988 and they were offering copies of Turbo C with all the books for $5, but you had to supply the 5 1/4" disks.
I only lasted 1 1/2 years in college (that time around) so I didn't get to experience Unix or lisp or anything like that. Well I did take an IBM 370 assembly language class. Had to do the assignments on dumb terminals in the basement of the dorms and pick up the print outs in the basement of the CS building.
If this were the case, then explain Standard ML, which is vastly superior to Java and its relatives in every conceivable way outside of having a big corporation to push it.
Even today, a large company like Google would rather spend time with an inferior language like Go than work with SML.
Available libraries are another factor. C and C++ can directly call all OS functions. Java very early had an extensive runtime library supporting networking, GUI programming, multithreading, collection classes, localization, date/time, etc.
Agree. This has also been the problem for Smalltalk. Before things like Scratch and Pharo came along, commercial Smalltalk licences were eye-wateringly expensive.
It is sad. But I feel some of the sibling comments about availability are correct. My love of Lisp went unrequited for many years. Faffing around with Steel Bank and other Common Lisp environments drove me crazy, when I could just type

    cc myprog.c

and get stuff done (fast but unsafely). The Scheme we had at college (which ran on some Sun machines IIRC) just wasn't available in the wild once I left school.
The money Sun poured into supporting Java helped a lot, but the promise of being able to run the same "binary" on every platform (and running faster than Perl, which was the other write-once-run-everywhere option at the time) was also very attractive.
> programming languages don't become mainstream based on their elegance or their deep utility. For any given project, the best programming language to use is the one everyone else is using for that kind of programming at that time. It doesn't have to be the language that is best or most beautiful, and it hardly ever is. As long as the currently-dominant language is adequate to the task without TOO many infuriating shortcomings, just use it.
Unexpected advice coming from one of Common Lisp's creators. He goes on to talk about how languages usually become popular because they have the backing of some large company. Things haven't really changed. Rust and Golang both at least had some sort of backing company to get them started.
C was the implementation language of AT&T Unix. The operating system spread across universities in the 1970s due to AT&T's inexpensive licensing to them, including source code. Unix, through its variants and clones, became the dominant operating system, with most competitors either dying out or adopting Unix.
Both C and C++ also had the benefits of being the “blessed” languages of Windows (for example, Win32 is a C API, and MFC is a C++ API). Even in the days of the classic Mac, while Pascal was originally the language of choice, eventually in the 1990s C would be “blessed” by Apple.
Python I think got a major popularity boost by becoming the de-facto standard language of the natural sciences, machine learning, AI, and data science. That, and being a substantially more accessible alternative to Java, and arguably to Ruby as well.
C++ had a whole host of compilers and tooling/IDEs pushed by large private companies (Microsoft, Borland, Intel). I think that somewhat qualifies as institutional backing of another sort.
The initial push for C was that most Unix systems arrived with a C compiler included (there were exceptions of course, and once the compilers started getting pricey there were a lot of volunteers for GCC).
Microsoft and Borland were both companies that made money selling developer tools (it's been overshadowed by OS and office software for MS, but Microsoft had a lot of business selling compilers etc.), so of course they sold compilers for languages that got popular (and in the case of MS, later used the compilers to prop up the environment for developers building software for their OSes).