Books Programmers Don't Really Read (billthelizard.com)
116 points by signa11 on April 27, 2010 | hide | past | favorite | 71 comments



> The Dragon Book covers everything you need to know to write a compiler.

No, it doesn't, because it only covers techniques for implementing and optimizing old-school static languages.

But today a language is more than just its compiler. Modern VMs have come into play, and runtime optimizations aren't covered ... for instance, if you want to implement a tracing JIT compiler, your only hope is to read the source code of some available VM. Or if you want a precise garbage collector ... well, good luck with that.

That's one reason the implementations of dynamic languages are so shitty ... the effective techniques for doing it are yet to be explored / documented.

So if you want a book on implementing compilers, I recommend this one ... http://pragprog.com/titles/tpdsl/language-implementation-pat...


My problem with the Dragon Book is more fundamental: having once made a real effort to read it, almost cover to cover, I came away with a few hours less of my life remaining and some ideas about lexers and parsers and the like, but I still had absolutely no idea how to sit down and write a compiler. I don't understand why anyone thinks this is a good book.

There are introductory texts that might not cover the same amount of theory, but at least get you from nowhere to being able to write a basic compiler from start to finish. If you do want some serious theory, particularly on optimizations, then Muchnick blows the Dragon Book away. If you want to work with different kinds of programming language, notably functional programming with a lambda calculus foundation, then these ideas barely appear in the Dragon Book and again there are specialised texts that would serve you much better. What is the target audience for the Dragon Book actually meant to be?


The Dragon Book is supposed to be read by people taking a CS course on compilers, to fill in the gaps. I too don't believe it's a standalone work.


I found Appel's _Modern Compiler Implementation in ML_ to be better-suited to self study, and more modern. I've used OCaml but not SML, and didn't have any problem following it.


This is a wonderful book!


Well, when I wrote a compiler (on a team of 3) the dragon book (first edition) was very helpful, from lexers, parsers, to code generation. So we read it cover to cover (except for Jeanne, who got her PhD from Ullman and who wrote the LALR parser) and used it profusely.

As noted elsewhere in this thread, I think that, in contrast to the OP's article, everyone should read it, or Holub's excellent work that followed the Dragon Book by a number of years.

So I would disagree with your opinion that this isn't a good book, but then I may have the advantage of having spent substantially more time with it.


Jack Crenshaw's book Let's Build a Compiler http://compilers.iecc.com/crenshaw/ is probably too far to the pragmatic end; it essentially walks you through the process of writing a recursive-descent compiler in Pascal for a Pascal-like language that generates code for the m68k. No separate parser and lexer as such, but he does teach you how to turn those kinds of ideas (regexes and EBNF equations) into code more-or-less directly in your head. In short, it was just about perfect for someone who wanted to whip up a nice little compiler for a foogol on a late-1980s DOS box without paying for or finding any tools beyond Turbo Pascal.

It's entirely possible to use the techniques in the book to, say, whip up a compiler in Ruby that turns an inconvenient data format into JSON. The real utility of doing this by hand in 2010 is, however, debatable.


I think a lot of dynamic language implementations suck because their authors haven't bothered to study Lisp implementations. The widely used implementations of Common Lisp and Scheme don't suck at all[0]. Of course, some implementors of other dynamic languages do seem to have studied Lisp implementations and it shows in the results. Lua doesn't suck at all.

[0] Ok, the multi-platform support of some of the CL implementations could be better.


I'd say they suck because the authors cared about specific issues other than making the implementation not suck (which I guess means fast performance?).

But how does Lua bear Lisp's inheritance in its implementation? IIRC Lua does not use pointer tagging, which is a typical Lispism; it uses tables for everything, has no lists, and uses a different approach for implementing closures.

Are you thinking of LuaJIT(2) ?


Lua's designers have acknowledged a lot of Scheme influence, especially from 4.0 onward, but I think it's more about the semantics than the implementation.

Lua uses a type tag + union rather than tagged pointers because it's not strictly portable, and they've placed a high priority on strict ANSI C compliance (with the sole exception of dynamic library loading). Also, Lua tables recognize when they're being used as arrays and use an array internally.


notwithstanding the validity of your message, you are objecting to something that you did not quote.

If modern languages are more than a compiler, that does not deny the ability of the book to cover how to write a compiler.


FWIW, a large chunk of the dragon book is about lexers and parsers, but from an automata theory perspective, where rather than creating a compiler, you're creating a compiler compiler, like lex for DFAs creating token streams from character streams, and yacc for PDAs creating trees from token streams.

But such detailed info isn't necessary even to write a compiler front end. It's good that the treatment exists, and that it's in the dragon book, but it's not what you need, even for parsing.


Perhaps you are implying the Dragon Book is not needed by people who will use tools like lex and yacc to generate their parsers.

While that is true, the Dragon Book gives real insight into how a compiler really works and how to code one from scratch. If someone is only interested in getting a very basic compiler up and running quickly and not interested in understanding how to really make one, a simple tutorial for yacc should suffice.


If you're really making a compiler, it's unlikely you'll use a parser generator. Most production compilers use hand-written parsers for a variety of reasons: speed (e.g. folding more logical passes over the AST into the AST construction in the parser), semantic resolution of grammatical ambiguities, flexibility for compatibility (e.g. toggling different syntax rules dynamically), etc.
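To make the hand-written approach concrete, here is a minimal recursive-descent parser in Python; the grammar and all names are invented for illustration, not taken from any production compiler:

```python
# Minimal hand-written recursive-descent parser for arithmetic
# expressions. Invented grammar:
#   expr   -> term (('+'|'-') term)*
#   term   -> factor (('*'|'/') factor)*
#   factor -> NUMBER | '(' expr ')'
import re

def tokenize(src):
    # One regex pass; a production lexer would also track positions
    # for error reporting.
    return re.findall(r'\d+|[()+\-*/]', src)

class Parser:
    def __init__(self, tokens):
        self.tokens = tokens
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def eat(self, tok=None):
        t = self.peek()
        if t is None or (tok is not None and t != tok):
            raise SyntaxError(f'expected {tok!r}, got {t!r}')
        self.pos += 1
        return t

    def expr(self):
        value = self.term()
        while self.peek() in ('+', '-'):
            op = self.eat()
            rhs = self.term()
            value = value + rhs if op == '+' else value - rhs
        return value

    def term(self):
        value = self.factor()
        while self.peek() in ('*', '/'):
            op = self.eat()
            rhs = self.factor()
            value = value * rhs if op == '*' else value / rhs
        return value

    def factor(self):
        if self.peek() == '(':
            self.eat('(')
            value = self.expr()
            self.eat(')')
            return value
        return int(self.eat())

def evaluate(src):
    return Parser(tokenize(src)).expr()
```

Because each grammar rule is an ordinary function, folding semantic actions into parsing (here, direct evaluation) or toggling syntax rules dynamically is just a code change, which is the flexibility argument above.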


My comment wasn't about the article's point (which I thought was valid), but about one idea that keeps getting rehashed ... i.e. that if you want to learn about real compilers, you should read the Dragon Book.

> If modern languages are more than a compiler, that does not deny the ability of the book to cover how to write a compiler.

True, but the best way IMHO to learn about compilers is to start more lightly and then to read lots of source-code.


I notice that Language Implementation Patterns was written by the author of ANTLR. I've had terrible experiences trying to use ANTLR in practice, which kind of puts me off buying the book. Do the examples in the book use ANTLR? Also, is there much point in getting the book if I already know about lexer and parser generators?


"Do examples in the book use ANTLR?"

Yes. The book is "saturated" with ANTLR. I wouldn't buy it if you don't like ANTLR. Also (purely imho), it spends too much time on the front end (lexing/parsing) and not enough on the rest. I have yet to see good books (other than DCPL or L.I.S.P) which use a simple concrete syntax (s-expressions, say) and focus on the rest of compiler building.


Thanks. I've got a pretty good understanding of lexing & parsing, so I would have gotten it for the other parts, but I won't bother.


> I've had terrible experiences trying to use ANTLR in practice, which kind of puts me off buying the book.

Yes, but it explains the parsing techniques used by ANTLR, and for learning stuff it's quite OK. It uses ANTLR as a teaching tool, but you don't have to keep using it afterwards.

For writing simple parsers you don't even need to read a book ... just use a parser combinator based on PEGs, like http://wiki.github.com/sirthias/parboiled/ and you'll be fine.
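Parboiled itself is a Java/Scala library, but the parser-combinator idea it is built on fits in a few lines; this is a toy PEG-style sketch in Python (all names invented) showing literals, sequencing, ordered choice, and repetition:

```python
# Toy PEG-style parser combinators. Each parser is a function from
# text to (value, remaining_text) on success, or None on failure.
import re

def lit(s):
    def parse(text):
        return (s, text[len(s):]) if text.startswith(s) else None
    return parse

def regex(pattern):
    def parse(text):
        m = re.match(pattern, text)  # anchored at the start of text
        return (m.group(0), text[m.end():]) if m else None
    return parse

def seq(*parsers):
    def parse(text):
        values = []
        for p in parsers:
            r = p(text)
            if r is None:
                return None
            value, text = r
            values.append(value)
        return (values, text)
    return parse

def choice(*parsers):
    # PEG ordered choice: the first alternative that matches wins.
    def parse(text):
        for p in parsers:
            r = p(text)
            if r is not None:
                return r
        return None
    return parse

def many(p):
    def parse(text):
        values = []
        while True:
            r = p(text)
            if r is None:
                return (values, text)
            value, text = r
            values.append(value)
    return parse

# Example grammar: a comma-separated list of integers or "nil".
atom = choice(regex(r'\d+'), lit('nil'))
item_list = seq(atom, many(seq(lit(','), atom)))
```

Real PEG tools add memoization (packrat parsing) and error reporting on top, but the ordered-choice semantics shown here is the core difference from CFG-based generators.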

If you want a general purpose language however, even if you use a tool like Parboiled, you'll need to know how its guts work. It is also a good idea for a compiler to be self-hosting, because it's more portable that way and the compiler's code itself is a good test for your language ... so you'll need to know how to implement a parser by hand.

Of course, after the parser is done you'll need to have an AST, and maybe you'll want to optimize it ... this book is thin on such details, but it's better to start somewhere and books like the Red Dragon are quite heavy and can put you off the target before writing a single line of code.


After using JAVACC for a couple of years I switched to ANTLR. I found it slightly harder to learn but easier to use than JAVACC.


This makes two JWZ references I've handed out today, but this is pretty relevant: http://www.jwz.org/doc/gc.html

Memorable quote:

"It's a common belief that garbage collection means inferior performance, because everyone who has gotten into programming in the last decade regards manual storage management as a fact of life, and totally discounts the effort and performance impact of doing everything by hand.

In a large application, a good garbage collector is more efficient than malloc/free. "
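For readers who have never looked inside one, the tracing core of a collector is small; here is a toy mark-and-sweep pass in Python over an invented object graph (real collectors differ enormously in how they find roots, represent objects, and reclaim memory):

```python
# Toy mark-and-sweep: mark everything reachable from the roots,
# then sweep away whatever the heap holds that was never marked.
class Obj:
    def __init__(self, name, refs=()):
        self.name = name
        self.refs = list(refs)   # outgoing references to other Objs

def mark(roots):
    # Iterative graph traversal; returns ids of reachable objects.
    marked = set()
    stack = list(roots)
    while stack:
        obj = stack.pop()
        if id(obj) not in marked:
            marked.add(id(obj))
            stack.extend(obj.refs)
    return marked

def sweep(heap, marked):
    # Keep only the reachable objects; the rest are "collected".
    return [obj for obj in heap if id(obj) in marked]
```

The hard parts a real GC has to solve, precision (knowing which machine words are pointers) and pause times, live outside this sketch, which is exactly the gap in the literature the parent comment complains about.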


Its merits notwithstanding, I suspect the Dragon Book was included at least partially as a dig at Jeff Atwood.

(Disclaimer: I haven't read it myself, and don't plan to. My reading stack is close to overflowing as it is.)


Couldn't disagree more about TAOCP. I haven't finished any of them, but I think they recommend themselves poorly as references and are brilliant front-to-back readers. Every time I make a couple hours for them, no matter where in which book I start, I learn something new. TAOCP is in that sense more like the Bible than the dictionary.

That the examples are in "assembly language" is a red herring, since the machine the book targets is deliberately simplified. The "assembly" is just a notation. The idea, which seems to be to make the examples as concrete as possible, is beginner-friendly --- as anyone who's been tripped up by mistakes and ambiguities in older Sedgewick books can attest.


Grab the fasciscle (sp?) 1 of Volume 4; the bitmanipulation stuff reads like poetry.



fascicle


"I consider it amazing that some people do go cover to cover in my books. In most cases I know that people are going to pick and choose the parts that they like. But they know that if they dig further then they'll get something that has only one subset of jargon describing it instead of all different kinds of notations and terminology -- if I didn't write the books it would be much harder for people to find stuff out. That's what turns me on."

--Knuth, in Coders at Work


Like I said, it's like the Bible. Most people don't cover-to-cover the Bible. But the Bible also isn't the "encyclopedia of Catholicism". People have favorite passages. There are books in it people do read end to end. There are places, like Psalms, where you open to an arbitrary page and read either direction.

TAOCP is like that.


I agree. At first glance, their length makes them appear to be reference works, but when you look at his writing style and the exercises at the end of each section, it seems quite clear that he intended otherwise.

That being said, I haven't finished the series.


"I've read all of these books myself, so I have no difficulty believing that many moderately competent programmers have read them as well."

What a strange thing to say.

I've read all of the books in the "Claim To Have Read" list and only a couple in the "Have Actually Read" list. So based on my experience I have no problem believing that programmers at least as dumb as me have read the same.


Another data point. Of the "actually read" list I'm at 6 of 10. Of the "claim to have read", I claim 3 of 5. So an identical portion of each group.

Another opinion. If you haven't read the Go4 patterns book, you don't deserve to call yourself a real software engineer -- and it's an eminently readable book, too. And if you haven't read "The C++ Programming Language", you should not be behind the wheel of a C++ compiler. I haven't read Knuth nor the Algorithms book, but studied the topic extensively; you really need this kind of foundation to be a full-fledged software engineer.


> If you haven't read the Go4 patterns book, you don't deserve to call yourself a real software engineer

What do you find in this book that (1) you are unlikely to find elsewhere, and (2) is critically important to being a decent software engineer? The way I see it, we don't even need to know OO to be a good programmer. (By the way, is there a difference between "programmer" and "software engineer"?)

Disclaimer: my current opinion about OO is that it mostly sucks. Alas, I don't know if it is because I know too little or too much of it.


> my current opinion about OO is that it mostly sucks. Alas, I don't know if it is because I know too little or too much of it

Unfortunately OOP won, in the face of all alternatives, so whether you like it or not, you're going to need it.

As to why OOP has won ... as with imperative programming in general, it's easier to wrap your head around it without much theoretical background. I have trouble seeing a 10-year-old learning category theory to be able to read/write files in Haskell (contrary to popular belief, you do need lots of knowledge when you want to combine monads).

It's all about polymorphism, which enables composability / reuse / decoupling.

In OOP polymorphism is natural. In Haskell, the only static / functional language where polymorphism is done right, the learning curve is quite high.

Languages from the ML family are very suitable for symbolic processing (theorem proving, compilers), but OOP is versatile and can be used efficiently on a whole range of problems ... including writing compilers ... http://tinlizzie.org/ometa/

You might have been burnt by the static OOP languages; trouble is, OOP mixes with static typing like oil and water ... take a look at Smalltalk or at CLOS sometime. CLOS is even more capable, as it supports things like multiple dispatch.


> You might have been burnt by the static OOP languages…

I have. This is crushing. And the fact that they won't die any time soon makes me feel worse. One of my colleagues even said to me with a straight face that "serious" programming couldn't be done but in C++ (if only he had omitted the "but").

I think the problem with Haskell isn't its learning curve. It's where you have to start from: scratch. Someone who knows C, Java and Python won't be able to use much of their knowledge to learn Haskell. On the other hand, Haskell could be taught in a first programming course. (Like OCaml was in my case.)

> including writing compilers ... http://tinlizzie.org/ometa/

Is this still OO?? The core language is based on pattern-matching! The way I see it, the OO part has been pushed to the periphery. If I do parsing in OMeta, I doubt I could claim I did it in an OO way.


What do you find in this book that (1) you are unlikely to find elsewhere, and (2) which are critically important to be a decent software engineer?

I see two important benefits from Patterns:

1. As a (partial) replacement for experience. I've been doing this for 20+ years, and I'd encountered most of it at some point before reading the book, so it's not essential in this respect. But someone just starting out could assimilate directly a chunk of what I had to figure out on my own. And even for someone with more experience, it standardizes the details of the pattern, leading to greater consistency in the software.

2. To provide a common vocabulary amongst developers. When discussing a design with someone who's read the book, I can say "I think we can solve that by using the strategy pattern", and they'll know what I mean. Without this common vocabulary we'd spend a lot more time explaining, and at greater risk of being misunderstood.
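For anyone who hasn't met that vocabulary yet, the strategy pattern mentioned above is small enough to sketch; a minimal Python version with invented class names:

```python
# Strategy pattern: the varying algorithm is pulled out behind a
# common interface, so callers choose behavior by passing an object.
class Compressor:
    def compress(self, data: bytes) -> bytes:
        raise NotImplementedError

class NullCompressor(Compressor):
    def compress(self, data):
        return data  # identity "compression"

class RleCompressor(Compressor):
    def compress(self, data):
        # Trivial run-length encoding: (count, byte) pairs. For
        # illustration only, not a real codec.
        out = bytearray()
        i = 0
        while i < len(data):
            j = i
            while j < len(data) and data[j] == data[i] and j - i < 255:
                j += 1
            out += bytes([j - i, data[i]])
            i = j
        return bytes(out)

class Archiver:
    def __init__(self, strategy: Compressor):
        self.strategy = strategy  # the interchangeable part

    def store(self, data: bytes) -> bytes:
        return self.strategy.compress(data)
```

The point of the shared name: saying "use the strategy pattern here" communicates this whole shape at once, instead of re-explaining it.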

is there a difference between "programmer" and "software engineer"?

This is just my philosophy, but I think that computer programming in itself is a simple activity, you could teach 'most anybody to write a program -- and that's the reason that so much software sucks. When done correctly it's an engineering discipline, including analysis, modeling, planning, documenting, and a certain degree of programming. But the actual implementation of code is a minority of the job.

The way I see it, we don't even need to know OO to be a good programmer. ... my current opinion about OO is that it mostly sucks

Given my definition above, I'd have to agree that you don't need to know or do OO in order to be a programmer. But if you aspire to the fuller job of software engineering, it would be foolish to exclude such an important tool from your repertoire.

Regardless of its actual merits, it's objectively true that a huge portion (I'd think even a majority) of development tools (including platforms, languages, IDEs, frameworks, libraries, etc.) are geared toward OO development. Eschewing those means that you're forgoing much of the foundational stuff that our predecessors have built for us (standing on the shoulders of giants and all).

I'll grant that the industry in the '90s may have been a bit manic about OO. Since then we've learned that the paradigm has weaknesses and indeed flaws. But we do generally know what those are, and have found ways to work around them. We understand now that C++ is (insanely) complex and rigid, and modern OO manages to retain most of the benefits while shedding those problems. The contemporary dynamic languages build on a foundation of OO development while avoiding many of its pitfalls (at the low level at least). And what I think of as the cutting edge, the dynamic languages, seem to have found a way to deliver their benefit while generally coexisting with the OO paradigm. Which, of course, ties back to what I said earlier about keeping your toolbox full for whatever can best solve a problem.


OK, that makes sense. Thank you.

Now, there is still something that bothers me. OO is obviously very important. However, network effects look like they play a huge part. I am a perfectionist. As such, I wouldn't like to settle for a local maximum. For example I feel that IDE aside, functional programming with an advanced type system is better than OO for most purposes (ML and Haskell fit this pet paradigm of mine).

The problem is that I fail to see how OO could be important by itself. (Like I fail to see how Windows could survive GNU/Linux in a world where all programs and drivers ship on both.) As anecdotal evidence, I can't solve a quite universal problem: the representation of optional data. With inductive types (also called sum types, or algebraic data types), this is easy:

    -- Type definition
    data Option a = Nothing
                  | Something a
    
    -- Example of use
    case computationThatMayFail of
      Nothing     -> "I failed"
      Something x -> "I succeeded. Result: " ++ show x
I tried, even asked, to do this in an OO way. No luck so far. So, until I find an acceptable solution, I will doubt OO is best for, well, nearly all purposes. (Note that by "not best", I do not mean "bad". I just mean we can do better.)
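For what it's worth, the usual OO transliteration replaces the `case` with dynamic dispatch: two subclasses of a common interface, each carrying its branch of the match. A Python sketch of that encoding (class and function names invented; this isn't a claim that it beats the Haskell version):

```python
# Option as two subclasses of a common interface: the pattern match
# becomes a polymorphic method call dispatched on self.
class Option:
    def describe(self):
        raise NotImplementedError

class Nothing(Option):
    def describe(self):
        return "I failed"

class Something(Option):
    def __init__(self, value):
        self.value = value

    def describe(self):
        return "I succeeded. Result: " + repr(self.value)

def computation_that_may_fail(n):
    # Hypothetical example computation: divide 100 by n.
    return Something(100 // n) if n != 0 else Nothing()
```

The match is now spread across classes instead of gathered in one expression, which is the expression-problem trade-off: this encoding makes adding new variants easy and adding new operations hard, while sum types are the reverse.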


It's clear that you're trying to solve the problem in a functional way, and not in the way that an alternate paradigm would lend itself to. Indeed, elsewhere you admit

Someone who knows C, Java and Python won't be able to use much of their knowledge to learn Haskell. On the other hand, Haskell could be taught in a first programming course. (Like OCaml was in my case.)

I think you're falling prey to the reverse of this problem: your mind is stuck in functional-land, and you need to switch your mode of thinking for success in an OO setting.

The approach to this is clunky in C++, Java, or C#. The normal approach is for the "parent" object to contain a collection of its optional attributes. One would insert new attributes into this collection, and request their values from there later. Another approach you'll see, depending on the needs, is to use generics or templates to package the attribute with sidecar information that indicates whether it's actually "there", as with C# nullable types.

In newer dynamic languages like python (which is an evolution of the OO paradigm), though, this is simplicity itself. Indeed, it's at the heart of dynamic programming, and is what I referred to in my previous post when I criticized the rigidity of C++. In python, one isn't bound by the static definitions of an interface. If you want to add an additional attribute to your object, well, what are you waiting for? Just go and add that attribute to it. Later on, use duck typing to handle its presence or absence as necessary.
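A minimal sketch of that dynamic style in Python (names invented):

```python
# Optional attributes via dynamic assignment: no class change needed,
# and absence is handled at the use site with a defaulted lookup.
class Node:
    def __init__(self, name):
        self.name = name

def describe(node):
    # Duck typing: any object with .name works here; .color is optional
    # and simply may or may not have been attached to this instance.
    color = getattr(node, "color", None)
    return f"{node.name} ({color})" if color is not None else node.name
```

Attaching the attribute later is just `node.color = "red"`; `describe` then picks it up, while untouched instances fall back to the bare name.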


> your mind is stuck in functional-land

Most probably. But there's a reason for that: I don't like mutable state[1]. And I'm not sure you can try to avoid mutable state and still do OO.

[1]: http://www.loup-vaillant.fr/articles/assignment


It depends on your definition of OO, because there isn't "a" definition of OO. Typeclasses in Haskell aren't classes, as I well know, but on the other hand it is true that they do fill in for some of the roles that interfaces do in OO languages. Fundamentally, what is an object? A collection of data and associated operations as an atomic unit. Typeclasses do fit that role. Another way to do "OO" in Haskell is:

    data Animal = Animal { name :: String,
                           is_awake :: Time -> Bool,
                           lives_in :: Environment -> Bool }
and so on, with the "methods" actually being bound at the time of creation of the Animal. See: http://lukepalmer.wordpress.com/2010/01/24/haskell-antipatte...

Other OO things could be built as well in a perfectly functional way, picking and choosing the bits of the definition of OO you want. What you can't do is choose exactly what C++ or Java gives you, and what you don't get is a Blessed Object Orientation Technique like you do with those languages. But you can program OO just as you can in C. (Only better in most ways. Note you don't get a BOOT in C either, but there are nevertheless OO C programs.)

Mutability is merely one dimension of OO, a term that is so flexible it basically means nothing without further qualification. OO has its place even in a functional program sometimes, as that link shows.


My definition of OO is roughly the one given here: http://www.info.ucl.ac.be/~pvr/paradigms.html Meaning, doing OO is basically using closures and mutable state (let's say objects are a form of closure). I don't like mutable state, therefore I don't like OO as defined above.

We can also define OO as the use of inheritance. The problem with inheritance is that it spurs the violation of decoupling. More often than not, derived classes are tightly coupled to their base class. I don't like that. This paper also suggests that inheritance is not very good, because it would increase the error rate by 6: http://www.leshatton.org/Documents/OO_IS698.pdf (Note that I don't take this paper as proof that OO is bad in general. Rather, it strongly suggests that C++ without templates or the STL, used in an OO fashion, is worse than plain C.)

Now, if you forbid (or severely limit) both mutable state and inheritance, I really don't see what is left of OO. We could see your `Animal` data type and the OCaml module system as forms of OO, but at this point "OO" would mean anything (and therefore be meaningless).

(Note: when I say "I don't like X", I mean I will avoid X as long as the resulting solution isn't demonstrably simpler.)


> And I'm not sure you can try to avoid mutable state and still do OO.

Check out Erlang's parametrized modules, a form of functional OO.

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.58....


If you haven't read the Go4 patterns book, you don't deserve to call yourself a real software engineer

I read it, hated it, and would never recommend it for a few reasons, but mostly because I already knew Common Lisp when I read it (though I was writing Java full-time in my day job at the time) and so each time I read a new pattern I would think about how it was usually just an ugly hack to work around the poor design of a language like Java.

You might argue that a "real software engineer" will have to use crappy languages at some point and so should know the hacks to compensate. I'd argue that knowledge of a bunch of different languages makes those hacks obvious anyway - so I'd tell people to just spend time learning lots of languages instead of reading the patterns book.


The problem with GoF (I haven't read it cover-to-cover, but have browsed it for inspiration) is that it is more or less an encyclopedia. It lists the discrete patterns and gives them names, as if they'd come down the mountain on stone tablets.

I found a book like Head First Design Patterns to be much more bottom-up. It hammers home the idea that discrete design patterns really are just the organic side-effects of applying concepts (meta-patterns?) like "composition over inheritance", "don't repeat yourself", etc., to recurring problems in computer science.


Sorry, I just can't help but observe the irony in something called head first being described as bottom-up.


Regarding Go4, Norvig suggests, and I agree, that patterns in that sense are a sign of problems in the language you are using.


This is an old post, and I've never quite agreed with it.

For one thing, I don't know how you can lump things like Head First Design Patterns or whatever the TDD one is in with things like Code Complete or Programming Pearls.

For another thing, I have read at least parts of all of the books that apparently programmers don't really read, and judging from the books on colleagues' desks (not the bookshelves) so have a lot of other people I've worked with. The same is certainly not true of the "books programmers do read" list.


I have never claimed I read "The C++ Programming Language" ;-)

After learning OOP with Smalltalk, it seemed a step in the wrong direction.

C is fine though. And TCPL is a fine book too.

I also never managed to finish "Code Complete".

To be fair, I also had exposure to Cincom's Mantis. That further spoiled me. To the point I still want a 3278 for my living room.

The books you end up reading are consequences of the steps you take in your career.


I don't understand why Mantis isn't better known in the programming/CS world. Then again, I'm also a former Cincomer.


I never heard of it, why not write about it? Anyway, if this[0] page is related to your mantis, obscurity is surely connected to the usage of a <marquee> tag ;)

[0] http://appdev.cincom.com/


That's a truly ancient web page!


I have a hard time believing many people have read Code Complete cover to cover, unless perhaps they're older and read it long ago, when that was a better use of one's time. It's so dry that I can't get through more than a couple of chapters without falling asleep.


Your comment illustrates a problem with the original submission: It's idiosyncratic. It says more about the writer than the subject.

The various books are really different, people's reading strategies are different, people's tastes are different. People who have been in the field longer tend to have read more books. Also, whether or not you have "finished" a book tends to be irrelevant in nonfiction. If you can't get through Code Complete in one sitting, try reading it in bits and pieces - it shows signs of having been designed for that. If you haven't "finished" TAoCP... you and Knuth have something in common.

(For the record, I have read all of Code Complete at one time or another, but I've barely touched TAoCP. Which is not a problem. It is good to have things to look forward to. As someone once said: what's the point of having a collection of nothing but things you have already read?)


I'm older, it was a long time ago, and I read it cover to cover. The weird thing is I really liked it when I read it, but I'm not sure I could finish it today.


A month or two during the daily 20 minute commute on the train will do it


You can safely add "Gödel, Escher, Bach" to that list.


The author's point, summarized, is "Don't recommend books to new programmers that you haven't read yourself". My recommendation is to go read those books that you recommend. Even if you read them after you recommend them.

Did you ever see that stack overflow post which says in dozens of ways that you can't fully take apart (parse) HTML with regular expressions? If not, you should. It is informative and entertaining.

Do you think that those who attempt to parse HTML with regular expressions would even have tried it had they read and understood either of the dragon books? I don't. Once you put something like parsing and code generation in your personal toolbox, you'd be surprised how many of the tasks before you become a lot less intimidating.

I agree with Thomas about TAOCP. I'll give you a particular example. Volume 2, "Seminumerical algorithms" has a description of how to do long division. It turns out that we needed to do that when shipping a C compiler for 8086/80286 targets that didn't have an '87. And there is a reference to it in Coders at Work about that particular rare case that turns up only seldom.

Have you read the chapter on random number generation? How he made up this convoluted procedure to use one number to go to a page in a reference book to pick another number, and eventually ended up with a cycle of numbers that was disastrously small? It is fun.

And having an original edition and the recently issued boxed set, it is fun to see some of the problems change from HM 50 to something else. For example, Fermat's Last Theorem.

The lesson from that is repeated in people writing and using crypto. Just ask Tom.

While there are downsides to a university education, as often pointed out here on Hacker News, the experience takes you, if you are willing, to places that you wouldn't have gone on your own.

Similarly, the problem in front of you might not automatically make you run to get the dragon book or the others on the list, perhaps a friend can twist your arm to get you to read it.

You don't always know what you need ahead of time. There is a common word for that here, I think.


It doesn't matter if you don't read dense programming books straight through or do all the exercises, as long as you are using them. If you make almost any effort to flip through looking for something new to chew on, you're usually rewarded pretty quickly with the listed selections, and that's why they're good books. The practice aspect is a good pursuit as well, but not intrinsic to the value of the book.

Edit: And I should add - that's why they're still valuable even in the Internet era. The "pure compilation of awesome content" aspect is a compelling reason to have them.


TC++PL is written to be read cover to cover, with the exception of Part III (The Standard Library), and even in that part, each chapter is written to be read beginning to end. C++ is a complicated and dangerous language. The complexity and length of TC++PL are dictated by the complexity of understanding required to be a decent C++ programmer. Avoiding C++ entirely is a great idea if you can manage it, but programming in C++ without investing the time to read TC++PL (or another similarly detailed book) is a really bad idea.


Or, _Projecting_.


Whatever happened to the New Turing Omnibus?


There's one on my bookshelf, unread of course.


Of these, I have read, at least a significant portion of:

    - C Programming Language (2nd Edition)
    - Refactoring: Improving the Design of Existing Code
    - The Mythical Man-Month
    - Programming Pearls
The last two were due to one undergraduate CS professor assigning them.

If you feel you've almost gotten OO, but not quite, then Refactoring is a great book. That one lit up the "ah ha" lightbulb for me.


I agree with the comments about GEB - I've had a battered copy for over 20 years and I still haven't made it all the way to the end.


I read Design Patterns (GoF) and actually tried to write a text editor the way it describes it and found that it made selection really really difficult, so I had to find my own solution.

But I really grew fond of the Visitor Pattern, because you can do a lot with very little code.

Computer Graphics is the best school for Object Oriented thinking, I've found.


That's funny, I tried using the Visitor Pattern very heavily for a while, and ended up deciding it wasn't worth the trouble. Using it made me feel very virtuous and proper for a few years, but I've rarely if ever used it in the decade since, and never regretted not using it.


I didn't quite "get" the visitor pattern until taking the time to learn CLOS and multiple dispatch.
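The connection, roughly, is that the classic visitor is double dispatch done by hand: `accept` dispatches on the node's class, and the `visit_*` call then dispatches on the visitor's class. A minimal Python sketch over an invented two-node expression tree:

```python
# Visitor pattern as manual double dispatch over a tiny expression tree.
class Num:
    def __init__(self, value):
        self.value = value

    def accept(self, visitor):
        return visitor.visit_num(self)   # first dispatch: node class

class Add:
    def __init__(self, left, right):
        self.left, self.right = left, right

    def accept(self, visitor):
        return visitor.visit_add(self)

class Evaluator:
    def visit_num(self, node):           # second dispatch: visitor class
        return node.value

    def visit_add(self, node):
        return node.left.accept(self) + node.right.accept(self)

class Printer:
    def visit_num(self, node):
        return str(node.value)

    def visit_add(self, node):
        return f"({node.left.accept(self)} + {node.right.accept(self)})"
```

With CLOS-style multiple dispatch the `accept` plumbing disappears, because a generic function can specialize on both the node and the visitor at once.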


I don't understand this at all. To me, this reads as "Here are books that have been recommended to me that I haven't read. Nobody reads these." He implicitly says this himself, when he states "I've read all of these books myself, so I have no difficulty believing that many moderately competent programmers have read them as well."


No, that's not the same thing at all.


Posted more than a year ago: http://news.ycombinator.com/item?id=397996

(The URLs look identical... I wonder how this made it through the dupe filter?)


Is it just me or has HN gotten really cynical today?


This is remarkably annoying.



