
“Everyone is special” is a snarky, derogatory comment we don’t need here.

It’s not snarky. It’s literally the argument people are making: I am special, my use case is exceptional, therefore I need to use the special tool, even if you don’t need to.

Technically Apple was never bankrupt. It certainly came within a few months of it, but never reached that point.

Some similarly titled (but less tidily presented) posts that have appeared on HN in the past, none of which generated any discussion:

* https://martynassubonis.substack.com/p/5-empirical-laws-of-s...

* https://newsletter.manager.dev/p/the-unwritten-laws-of-softw..., which linked to:

* https://newsletter.manager.dev/p/the-13-software-engineering...


Let me get this straight: he accused grieving families of lying about their children's deaths, and put a target on their backs, to sell stuff...and he didn't believe his own bullshit?

Yeah, especially because he contradicted himself several times, it doesn't seem like he believed it. He's really committed to trolling.

Labeling someone a "troll" for that level of evil feels enabling to me.

Saying that he actually believes his grift can make him seem kinda innocent, but I'm not going to assume that's the intent if someone says it.

Saying two different things at two different times is not prima facie evidence of lying. For instance: changing your mind is not lying.

Again, I find it ironic that you're using the same conspiratorial line of thinking that Jones himself does: e.g. "The CDC has changed their mask guidance so they must be lying about it"


It was convenient timing for him to call the shootings "100% real" the moment he ended up in court for it.

People often do realize they were wrong about their prior convictions when they're sitting in a courtroom a decade later, faced with a life-changing situation and forced to hear overwhelming evidence of how they were wrong.

If this guy were some kind of savant, he wouldn't have been in that situation at all. There is no evidence that he is some evil genius. He has no higher education and is a diagnosed narcissist.


> he was a performative comedian who complained about big government projects

That rather downplays the destructive impact he had on the families of Sandy Hook victims.


Indeed, that is the precise reason for this situation, right?

Exactly. If nothing else, writing a solver in Python or Java might take dozens or hundreds of lines more code than Prolog, so simply knowing what tools are best for what jobs helps you be a better developer, whether you're using a compiler or an agent.
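
For a concrete illustration (a toy sketch of my own, not from the comment above): below is a minimal brute-force Python solver for the classic SEND + MORE = MONEY cryptarithm. In Python you spell out the search and every constraint by hand; in Prolog with a finite-domain constraint library (e.g. SWI-Prolog's clpfd) the same puzzle is roughly a dozen clauses, and the engine does the searching for you.

    # Brute-force cryptarithm solver: assign distinct digits to the letters
    # so that SEND + MORE = MONEY. Purely illustrative.
    from itertools import permutations

    def solve_send_more_money():
        letters = "SENDMORY"  # the eight distinct letters in the puzzle
        for digits in permutations(range(10), len(letters)):
            a = dict(zip(letters, digits))
            if a["S"] == 0 or a["M"] == 0:  # no leading zeros
                continue
            send = a["S"] * 1000 + a["E"] * 100 + a["N"] * 10 + a["D"]
            more = a["M"] * 1000 + a["O"] * 100 + a["R"] * 10 + a["E"]
            money = (a["M"] * 10000 + a["O"] * 1000 + a["N"] * 100
                     + a["E"] * 10 + a["Y"])
            if send + more == money:
                return a
        return None

    print(solve_send_more_money())  # {'S': 9, 'E': 5, 'N': 6, 'D': 7, ...}

Even this toy takes ~20 lines by hand; scale the puzzle up and the gap against a declarative version mostly grows.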

Reminds me a bit of Bruce Tate’s approach in Seven Languages in Seven Weeks, which is where I first encountered Erlang.

I think from a historical perspective, describing COBOL and Fortran as part of the ALGOL family is a stretch, but I suppose it’s a good reminder that all history is reductive.


There's also (besides Tate's sequel, Seven More Languages in Seven Weeks) Dmitry Zinoviev's 7 Obscure Languages in Seven Weeks. I liked it a lot, even if it hurt my feelings a bit to have my beloved Forth be one of the obscure languages (the others were APL, SNOBOL, Occam, Simula, Starset, and M4) -- I'm old and nerdy, but hadn't even heard of Occam and Starset.

Perhaps better, IMHO, is "Strange Code" by Ronald Kneusel (No Starch Press, 2022) [0], which I found more didactic and better developed. Please note that I'm quite a fan of this author's other books [1].

[0] https://nostarch.com/strange-code

[1] https://nostarch.com/search/Kneusel


I also think going back farther is a stretch. The first assembly languages were imperative, but what made Algol, Fortran, and COBOL interesting were functions and other features that allowed complex programming. Algol has the most descendants, but Fortran was the first high-level imperative programming language.

Does anybody know whether Fortran is older or younger than Algol? From Wikipedia, it looks like they were both developed around 1957. Was there any overlap in the design?

Algol was published in 1958, and FORTRAN in 1957. I think it's fair to say they were developed concurrently.

Rather, COBOL is a living fossil? And today's Fortran is the FORTRAN family with horizontal gene transfer from the Algol lineage of programming languages.

Both languages still have their standards updated; the latest revision in both cases was in 2023.

Fortran is one of the reasons OpenCL lost to CUDA, and now even AMD and Intel finally have Fortran support on their own stacks, not Khronos-based.

https://developer.nvidia.com/cuda-fortran

Whereas COBOL even has cloud and microservices offerings.

https://www.rocketsoftware.com/en-us/products/cobol/visual-c...

https://aws.amazon.com/mainframe/

Incredible how being monetarily relevant keeps some languages going.

Also note how the beloved UNIX and C are from 1971-73, only about 10 years younger than COBOL.


> Fortran is one of the reasons OpenCL lost to CUDA, and now even AMD and Intel finally have Fortran support on their own stacks, not Khronos-based.

FWIW, I loved using CUDA Fortran. I think the ease of use of array variables maps very well onto the way CUDA kernels work. It feels much more natural than in C++.


Can COBOL be called a living fossil?

I mean, programming languages do not live, and they do not "die" per se either; usage may just go down toward zero.

COBOL would then be close to extinction. I think it only has a few niche places in the USA and perhaps a few other areas, and I don't think it will survive for many more decades to come, whereas I think C or Python will still be around in, say, three decades.

> family with horizontal gene transfer

Well, you refer here to biology; viruses are the most famous for horizontal gene transfer, and transposons and plasmids do it too. But I don't think these terms apply to software that well. Code does not magically "transfer" and work; often you have to adjust it to a particular architecture - that was one key reason why C became so dominant.

In biology you basically just have DNA: 4 states per slot in dsDNA (A, T, C, G), if we ignore RNA viruses (which all need a cell for their own propagation anyway; RNA is in many ways just like DNA - see reverse transcriptase, also found in viruses). So you don't have to translate much at all. Some organisms use different codons (mitochondrial DNA has a few different codon tables), but by and large what works in organism A works in organism B too, if you, say, just wish to create a protein. That's why "genetic engineering" is so simple in principle: it just works if you put genes into different organisms (again, some details may differ, but e.g. UUU would code for phenylalanine in most organisms; UUU is the mRNA variant of course, in dsDNA it would be TTT).

Also, there is little to no "planning" when horizontal gene transfer happens, whereas porting requires thinking by a human. I don't feel that analogy works well at all.


I don't know; COBOL pops up in banking all the time and Fortran lives on in a lot of specialised engineering applications.

Is there anything to be gained from rewriting it in Rust? Five years ago, would there have been anything to be gained from rewriting it in Haskell? Ten years ago, would there have been anything to be gained from rewriting it in Ruby? Fifteen years ago, would there have been anything to be gained from rewriting it in Clojure? Twenty years ago, would there have been anything to be gained from rewriting it in Java?

And so on, all the way back.


Analogies are inherently reductive; by deeply scrutinizing the differences you do not prove the analogy is useless.

I'd rather ban comments complaining about it. Judge the content by its merits.

When I worked for Basho, aphyr was highly respected by some of the smartest people I’d ever worked with. Definitely no slouch.

