Inertia, mostly. All the common tools are built around C, all the existing code is written in C, and the jobs to maintain and extend that code are in C.
Like most "Why does everyone only ever use language X to do Y" questions, considerations of actual technical merit / best technical fit are a very minor factor in the equation.
> but can you show me any demo where a FP language outperforms C for parallel tasks?
Modulo rewriting your C so that it does exactly what the FP language (runtime) does, such examples are not difficult to come by. Stalin, Stalingrad and Richard Fateman would be good starting points.
Regarding rewriting your C to match what the FP language compiles into (or executes on the fly), the point is that it's very hard to write such _correct_ C unaided. So you use the higher-level constructs to write that machine code / assembly / C.
I'm not sure this is a worthwhile comparison, because in FP the idiom of parallel programming is the same as the idiom of sequential programming. It's completely different (and more difficult to get correct) in imperative languages, which means the imperative version will be harder to maintain and harder to evolve even though it might be slightly faster in certain situations.
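For instance (a minimal Haskell sketch, assuming the standard "parallel" package and a made-up expensive function), going from the sequential version to the parallel one is just swapping map for parMap, whereas the equivalent C means hand-managing threads, work splitting, and shared state:

    import Control.Parallel.Strategies (parMap, rdeepseq)

    -- A deliberately expensive pure function to give the cores something to chew on.
    fib :: Integer -> Integer
    fib n
      | n < 2     = n
      | otherwise = fib (n - 1) + fib (n - 2)

    -- Sequential: an ordinary map.
    sumFibsSeq :: [Integer] -> Integer
    sumFibsSeq = sum . map fib

    -- Parallel: same shape, map swapped for parMap.
    sumFibsPar :: [Integer] -> Integer
    sumFibsPar = sum . parMap rdeepseq fib

    main :: IO ()
    main = print (sumFibsPar [25 .. 32])

Compile with -threaded and run with +RTS -N to spread the work over the cores; the meaning of the program doesn't change either way.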
Well, this discussion started when someone suggested the speed increases fueled by Moore's law held FP back. This is what I've been questioning. If it's not about speed, then what does it have to do with Moore's law?
"but can you show me any demo where a FP language outperforms C for parallel tasks?"
That's not a fair comparison. C proponents should ask, "Is there a demo where (alternative) outperforms C for (application domain) when C has (alternative)'s safety checks enabled, too?" Memory and dataflow safety in C, using checks instead of exotic stuff, easily adds 20-500% overhead depending on the application, whereas these safer or more functional languages are usually within 200%, with some Common Lisps and Schemes matching or outperforming C in a few benchmarks. Concurrent Haskell wouldn't just be giving me a performance reduction: I'd be getting more capabilities in development and reliability in exchange for a performance sacrifice, one that goes down every year for at least one implementation of each language or style.
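To make the checks-vs-no-checks point concrete, here's a rough Haskell sketch (assuming the "vector" package): both functions compute the same sum, but the first pays for a bounds check on every access while the second skips it, C-style, and the gap you measure between them is exactly the sort of overhead C avoids by not checking at all:

    import qualified Data.Vector.Unboxed as V

    -- Checked access: (!) verifies the index is in bounds on every call.
    sumChecked :: V.Vector Int -> Int
    sumChecked v = sum [ v V.! i | i <- [0 .. V.length v - 1] ]

    -- Unchecked access: unsafeIndex skips the test entirely.
    sumUnchecked :: V.Vector Int -> Int
    sumUnchecked v = sum [ V.unsafeIndex v i | i <- [0 .. V.length v - 1] ]

    main :: IO ()
    main = do
      let v = V.enumFromN 1 1000000 :: V.Vector Int
      print (sumChecked v, sumUnchecked v)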
The question didn't even mention safety; maybe FP people are tying their own hands by insisting on safety features that users don't value. And if you're comparing N-way parallel FP code against serial C, maybe 500% overhead still leaves room for FP to win.
It's implied in the correctness argument. Is a speedup over C fine if it gives incorrect results? In that case, I'll demolish C myself by optimizing away any parts of the user program that don't map to the fastest instructions and the registers. :P
The elephant in the room for provable algorithms is the limited representation of real numbers in any imaginable hardware (hello, irrational numbers!). You simply can't make complete proofs for most math-intensive problems. Combinatorics should be fine, though.
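A tiny Haskell example of what I mean: the hardware can't represent 0.1 or 0.2 exactly, so even the purest, most rigorously typed code inherits the same rounding behaviour C has:

    -- Double can't represent 0.1 or 0.2 exactly, so the "obvious" identity fails.
    main :: IO ()
    main = do
      print (0.1 + 0.2 == (0.3 :: Double))  -- prints False
      print (0.1 + 0.2 :: Double)           -- prints 0.30000000000000004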
Huh? I might have missed your point, but I'm talking about how C makes no effort toward correctness of results. The languages that do, to varying degrees, sacrifice some performance to get it. Can't speak on the proofs as it's not my specialty. As for real numbers, that's what analog computers do, with some models for making them more general purpose.
I was just going one step further, based on my experience proving the correctness of algorithms in functional languages. You probably had in mind the usual array safety checks and the control-flow stuff for which aspects were invented; I raised the bar a bit by pointing out that there are some fundamental issues which even the strictest functional purists can't overcome, so the overall talk about code safety is kinda funny ;-) You will still crash the Space Shuttle if you are running Haskell, just in a different fashion than with C.
I've rarely seen what you're describing, although I know plenty of work remains to be done. So, could you give examples of code injections or mission-ending errors at the code level that Haskell toolchains or Isabelle/HOL couldn't handle?
Huh? My impression is that functional has been growing massively over the last 5-10 years. OCaml has been around for 20 years but it's only recently that you hear people talking about it. The amount of Haskell or Scala hitting HN goes up and up - and that's matched by the increase in jobs in these languages.
Lisp (first formulated in 1959) is still alive and well: clojure.org
New languages (such as Javascript, Ruby, Python, even newer iterations of Java) become popular to the extent that they marry Algol-like (i.e. C-like) syntax with functional features derived from 1960s-era Lisp in an appealing fashion.