
I know you weren't aiming for maximum accuracy in your definitions, but I think this is worth bringing up.

Parallelism and Concurrency are not mutually exclusive terms. All parallelism is concurrent.

Concurrency is whenever more than one thing is happening at the same time conceptually. Everything from iterators to threads.

Parallelism is whenever more than one thing is happening at the same time physically. From a user-space perspective, this means threads or a coprocessor (like a GPU).
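
To make the distinction concrete, here's a minimal Python sketch (the function names and numbers are made up for illustration): the coroutines overlap conceptually on a single thread, while the process pool runs work on separate cores physically.

    # Illustrative sketch: concurrency vs. parallelism in Python.
    import asyncio
    from multiprocessing import Pool

    async def fetch(name):
        # Concurrency: both coroutines are "in flight" at once, but a single
        # thread interleaves them at await points -- nothing runs simultaneously.
        await asyncio.sleep(0.1)
        return name

    def crunch(n):
        # Parallelism: separate processes can execute on separate cores
        # at literally the same time.
        return sum(i * i for i in range(n))

    async def main():
        return await asyncio.gather(fetch("a"), fetch("b"))

    if __name__ == "__main__":
        print(asyncio.run(main()))                           # concurrent, not parallel
        with Pool(2) as pool:
            print(pool.map(crunch, [1_000_000, 1_000_000]))  # parallel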




Yes. The intuition is a good one: parallelism is a deliberate way of designing a system so that its parts run simultaneously (physically), while concurrency is a property of a system whose parts may or may not run simultaneously (conceptually 'overlapping', whether physically or logically).

There are many, many ways to think about this difference, and it's fun (and beneficial!) to do so every once in a while.

> From a user-space perspective, this means threads

Interestingly enough, before SMP and multicore, thread-level "parallelism" was actually time-sharing concurrency in disguise. That remains true for most threading libraries in languages with a GIL, and any time you have more threads than physical cores. In fact, the primary purpose of threads was (and still mostly is) to get a semblance of concurrency... Even the Erlang implementation was single-threaded until around 2008.
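
A rough Python sketch of the GIL point (timings are illustrative and will vary by machine): CPU-bound work split across two CPython threads still time-shares behind the GIL, so it finishes in about the same wall-clock time as running it sequentially.

    # Rough sketch of GIL'd threads time-sharing one core (timings illustrative).
    import time
    from threading import Thread

    def spin(n):
        while n:
            n -= 1

    N = 20_000_000

    start = time.perf_counter()
    spin(N); spin(N)                              # sequential
    print("sequential :", time.perf_counter() - start)

    start = time.perf_counter()
    t1 = Thread(target=spin, args=(N,))
    t2 = Thread(target=spin, args=(N,))
    t1.start(); t2.start(); t1.join(); t2.join()  # "parallel" threads
    print("two threads:", time.perf_counter() - start)  # roughly the same, or worse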


Well, obviously threads aren't always running at the same time, since each CPU can do only one thing at a time (or a fixed number of things, if we're counting hardware threads) and you can have more threads than CPUs.

What matters is that they might run at the same time. Also, from the perspective of the programmer, there's no difference between two threads running at the same time or by time sharing, since preemptive multitasking is non-deterministic and has the same implications as true parallelism.
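
As a sketch of that last point (illustrative code, not from any real project): even if the threads only ever time-share a single core, an unsynchronized read-modify-write is already a race, so you have to reason about it as though it were truly parallel.

    # Sketch of a race that exists with or without true parallelism.
    from threading import Thread

    counter = 0

    def bump(n):
        global counter
        for _ in range(n):
            counter += 1      # read-modify-write: not atomic

    threads = [Thread(target=bump, args=(100_000,)) for _ in range(4)]
    for t in threads: t.start()
    for t in threads: t.join()
    print(counter)  # may be less than 400000, depending on interpreter and scheduling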


I think we're talking about different things? You seem to be referring to parallelism in the literal, general sense of the word, while I'm referring to computational parallelism. Basically, if Amdahl's Law doesn't apply, then you're looking at concurrency. So this statement

> from the perspective of the programmer, there's no difference between two threads running at the same time or by time sharing

is incorrect when you're talking about computational parallelism (as OP was), because you're not going to realize any speedups with time sharing. In that case, you're using threads as a concurrency mechanism -- not for parallelism.
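
To put numbers on it (the values below are made up for illustration), Amdahl's Law gives speedup = 1 / ((1 - p) + p / s), where p is the parallel fraction of the work and s is how many workers actually run simultaneously. With time sharing on a single core, s is effectively 1, so the speedup is exactly 1 no matter how parallelizable the work is.

    # Amdahl's Law sanity check (parameters are made up for illustration).
    def amdahl(p, s):
        return 1 / ((1 - p) + p / s)

    print(amdahl(p=0.9, s=4))   # ~3.08x with 4 truly parallel workers
    print(amdahl(p=0.9, s=1))   # 1.0x -- time-shared threads on a single core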



