top | item 865679

Parallelism /= Concurrency

36 points | maxtilford | 16 years ago | ghcmutterings.wordpress.com | reply

15 comments

[+] alain94040|16 years ago|reply
This article has a few flaws.

First, I disagree with the definition of parallelism. So the whole argument becomes moot to begin with.

Second, the author conveniently ignores that languages with side-effects can also be automatically parallelized by their compiler, which achieves the same effect as the parallelism described for functional languages. That makes the point of the article moot again.

In conclusion (to me at least), this article adds to the confusion rather than offering the clarity it was aiming for.

[+] mbrubeck|16 years ago|reply
You may "disagree" with this definition, but it's one that is commonly used by computer scientists. So it's a definition that is useful to know if you plan on doing or studying any research in this field. For example, see section 4.3 of this paper by Peter van Roy (author of the well-known Concepts, Techniques, and Models of Computer Programming):

http://lambda-the-ultimate.org/node/3465

Here's van Roy's definition:

"Concurrency should not be confused with parallelism. Concurrency is a language concept and parallelism is a hardware concept. Two parts are parallel if they execute simultaneously on multiple processors. Concurrency and parallelism are orthogonal: it is possible to run concurrent programs on a single processor (using preemptive scheduling and time slices) and to run sequential programs on multiple processors (by parallelizing the calculations)."

And no, languages with side-effects cannot be automatically parallelized without changing the semantics and introducing observable nondeterminism - not nearly to the same extent as pure languages. The examples I've seen essentially work for side-effect-free subsets, which kind of misses the point. Your C compiler may be able to vectorize a loop of arithmetic operations, but as soon as your code includes an external function call the compiler has no way of knowing whether that function performs I/O or modifies global variables, so it cannot safely make parallel calls to that function.
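
A toy sketch of the underlying issue (my own example, not from the article): iterations that are pure and independent give the same answer in any evaluation order, so a compiler may run them in parallel; iterations that read and write shared state carry a dependency from one step to the next, so reordering them changes the observable result.

```python
# Pure loop body: no shared state, so any evaluation order is safe.
def pure(x):
    return x * x

# Loop-carried state: each step depends on the previous one.
def impure_step(acc, x):
    return acc * 2 + x

# Independent iterations: forward and reverse orders agree.
squares_fwd = [pure(x) for x in range(5)]
squares_rev = [pure(x) for x in reversed(range(5))][::-1]
print(squares_fwd == squares_rev)  # True: safe to parallelize

# Dependent iterations: order changes the result, so the loop
# cannot be split across processors without changing semantics.
acc_fwd = 0
for x in range(5):
    acc_fwd = impure_step(acc_fwd, x)
acc_rev = 0
for x in reversed(range(5)):
    acc_rev = impure_step(acc_rev, x)
print(acc_fwd == acc_rev)  # False
```

An opaque external call is the worst case of the second kind: the compiler can't even see whether a dependency exists, so it must assume one does.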

[+] artsrc|16 years ago|reply
What is your information on the performance of code produced by automatically parallel compilers?

I understand that attempts to automatically parallelize standard programming languages achieve some, but not much, improvement. The numbers I recall are less than a 2x speedup. I think I heard this in an interview with Joe Armstrong on Software Engineering Radio, but I am not sure; or perhaps it was a presentation by Guy Steele on parallel algorithms. Joe's message was that parallelism needs to be explicit. One of Guy's was that you need to use different algorithms that allow parallel rather than serial processing.

I think we agree that we expect to get more than (2 * one core speed) on our 10,000 core CPUs of the future.
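
Amdahl's law makes the "less than 2x" ceiling concrete (my own worked example, not something from the interview): with a serial fraction s, the speedup on n cores is 1 / (s + (1 - s)/n), so a program that is half serial never beats 2x no matter how many cores you add — which is why Steele's point about changing the algorithm, not just the core count, matters.

```python
# Amdahl's law: speedup on n cores for a program whose serial
# (non-parallelizable) fraction is s.
def amdahl(serial_fraction, cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# A half-serial program is capped near 2x even on 10,000 cores.
print(round(amdahl(0.5, 10_000), 3))   # ~2.0
# Shrinking the serial fraction (a different algorithm) moves the cap.
print(round(amdahl(0.05, 10_000), 1))  # ~20.0
```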

[+] akeefer|16 years ago|reply
Whatever terminology you'd like to choose, I think it's still a useful distinction to be aware of and to consider. To rephrase the article... there are two reasons to execute two things in parallel. 1) To reduce latency by executing two operations at the same time instead of in serial, as in processing web server requests. 2) To increase the speed of execution of a single operation by utilizing multiple processor cores at once.

There's a ton of overlap as far as techniques (threads, monitors, locks, actors, and so on), but those two high-level goals have very different optimal strategies, and different languages and programming paradigms have different strengths and weaknesses relative to those two. Concurrency is generally more of a high-level consequence of the core algorithm you're using; parallelism is a low-level implementation detail around how a given function or process completes. Concurrency isn't generally going to be retrofitted automatically by the compiler; parallelism can be deduced statically or at runtime, or signaled explicitly by the programmer.
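
Goal 1 above — hiding latency rather than adding compute — can be sketched in a few lines (my own illustration; `handle_request` is a hypothetical stand-in for an I/O-bound server request): ten requests that each wait 0.1s finish in roughly 0.1s total when overlapped, instead of roughly 1s in serial, even though no extra CPU work gets done.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Latency hiding: each "request" just waits, standing in for I/O.
def handle_request(i):
    time.sleep(0.1)
    return i

start = time.time()
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(handle_request, range(10)))
elapsed = time.time() - start

# Overlapped waits: total time is near one request's latency, not ten.
print(len(results), elapsed < 0.5)
```

Goal 2 would instead split one CPU-bound computation across cores, where threads waiting on each other buy you nothing and the parallel decomposition of the algorithm is what matters.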

[+] maxtilford|16 years ago|reply
I'm curious. How would you define parallelism?