Programming

Clojure Exchange 2013: Tommy Hall on Concurrency versus Parallelism

One of the really interesting talks at Clojure Exchange 2013 was Tommy Hall’s, with the (not good) title “You came for the concurrency, right?”.

The talk had two main threads. The first was a review of Clojure’s state-handling, concurrency and parallel-processing features: useful for beginners but also a helpful recap for the more experienced.

The other was a discussion of what we mean by concurrency and parallelism, something I hadn’t really thought about before (although I was aware there was a difference). Tommy referenced Rob Pike’s talk “Concurrency is not Parallelism (it’s better)”.

In the talk Rob gives the following definitions:

Concurrency: programming as the composition of independently executing processes

Parallelism: programming as the simultaneous execution of (possibly related) computations

In his talk Tommy gives the future function as an example of indicating concurrency boundaries in a program, and the reducers library as an example of parallelism.
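
To make that concrete, here is a minimal sketch of my own (not code from the talk): future marks a boundary where a computation runs concurrently with the calling thread, while reducers’ fold may actually execute the work in parallel.

    (require '[clojure.core.reducers :as r])

    ;; Concurrency: `future` starts the computation on another thread and
    ;; returns immediately; deref (@) blocks until the value is ready.
    (def slow-sum (future (reduce + (range 100000000))))
    ;; ... the calling thread is free to do other work here ...
    (println @slow-sum)

    ;; Parallelism: `r/fold` may split the vector into chunks, reduce each
    ;; chunk on a fork/join pool and combine the partial results.
    (defn sum-of-squares [v]
      (r/fold + (r/map #(* % %) v)))

    (println (sum-of-squares (vec (range 1000000))))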

Rob’s presentation is worth reading in full, but his conclusion is that concurrency is not a guarantee of parallel execution, whereas achieving parallelism without concurrency is very hard or impossible. Tommy’s talk uses an Erlang example to make the same point.

Don’t fear the monoid

While discussing reducers Tommy finally explained something that I had struggled with before. A lot of type champions point at reducers and shout “Monoids!”, as if that were some kind of argument.

During his talk Tommy explains that the parallel combination function needs to be able to return an identity, so the fold always returns a value, and that because the grouping of the partial combinations is not fixed (unlike the strictly sequential evaluation of a regular reduce) it needs to be associative.

That makes sense. Turns out that those are also the properties of a monoid.
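
Here is a small sketch (mine, with my own example function, not anything from the talk) of those two requirements as reducers exercises them: fold calls the combining function with no arguments to get an identity for each chunk, and the function must be associative because the grouping of partial results varies. String concatenation is a handy illustration since it is associative but not commutative, which also anticipates Tommy’s clarification in the comments below.

    (require '[clojure.core.reducers :as r])

    ;; `+` already behaves like a monoid: (+) => 0 is the identity and
    ;; addition is associative.
    (r/fold + (vec (range 10)))        ;=> 45

    ;; A hypothetical combine function for strings: the zero-argument
    ;; arity supplies the identity "" and two-argument concatenation is
    ;; associative (though not commutative).
    (defn str-combine
      ([] "")
      ([a b] (str a b)))

    (r/fold str-combine str-combine (vec (map str (range 10))))
    ;=> "0123456789"  (element order is preserved; only grouping varies)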

Fans of type systems throw around terms like Monad, Dual and Monoid not to add understanding to a discussion but to use them as shibboleths. It was far more enlightening to see an example where the needs of a problem drive you towards a category of function with certain properties. If that category is common enough to deserve a shorthand name, fair enough, but the name itself is not magical, and knowing the various function category names is a feat of learning rather akin to memorising all those software pattern names from the Gang of Four’s book.


3 thoughts on “Clojure Exchange 2013: Tommy Hall on Concurrency versus Parallelism”

  1. Hey Rob,

    Thanks so much for the kind words and for taking the time to write it up. I recall a clarifying interjection from you during it, so cheers for the help at the time too.

    My goals for the talk were:
    * Refresher on reference types (which I think was poor and I would have been better shortening)
    * Give people a solid working definition of concurrency and parallelism and show our literature is a bit weak there (did OK)
    * Explain you don’t get parallelism without doing something (kind of got the idea across but wish I mentioned pmap and company, that slide got lost somehow in the panic of my laptop not working and going back to an older version)
    * Explain Reducers as a way of getting parallelism (and the idea reduce itself is no good for it)
    * Point people at people way smarter than me talking for a bit longer than I had about just one of the things I tried to cover:
    Guy Steele – foldl and foldr Considered Slightly Harmful http://vimeo.com/6624203 http://www.thattommyhall.com/clojurex2013/ICFPAugust2009Steele.pdf
    Rob Pike (think your link above is wrong?) http://www.thattommyhall.com/2013/10/27/async-csp-resources/ http://vimeo.com/49718712 http://www.youtube.com/watch?v=f6kdp27TYZs
    Tim Baldridge on core.async’s go macro http://www.youtube.com/watch?v=R3PZMIwXN_g
    * Explain CSP and why it’s awesome for concurrency (and sometimes gives you parallelism)

    Slides are at http://www.thattommyhall.com/clojurex2013

    A point of clarification – really it’s the grouping of the combine function that does not matter, not the order (that would be commutative).

    Cheers again,
    Tom

  2. Thanks for the corrections; presentation link and function name updated. I did check your blog for the kind of summary you just posted here before I posted mine, but you didn’t have a link to the slides! Man!
