Even if you forget about Clojure entirely for a second, Rich has the very rare gift of taking a complicated subject and making it easy for the audience to understand.
This is a double-edged sword. Forgive me a small anecdote: I once had a wonderful tutor who could explain any advanced concept in highly intuitive terms, and it just made sense... until I got out of the classroom, at which point I'd just have this feeling of "Hang on, whaaaaaa...?". The first time I attempted the exam in this particular subject, I failed miserably (rightly so). Having learned my lesson, I went back and studied the actual source material more thoroughly instead of mostly just listening to the probably-best-educator on the subject. That time I actually understood the material and passed with flying colours. (I still think that educator had a major role in me passing at all, but I digress.)
Take that for what you will, but please be aware that a "gift for simplification" sometimes is a double-edged sword and can leave the audience thinking that they've understood when, actually, they haven't.
I'm not saying that's the case here, just something to be wary of.
So transducers generalize Enumerable functions such as map, reduce, filter, flatMap (...).
Could someone please tell me if this is conceptually different from Ruby's Enumerable module, which only needs the class it's included in to implement `each` so anything can be enumerable? Or is it similar, just translated to the FP world?
1. Transducers are parallel under the covers. Since the expectation is that the predicate or mapping functions you pass into map, filter and reduce are pure (no variables are changed, no state is modified, just a calculation that depends only on its arguments), the parallelism is hidden away from you, but it's there. Ruby's Enumerable can't do parallelism.
2. When you compose transducers, there are no intermediate sequences generated. The simplest example is this:
In Ruby:
`[1, 2, 3].map {|x| x + 1}.map {|x| x + 2}` will generate an intermediate array [2, 3, 4] after the first map, and then will generate [4, 5, 6] when it has evaluated the whole expression.
In Clojure:
`(transduce (comp (map inc) (map #(+ 2 %))) conj [] [1 2 3])` will build a single fused mapping function that first increments and then adds 2 to its argument, and then will map once over the input using that fused function. (Note that `comp` of transducers applies left to right, so `inc` runs first.)
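That fusion can be sketched in Python. This is only an illustration, not Clojure's actual implementation; the names `mapping`, `compose` and `transduce` are my own.

```python
# A transducer is a function that transforms a reducing function.
# Composing two "map" transducers fuses the two steps, so the input
# is traversed once and no intermediate list is ever built.
from functools import reduce

def mapping(f):
    """Return a transducer that applies f to each input element."""
    def transducer(reducer):
        return lambda acc, x: reducer(acc, f(x))
    return transducer

def compose(t1, t2):
    # t1 wraps t2's reducer, so t1's transformation runs first.
    return lambda reducer: t1(t2(reducer))

def transduce(xform, reducer, init, coll):
    return reduce(xform(reducer), coll, init)

append = lambda acc, x: acc + [x]

# Increment first, then add 2 -- a single pass over [1, 2, 3].
xform = compose(mapping(lambda x: x + 1), mapping(lambda x: x + 2))
result = transduce(xform, append, [], [1, 2, 3])
# result == [4, 5, 6]
```

The key point is that `compose` merges the two per-element steps before any element is touched, which is why no intermediate sequence appears.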
Transducers are a different kind of thing, not just a generalized usage of enumerable functions.
Clojure transducers are isomorphic to functions of type `a -> [b]`, that is, functions which take one thing and return a list of other things. You can implement map and filter in this language:
tmap f a = [f a]
tfilter p a = if p a then [a] else []
You can also compose two such functions:
concat :: [[x]] -> [x]
concat [] = []
concat (x : xs) = x ++ concat xs
concatMap f list = concat (map f list)
compose lb_a lc_b = \a -> concatMap lc_b (lb_a a)
One reduction function (`foldr`) turns out to just be the identity function [a] -> [a], as I recall.
The interesting thing about Clojure transducers is that they have a strange continuation-passing form, such that this function `compose` that I wrote above for the type `a -> [b]` is, in Clojure, just ordinary function composition. That is, instead of `a -> [b]` we see `forall r. (b -> r -> r) -> (a -> r -> r)`, which composes with normal function composition.
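That reducer-transforming shape can be sketched in Python (again an illustration; `tmap` and `tfilter` are hypothetical names, and the step function here takes `(acc, x)` rather than the curried `x -> r -> r` form): each transducer takes a step function and returns a new step function, so plain function composition chains them.

```python
# Each transducer turns a step function (acc, b) -> acc
# into a new step function (acc, a) -> acc.
from functools import reduce

def tmap(f):
    return lambda step: lambda acc, x: step(acc, f(x))

def tfilter(p):
    # Skip elements that fail the predicate by returning acc unchanged.
    return lambda step: lambda acc, x: step(acc, x) if p(x) else acc

def comp(f, g):
    # Ordinary function composition is all we need to chain transducers.
    return lambda x: f(g(x))

append = lambda acc, x: acc + [x]

# Keep the evens, then double them.
xform = comp(tfilter(lambda x: x % 2 == 0), tmap(lambda x: 2 * x))
evens_doubled = reduce(xform(append), [1, 2, 3, 4], [])
# evens_doubled == [4, 8]
```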
What you're thinking of is more like ISeq in Clojure, or Foldable in Haskell. The more interesting generalisation is not between arrays, vectors and hash sets, but between data structures and event streams.
It's not that clear there will be one. Transducers have three steps: begin, during and end. In practice, the begin step is only used for reduce operations, and outside of reduce operations the end step can only be used for its side effects.
There are also definitional issues, e.g. can you have an async transducer?
jonahx | 11 years ago
"The reduce function is the base transformation; any other transformation can be expressed in terms of it (map, filter, etc)."
This seems so obvious in retrospect -- I can't believe I had never made that connection before.
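A minimal Python sketch of that claim (helper names are mine): both map and filter fall out of reduce.

```python
# map and filter expressed purely in terms of reduce:
# each is just reduce with a different step function.
from functools import reduce

def my_map(f, coll):
    # The step appends f(x) for every element.
    return reduce(lambda acc, x: acc + [f(x)], coll, [])

def my_filter(p, coll):
    # The step appends x only when the predicate holds.
    return reduce(lambda acc, x: acc + [x] if p(x) else acc, coll, [])

squares = my_map(lambda x: x * x, [1, 2, 3])       # [1, 4, 9]
odds = my_filter(lambda x: x % 2 == 1, [1, 2, 3])  # [1, 3]
```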
kbeaty | 11 years ago
(Shameless plug in the hope it may be helpful to someone)
[1]: http://simplectic.com/blog/2014/transducers-explained-1/
ultimape | 11 years ago
- It was referenced in this great post detailing the transducer notion with Clojure, Scala, and Haskell: http://blog.podsnap.com/ducers2.html
nickik | 11 years ago
Transducers overall are quite interesting; I am excited to see what other transducer contexts people come up with.