What does "as fundamental as function composition" mean here? The article just appears to describe transducers. Transducers compose via ("reverse") function composition, sure, but that just means that they are functions of a type... and considerably less fundamental than functions since they're a specialization of that class of things.
They're cool and all—I've characterized them (partially, perhaps) as CPS-encoded flatmaps over lists and also as stateful left fold transformers, the latter being much more general—but they're more like a rich ecosystem for list transformations than any kind of fundamental abstraction.
This is typical of Clojure, as a sort of anti-Haskell. It starts with empirical structures invented by folks without PL theory training, and then wriggles to find an explanation in terms of standard theory. You can see this here on HN when Rich talks to Haskell folks and people translate his writing into standard language.
I would love a comparison in the form of Haskell, converted to Lisp syntax, and wired up to JVM.
"fundamental as function composition" is obviously subjective, but I think the point is that it seems you can make a fairly strong argument that in lisps, there is a question of what does the form (map f) eval to, and transducers put forth an answer that seems to be logically correct and superior to any alternatives. Hence, it seems fundamental.
I think "as fundamental as function composition" means that the relationship between a function like map and a transducer (map f) is fundamental in some sense that resembles function composition. But it isn't composition: it's something that "wrangles" the "reduce kernel" out of the combination of map and f: when we map something using f, what function instead of f will do the same mapping under reduce? That function inherits logic from map, and from f, but it's not a composition of the two.
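A small Python sketch of that idea (the names are mine, not Clojure's actual API): (map f) evaluates to a transducer, a function that rewrites a reducing function so that reducing with the result performs the mapping — it inherits logic from map and from f without being a plain composition of the two.

```python
from functools import reduce

# Sketch (hypothetical names, not Clojure's API) of what (map f)
# evaluates to: a transducer, i.e. a function from reducer to reducer.
def map_t(f):
    def transducer(rf):
        # The new reducer feeds f's output to the old reducer; this is
        # the "reduce kernel" wrangled out of map and f combined.
        return lambda acc, x: rf(acc, f(x))
    return transducer

def conj(acc, x):          # a list-building reducing function
    return acc + [x]

result = reduce(map_t(lambda x: x + 1)(conj), [1, 2, 3], [])
# → [2, 3, 4], the same as mapping (x + 1) over the list
```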
Function composition usually operates on non-collection inputs. If you compose f and g, that usually means f takes some input and then g takes as input the output of f.

Transducers are a generalization of composition in the sense that they take a collection of the inputs f would accept.
So, in that sense, function composition could be a specific instance of transducers with number of inputs = 1.
Re: the characterisation, the only thing I'd add is that they have explicit start and end calls. Start resets the state, end can clean up resources if necessary.
I tried thinking about what output structure a generalised transducer would form, and came to the conclusion it was a foldable monoid i.e. pretty list-like.
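Those explicit start and end calls can be sketched in Python by modeling a reducing function as three callables — init, step, and complete (the naming is my assumption, loosely mirroring Clojure's three arities). A stateful transducer like partitioning needs the end call to flush leftover state:

```python
# A reducing "function" modeled as three callables: init (start),
# step, and complete (end). Names are illustrative, not Clojure's.
def conj_rf():
    return {
        "init":     lambda: [],
        "step":     lambda acc, x: acc + [x],
        "complete": lambda acc: acc,
    }

def partition_t(n):
    """Stateful transducer: group items into lists of n; the complete
    call flushes any leftover partial group (the 'end' cleanup)."""
    def transducer(rf):
        buf = []
        def step(acc, x):
            buf.append(x)
            if len(buf) == n:
                acc = rf["step"](acc, list(buf))
                buf.clear()
            return acc
        def complete(acc):
            if buf:                          # flush the remainder at end
                acc = rf["step"](acc, list(buf))
            return rf["complete"](acc)
        return {"init": rf["init"], "step": step, "complete": complete}
    return transducer

def transduce(xform, rf, xs):
    rf2 = xform(rf)
    acc = rf2["init"]()                      # start: fresh state
    for x in xs:
        acc = rf2["step"](acc, x)
    return rf2["complete"](acc)              # end: cleanup/flush

grouped = transduce(partition_t(2), conj_rf(), [1, 2, 3, 4, 5])
# → [[1, 2], [3, 4], [5]]
```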
Title shows author doesn't understand programming language design. No, a small set of higher-order functions[1] is not as fundamental as the concept of higher-order functions in the first place.
Author here, thanks for the comment -- since that seemed to have been lost, I'm using "fundamental" because transducers let you describe your logic so that it can be applied over sequences and non-sequences in a way that you cannot by just applying composed logic functions through existing clojure machinery.
I think Rich's innovation here is extremely clever and quite subtle, but it's pretty Clojure-specific, both in terms of the problem it solves and the way it solves it.
The overall pattern of map-reduce is possible in most languages, certainly any language that has closures, and the idea of composing multiple reducing functions together comes up in many of them. Without closures, you could do something similar by walking an accumulating object through multiple loops, but that is awkward and ugly. In a language like Javascript you could certainly do something like this through the clever use of multiple closures, composing them together. But transducers formalize this as an idiomatic part of Clojure.
What's clever is recognizing the "kernel" of a function like map. Hickey answers the question: if we take this list-processing/decimating function like "count-if" or "map" or whatever, and express it with reduce, what is that reducer function which we will need? And how is that reducer derived from or related to the function that goes into the list processing function, like the mapping function in map? He then makes those functions behave as methods (when called with one less argument) which give you that reducer.
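The "one less argument" trick can be sketched in Python (hypothetical names; Clojure's real map is of course not defined this way): with a collection the function behaves like ordinary map, and without one it returns the transducer that rewrites a reducer to do the same mapping under reduce.

```python
from functools import reduce

def my_map(f, xs=None):
    # Normal list-processing arity: just map f over the collection.
    if xs is not None:
        return [f(x) for x in xs]
    # "Method" arity with one less argument: return the transducer,
    # i.e. the recipe for the reducer that maps under reduce.
    return lambda rf: (lambda acc, x: rf(acc, f(x)))

def conj(acc, x):
    return acc + [x]

inc = lambda x: x + 1
# The derived reducer does under reduce what my_map does directly:
assert my_map(inc, [1, 2, 3]) == reduce(my_map(inc)(conj), [1, 2, 3], [])
```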
Now what if this is done to reduce itself? What should a reduced arity (reduce fun) return? I think that is a noop: nothing needs to be done to fun to make it reduce under reduce, so (reduce fun) -> fun. I.e. to transduce the reducer "fun", an arbitrary function, into a function X such that (reduce X list) will do the job of (reduce fun list), we simply equate X with fun.
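Following that reasoning (a sketch of the argument, not actual Clojure), the transducer arity of reduce would be the identity, since an arbitrary reducer already reduces under reduce:

```python
from functools import reduce
import operator

# Hypothetical reduce_t: the reduced-arity (reduce fun) as a noop.
def reduce_t(fun):
    return fun          # (reduce fun) -> fun: nothing to rewrite

# Reducing with the "transduced" reducer is the same as reducing with fun:
assert reduce(reduce_t(operator.add), [1, 2, 3]) == reduce(operator.add, [1, 2, 3])
```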
What is the difference (or what is gained) between transducing with a reducer and mapping (filtering, etc.) a list and then reducing it? Is it a clojure-specific optimisation?
It avoids intermediate structure and enables more sources and sinks to work. For instance, if you build a reducer, you're basically adjoining a "reducible" and a "reduction function" and then transforming the reducer by transforming that reduction function.
This already avoids the creation of intermediate structure since you just keep transforming the reduction function, but you have this sort of useless "reducible" thing attached. Mostly, the trouble is that you were afflicted by the kingdom of nouns---you don't really need a structure to think of first class objects.
Instead, you can just consider the various ways of transforming reduction functions. They all compose as (reverse) functions (you can see them as a category) and you can take your resultant "transducer" and apply it to a source and sink structure to map out of the source and into the sink.
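A Python sketch of that point (my names, not a real library): transducers compose by ordinary function composition — right-to-left on reducers, which makes the data flow left-to-right — and one composed pipeline can be applied to different sinks with no intermediate structure.

```python
from functools import reduce

def map_t(f):
    return lambda rf: (lambda acc, x: rf(acc, f(x)))

def filter_t(pred):
    return lambda rf: (lambda acc, x: rf(acc, x) if pred(x) else acc)

def comp(*fns):                     # right-to-left, like Clojure's comp
    def composed(x):
        for fn in reversed(fns):
            x = fn(x)
        return x
    return composed

# Right-to-left on reducers means left-to-right on data:
# double each element first, then keep the results greater than 4.
xform = comp(map_t(lambda x: x * 2), filter_t(lambda x: x > 4))

# One transducer, two sinks, no intermediate lists:
into_list = reduce(xform(lambda acc, x: acc + [x]), [1, 2, 3, 4], [])   # → [6, 8]
into_sum  = reduce(xform(lambda acc, x: acc + x),   [1, 2, 3, 4], 0)    # → 14
```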
In addition to map transducers (which are 1-to-1) and filter transducers (which are many-to-1), flatMap transducers (which are 1-to-many) should be fundamental.
flatMap is the fundamental transducer (of a particular model). To be clear, the function a -> [b] subsumes mapping and filtering---if [b] is always a single element then a -> [b] is a map, if [b] is always either 0-or-1 elements then a -> [b] is a filter (possibly adjoined to a map).
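A Python sketch of that subsumption claim (hypothetical names): a mapcat/flatMap transducer takes f : a -> [b] and feeds each output element to the reducer, so map falls out when f always returns one element and filter when it returns zero or one.

```python
from functools import reduce

def mapcat_t(f):
    def transducer(rf):
        def step(acc, x):
            for y in f(x):        # 1-to-many: feed each output element to rf
                acc = rf(acc, y)
            return acc
        return step
    return transducer

def conj(acc, x):
    return acc + [x]

# map as mapcat: f always returns exactly one element
mapped = reduce(mapcat_t(lambda x: [x + 1])(conj), [1, 2], [])
# → [2, 3]

# filter as mapcat: f returns zero-or-one elements
filtered = reduce(mapcat_t(lambda x: [x] if x % 2 else [])(conj), [1, 2, 3], [])
# → [1, 3]
```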
[1]: http://www.reddit.com/r/haskell/comments/2cv6l4/clojures_tra...
Clojure's are more fundamental than other languages'?