
Category Theory Illustrated – Natural Transformations

217 points| boris_m | 5 months ago |abuseofnotation.github.io | reply

83 comments

[+] rck|5 months ago|reply
This is fun. But the bit at the beginning about philosophy is not correct. Parmenides did not believe in what we would call essences, but really did believe that nothing ever changes (along with his fellow Eleatic philosopher Zeno, of paradox fame). The idea that change is an illusion is pretty silly, and so Plato and especially Aristotle worked out what's wrong with that and proposed the idea of _forms_ in part to account for the nature of change. Aristotle extended Plato's idea and grounded it in material reality which we observe via the senses, and that's where the concept of essence really comes from - "essence" comes from the Latin "essentia", which was coined to deal with the tricky Greek οὐσία (ousia - "being") that Aristotle uses in his discussions of change.
[+] griffzhowl|5 months ago|reply
One way I've seen it presented is that the early Greek philosophers were grappling with how to reconcile two basic facts: some things stay the same (constancy or regularity), and some things change.

Heraclitus was before Parmenides and said that everything changes. Parmenides said that nothing changes, and then the atomists, most prominently Democritus, synthesised these two points of view by saying that there are atoms which don't change, and all apparent change is explained by the relative motions of the different basic atoms. Plato was influenced by all of these. But I would say the theory of forms accounts for constancy or regularity more than for change, no?

Btw, the central concept of Parmenides' philosophy is always translated as "Being", but I couldn't find the original Greek word. Isn't it "ousia"?

[+] gcr|5 months ago|reply
Anyone who likes this might also like Stefan Miller’s paper, “a simple category theoretical understanding of category theory diagrams”, appearing in SIGBOVIK 2014. See https://sigbovik.org/2014/proceedings.pdf (starts on PDF page 65, or page 57 if you go by margin page numbers)
[+] hamburgererror|5 months ago|reply
What's the thing with category theory? I see this topic discussed quite frequently here, but I don't get why people are so into it.
[+] tristramb|5 months ago|reply
Category theory is what you get when you take mappings instead of sets as the primitive objects of your universe. At first this might seem a perverse thing to do, as mappings seem more complex than sets, but that is only because mappings have traditionally been defined in terms of sets.

In set theory you can specify that two sets be equal and you can also specify that one set be an element of another.

In category theory you can specify that two mappings be equal and you can also specify that two mappings compose end to end to produce a third mapping.

Category theory can be used to express some requirements in a very concise way.
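The two primitives described above, equality of mappings and end-to-end composition, can be sketched in a few lines of Haskell (an illustrative example of mine, not from the comment):

```haskell
-- Two mappings whose types line up end to end.
double :: Int -> Int
double x = x * 2

addOne :: Int -> Int
addOne x = x + 1

-- Composition produces a third mapping; (.) applies the right function first.
doubleThenAddOne :: Int -> Int
doubleThenAddOne = addOne . double

main :: IO ()
main = do
  -- Equality of mappings is pointwise: the same output for every input.
  print (doubleThenAddOne 10)  -- 21
  -- Composition is associative: grouping does not matter.
  print (((addOne . double) . addOne) 3 == (addOne . (double . addOne)) 3)  -- True
```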

[+] shae|5 months ago|reply
At the 2018(?) ICFP, I sat between John Wiegley and Conal Elliott. They talked about expressing and solving a programming problem in category theory, and then mapping the solution into whatever programming language their employer was using. From what they said, they were having great success producing efficient and effective solutions by following this process.

I decided to look for other cases where this process worked.

I found several, but one off the top of my head is high-dimensional data analysis, where t-SNE was doing okay, and a group decided to start with CT and try to build something better, and produced UMAP, which is much better.

In short, this does work, and you can find much better solutions this way.

(random link https://stats.stackexchange.com/questions/402668/intuitive-e... )

[+] tikhonj|5 months ago|reply
Category theory gives us a nice, high-level set of conceptual tools to try to understand and generalize over things that are hard to connect otherwise. Some people find that useful directly, other people just enjoy it for its own sake, or even for aesthetic reasons. (I think all three are totally reasonable!)

At the same time, it's actually rather more accessible than most other areas of pure math—at least at the level that people talk about it online. Basic category theory can be hard to learn because it's so abstract but, unlike almost any other area of math from the 20th century onwards, it has almost no hard prerequisites. You can reasonably learn about categories, functors, natural transformations and so on without needing a graduate degree's worth of math courses first. You might not understand the most common examples mathematicians use to illustrate category theory ideas—but it's such a general framework that it isn't hard to find alternate examples from computer science or physics or whatever else you already know. In fact, I expect most of the articles that get talked about here do exactly that: illustrate category theory ideas with CS/programming examples that folks on HN find relevant and accessible.
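One such CS example (my own illustration, not from the comment): in Haskell, `safeHead` is a natural transformation from the list functor to the `Maybe` functor, and the naturality condition is something you can check directly:

```haskell
-- A natural transformation between the list functor and the Maybe functor:
-- one polymorphic function, defined uniformly for every element type a.
safeHead :: [a] -> Maybe a
safeHead []    = Nothing
safeHead (x:_) = Just x

main :: IO ()
main = do
  -- Naturality square: fmap f . safeHead == safeHead . fmap f
  print (fmap (+1) (safeHead [10, 20]))  -- Just 11
  print (safeHead (fmap (+1) [10, 20]))  -- Just 11
  print (safeHead ([] :: [Int]))         -- Nothing
```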

[+] T-R|5 months ago|reply
Abstract Algebra, looked at through the lens of Programming, is kind of "the study of good library interface design", because it describes different ways things can be "composable", like composing functions `A -> B` and `B -> C`, or operators like `A <> A -> A`, or nestable containers `C<C<T>> -> C<T>`, with laws clearly specifying how to ensure they don't break/break expectations for users, optimizers, etc. Ways where your output is in some sense the same as your input, so you can break down problems, and don't need to use different functions for each step.

Category Theory's approach of "don't do any introspection on the elements of the set" led it to focus on some structures that turned out to be particularly common and useful (functors, natural transformations, lenses, monads, etc.). Learning these is like learning about a new interface/protocol/API you can use/implement - it lets you write less code, use out-of-the-box tools, makes your code more general, and people can know how to use it without reading as much documentation.

Focusing on these also suggests a generally useful way to approach problems/structuring your code - rather than immediately introspecting your input and picking away at it, instead think about the structural patterns of the computation, and how you could model parts of it as transformations between different data structures/instances of well-known patterns.

As a down-to-earth example, if you need to schedule a bunch of work with some dependencies, rather than diving into hacking out a while-loop with a stack, instead model it as a DAG, decide on an order to traverse it (transform to a list), and define an `execute` function (fold/reduce). This means just importing a graph library (or just programming to an interface that the graph library implements) instead of spending your day debugging. People generally associate FP with recursion, but the preferred approach is to factor out the control flow entirely; CT suggests doing that by breaking it down into transformations between data structures/representations. It's hugely powerful, though you can also imagine that someone who's never seen a DAG might now be confused why you're importing a graph library in your code for running async jobs.
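That scheduling example can be sketched in Haskell with the standard containers library; the task names and dependencies here are made-up stand-ins of mine:

```haskell
import Data.Graph (graphFromEdges, topSort)

-- Hypothetical build tasks: (payload, key, keys of tasks this one depends on).
tasks :: [(String, String, [String])]
tasks =
  [ ("compile", "compile", ["fetch"])
  , ("fetch",   "fetch",   [])
  , ("test",    "test",    ["compile"])
  ]

-- Model the dependencies as a DAG, transform it to a list (topological
-- order, dependencies first), then "execute" it with a fold.
executionOrder :: [String]
executionOrder =
  reverse [name | v <- topSort graph, let (name, _, _) = fromVertex v]
  where
    (graph, fromVertex, _) = graphFromEdges tasks

main :: IO ()
main = putStrLn (foldl (\acc t -> acc ++ "run " ++ t ++ "; ") "" executionOrder)
```

The while-loop-with-a-stack logic disappears: the graph library handles traversal, and `execute` is an ordinary fold over a list.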

[+] jesuslop|5 months ago|reply
It's, of course, the theory behind monads, which since Eugenio Moggi have been used to model computational effects in pure functional languages. Effects such as state (the State monad), optional return types used in turn for error handling (the Maybe monad), read-only environments and logging (the Reader and Writer monads), and input/output (the IO monad), among others. Beyond effects, Wadler used monads for parsers (monadic parsing).
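A small illustration of the Maybe monad as an effect (the function names here are my own):

```haskell
-- The Maybe monad models computations that may fail, without exceptions.
safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing
safeDiv x y = Just (x `div` y)

-- do-notation sequences the steps; the first Nothing aborts the chain.
pipeline :: Int -> Maybe Int
pipeline n = do
  a <- safeDiv 100 n
  b <- safeDiv a 2
  return (b + 1)

main :: IO ()
main = do
  print (pipeline 5)  -- Just 11
  print (pipeline 0)  -- Nothing
```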

The Curry-Howard "isomorphism" (slogan: propositions are types, proofs are programs/functions) maps code to logic in a categorical way, first described in the book by Lambek and Scott, with uses in formal software verification.

Categories provide abstraction. You first distill how Haskell (or your other pet functional language) works using Hask, the category of Haskell types, and then you can apply your abstract distillate to other categories and obtain task-oriented, tailored computing concepts that enrich bare language capabilities, providing applications including 1) probabilistic programs and 2) automatic differentiation. Conal Elliott has very concrete work along these lines. When he speaks of CCCs (following Lambek) he alludes to cartesian closed categories, whose crucial property is having a type constructor for function spaces and higher-order functions. See his "Compiling to Categories" for a very concrete, hands-on feel. Another application he shows is hardware synthesis (baking your functional algorithm down to a netlist of logic gates, for automating the design of custom hardware accelerators).

In short, why categories? Computational effects, formal verification, and the equivalence of the simply-typed lambda calculus with cartesian closed categories, the lambda calculus being the backbone of functional programming language semantics.

[+] coderatlarge|5 months ago|reply
one of the things i took away about category theory is that it allows you to avoid repeating certain arguments which boil down to so-called “abstract nonsense” ie they have nothing to do with the specific objects you’re dealing with but rather are a consequence of very generic mapping relationships between them. maybe better versed people can give specifics.

as a very broad example there are multiple ways to define a “homology” (ex simplicial, singular, etc) functor associating certain groups to topological spaces as invariants. but the arguments needed to prove properties of the relationships between those groups can be derived from very general properties of the definitions and don’t need to be re-argued from the very fine definitions of each type of homology.

i think.

[+] chermi|5 months ago|reply
Pinging the https://planting.space/ people! I know some of them are on HN, or at least they were recruiting here in the past. I haven't updated my knowledge in a while, but my impression of the company was basically that they were going to make money using category theory (+ all math) in clever and useful ways. I think they turned toward AI a little, but they're at root a bunch of people who think category theory is useful. Hence, the ping!
[+] Twey|5 months ago|reply
Category theory is popular in computer science because, at a fundamental level, they're very compatible ways of seeing the world.

In computing, we think about:

- a set of states

- with transformations between them

- including a ‘do nothing’ transformation

- that can be composed associatively (a sequence of statements `{a; b;}; c` transforms the state in the same way as a sequence of statements `a; {b; c;}`)

- but only in certain ways: some states are unreachable from other states

This is exactly the sort of thing category theory studies, so there's a lot of cross-pollination between the disciplines. Computation defines interesting properties of certain categories like ‘computation’ or ‘polynomial efficiency’ that can help category theorists track down interesting beasts to study in category theory and other branches of mathematics that have their own relationships to category theory. Meanwhile, category theory can give suggestions to computer science both about what sort of things the states and transformations can mean and also what the consequences are of defining them in different ways, i.e. how we can capture more expressive power or efficiency without straying too far from the comfort of our ‘do this then do that’ mental model.

The latter is really helpful in computer science, especially in programming language or API design, because in general it's a really hard problem to say, given a particular set of basic building blocks, what properties they'll have when combined in all the possible ways. Results in category theory usually look like that: given a set of building blocks of a particular form, you will always be able to compose them in such a way that the result has a desired property; or, no matter how they're combined, the result will never have a particular undesired property.

As an aside, it's common in a certain computer science subculture (mostly the one that likes category theory) to talk about computing in the language of typed functional programming, but if you don't already have a deep understanding of how functional programming represents computation this can hide the forest behind the trees: when a functional programmer says ‘type’ or ‘typing context’ you can think about sets of potential (sub)states of the computer.
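The bullet points above can be made concrete in a short Haskell sketch (an illustration of mine, with a made-up machine state):

```haskell
-- A made-up machine state and some transformations between states.
data Machine = Machine { counter :: Int, halted :: Bool } deriving (Eq, Show)

increment, halt :: Machine -> Machine
increment m = m { counter = counter m + 1 }
halt m      = m { halted = True }

-- The 'do nothing' transformation is the identity morphism.
doNothing :: Machine -> Machine
doNothing = id

main :: IO ()
main = do
  let start = Machine 0 False
  -- Sequencing statements is composition, and composition is associative.
  print (((halt . increment) . increment) start
           == (halt . (increment . increment)) start)  -- True
  print ((halt . increment . increment) start)
```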

[+] siddboots|5 months ago|reply
It's just a good set of models to use to think about all sorts of different mathematical systems, kind of like a unified vocabulary. Beyond undergraduate level, category theory these days plays a huge role within many vast fields - e.g., algebraic geometry, algebraic topology, or representation theory.
[+] monarchwadia|5 months ago|reply
For me.. It's a very useful mental model for thinking about architecture & logic
[+] esafak|5 months ago|reply
A lot of us don't get it and want to know what we're missing :)
[+] jiggunjer|5 months ago|reply
Lets software designers use fancy words and ask for a raise.
[+] michaelcampbell|5 months ago|reply
Had to reduce the page to 67% to get out of "Fisher Price" font size, but otherwise quite interesting.
[+] measurablefunc|5 months ago|reply
The natural transformation α : F ⇒ G is not specified properly b/c when expressed in compositional form you also have to specify the subscript for the natural transformation & it is an equality instead of an isomorphism, i.e. if f : a → e then αₑ ∘ Ff = Gf ∘ αₐ. There are higher categories where what he has written down can make sense, but in the context of the current exposition it is not correct as written.
[+] mallowdram|5 months ago|reply
Isomorphism invariance applies to neural assemblies or syntax, not to mere symbols. The problem in math is it models. Brains do not model. Heraclitus was right if math never enters the picture to add its arbitrariness. "A man in the night kindles a light for himself when his sight is extinguished; living he is in contact with the dead when asleep, when awake he is in touch with the sleeper."
[+] sesm|5 months ago|reply
> In the course of this book, we learned that programming/computer science is the study of the category of types in programming languages.

This is a golden quote.

[+] ctenb|5 months ago|reply
It's also wrong, since computer science is traditionally mostly about computation, which has nothing to do with CT
[+] intalentive|5 months ago|reply
Interesting aside about the Vienna circle and isomorphism. I suspect that’s where Hayek got his idea that mind and representation are isomorphic, echoing Aristotle’s assertion in “On the Soul” / De Anima that the mind becomes the object of perception.
[+] YetAnotherNick|5 months ago|reply
It makes it so much more complicated than what is needed to understand natural transformations. A natural transformation is just a mapping between two functors. You can discover the laws yourself from this alone.
[+] ibobev|5 months ago|reply
The author is Jencel P.? I saved this book some time ago under the author name Boris Marinov. Is this the same person now writing under a different pen name?
[+] larodi|5 months ago|reply
It is the same. He seems to obscure himself deliberately.
[+] w10-1|5 months ago|reply
I like it when teachers (e.g., Grant Sanderson) are careful to explain when they are trying to convey an intuition to motivate and guide some complex math, because it orients you without tangling you in all the misunderstanding that would come from extending analogies or cross-cultural/discipline comparisons too far.

But when authors start slinging around Plato and Aristotle and especially Parmenides willy-nilly alongside modern principles, they're waving a red flag... Don't get me started!

[+] emorning4|5 months ago|reply
>>Whatever is is, and what is not cannot be<<

Well, that's not true.

Particles, and thus facts, pop into and out of existence all the time.

[+] VirusNewbie|5 months ago|reply
The author uses adjoint functors to explain equivalence and naturality but doesn't actually call it that?
[+] lostbean|5 months ago|reply
I’m quite a visual learner, so I appreciate you sharing this.
[+] anal_reactor|5 months ago|reply
I hate this particular mix of prose and formalism. Too complicated to be pop-sci, too informal to be, well, formal. I got to this part:

> We know that two orders are isomorphic if there are two functors, such that going from one to the other and back again leads you to the same object.

And I have no clue what is a functor, nor order. "Functor" wasn't defined, and "order" is defined as "thin category", which in turn remains undefined.

Seems to me like in order to understand this text you already need to understand category theory. If that's the case, then why would you be reading it?

[+] mjh2539|5 months ago|reply
I agree. There was (and still is) a trend in technical writing that began in the 2010s to be overly pedestrian and informal (and in many cases explicitly vulgar). The same impulse or geist resulted in many people naming their library or product a cute or contrived or irrelevant word.

Get off my lawn! And start giving things long descriptive names that are aliased to acronyms again! db2, netcat, socat, emacs (editing macros), wget...etc.

[+] gs17|5 months ago|reply
It's the newest chapter of a book, the previous one defines functors.
[+] mrkeen|5 months ago|reply
> And I have no clue what is a functor, nor order.

If you press the Prev button at the top of the page it takes you back to Functors. Twice more and it will take you back to Orders.

[+] gnarlouse|5 months ago|reply
When I read this title I thought it was going to be rings and groups in bikinis. I’m so dumb.
[+] auggierose|5 months ago|reply
There are too many pictures in this for my taste. I am currently reading this one, and I like it better so far: https://doi.org/10.1142/13670
[+] IdontKnowRust|5 months ago|reply
That's the author's intention, since the title explicitly says "Illustrated" =)