top | item 45436935

tristramb | 5 months ago

Category theory is what you get when you take mappings instead of sets as the primitive objects of your universe. At first this might seem a perverse thing to do, as mappings seem more complex than sets, but that is only because mappings have traditionally been defined in terms of sets.

In set theory you can specify that two sets be equal and you can also specify that one set be an element of another.

In category theory you can specify that two mappings be equal and you can also specify that two mappings compose end to end to produce a third mapping.

Category theory can be used to express some requirements in a very concise way.
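
A minimal sketch of that idea (my example, not part of the definition): treat composition itself as the primitive operation, so the only things we can say about mappings are "these two are equal" and "these two compose to give that third one".

```python
def compose(f, g):
    """Forward composition f;g -- apply f first, then g."""
    return lambda x: g(f(x))

double = lambda x: 2 * x
inc = lambda x: x + 1

# double;inc is itself a mapping, built without mentioning any set.
h = compose(double, inc)
assert h(3) == 7          # (3 * 2) + 1

# Associativity, (f;g);k = f;(g;k), is one of the category axioms.
f, g, k = double, inc, lambda x: x * x
left = compose(compose(f, g), k)
right = compose(f, compose(g, k))
assert all(left(x) == right(x) for x in range(10))
```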

griffzhowl | 5 months ago

> Category theory is what you get when you take mappings instead of sets as the primitive objects of your universe.

I'm not sure about that, because you still need some concept of set (or collection or class) to define a category: you need a set of objects and a set of mappings between them (technically that's a "small" category, but defining any larger category requires at least as much set-theoretic complication).

More exactly, whereas in set theory it's the membership relation between sets and their elements that is basic, in category theory it's the mapping between objects.

Nevertheless, the basic concepts of set theory can also be defined within category theory, so in that sense they're inter-translatable. In each case though, you need some ambient idea of a collection (or class or set) of the basic objects. Tom Leinster has a brilliantly clear and succinct (8 pages) exposition of how this is done here https://arxiv.org/abs/1212.6543

The thing is, even defining first-order logic requires a (potentially infinite) collection of variables and constant terms; and set theory is embedded in first-order logic, so both set theory and category theory are on the same footing in seemingly requiring a prior conception of some kind of potentially infinite "collection". To be honest, I'm a bit puzzled as to how that works logically.

housecarpenter | 5 months ago

Defining first-order logic doesn't really require set theory, but it does require some conception of natural numbers. Instead of saying there's an infinite set of variables, you can have a single symbol x and a mark *, and then you can say a variable is any string consisting of x followed by any number of marks. You can do the same thing with constants, relation symbols and function symbols. This does mean there can only be countably many of each type of symbol, but for foundational purposes that's fine, since each proof will only mention finitely many variables.
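
A small illustration of that construction: a single symbol 'x' and a mark '*' already give countably many distinct variable names, with no infinite set assumed up front.

```python
def variable(n):
    """The n-th variable: 'x' followed by n marks."""
    return "x" + "*" * n

# The first few variables in the scheme described above.
assert [variable(n) for n in range(4)] == ["x", "x*", "x**", "x***"]

# All variables are distinct, and any finite proof only ever
# mentions finitely many of them.
assert len(set(variable(n) for n in range(100))) == 100
```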

Allowing uncountably many symbols can be more convenient when you apply logic in other ways, e.g. when doing model theory, but from a foundational perspective when you're doing stuff like that you're not using the "base" logic but rather using the formalized version of logic that you can define within the set theory that you defined using the base logic.

tristramb | 5 months ago

So category theory is really the theory of composition of mappings. I conjecture that all programming can be seen as just the composition of mappings. If this is correct then category theory is a theory of programming.

sesm | 5 months ago

You don't need category theory to connect dots with arrows, graph theory is enough for this.

ctenb | 5 months ago

> you can also specify that two mappings compose

Two mappings with corresponding domain/codomain have to compose by definition of a category. It's not something you can specify.

ajkjk | 5 months ago

That is probably what they mean by specifying that they compose.

If all you know is that you have two mappings you don't know they compose, until you get the additional information about their sources and targets. In a way that's what the source and targets are: just labels of what you can compose them with.
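
One way to sketch this point (names here are illustrative, not from any library): a morphism carries a source and target label, and composition is only defined when the labels line up.

```python
class Morphism:
    def __init__(self, src, dst, fn):
        self.src, self.dst, self.fn = src, dst, fn

    def then(self, other):
        """Forward composition self;other, defined only when
        self's target matches other's source."""
        if self.dst != other.src:
            raise TypeError("morphisms do not compose")
        return Morphism(self.src, other.dst,
                        lambda x: other.fn(self.fn(x)))

length = Morphism("Str", "Int", len)
is_even = Morphism("Int", "Bool", lambda n: n % 2 == 0)

# Int matches Int, so these compose: len("word") == 4, which is even.
assert length.then(is_even).fn("word") is True

# Bool does not match Str, so composing the other way is rejected.
try:
    is_even.then(length)
    raise AssertionError("should not compose")
except TypeError:
    pass
```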

tristramb | 5 months ago

Yes. When you are specifying a system you are building the category that you want it to live in.

stared | 5 months ago

> Category theory can be used to express some requirements in a very concise way.

Could you give a programming example of something that is easier to express in CT than with sets and functions?

griffzhowl | 5 months ago

It's more that category theory foregrounds the functions themselves, and their relationships, rather than the elements of the sets that the functions operate on. This higher-level perspective is arguably the more appropriate one when thinking about the structure of programs.

For more detail, see Bartosz Milewski, Category Theory for Programmers:

"Composition is at the very root of category theory — it’s part of the definition of the category itself. And I will argue strongly that composition is the essence of programming. We’ve been composing things forever, long before some great engineer came up with the idea of a subroutine. Some time ago the principles of structured programming revolutionized programming because they made blocks of code composable. Then came object oriented programming, which is all about composing objects. Functional programming is not only about composing functions and algebraic data structures — it makes concurrency composable — something that’s virtually impossible with other programming paradigms."

https://bartoszmilewski.com/2014/10/28/category-theory-for-p...
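
A small sketch of the composability Milewski is describing: a program built as a pipeline of small functions, where the only "glue" is composition itself.

```python
from functools import reduce

def pipeline(*fns):
    """Compose functions left to right (forward composition)."""
    return reduce(lambda f, g: lambda x: g(f(x)), fns)

# Three small, independently testable pieces...
strip = str.strip
lower = str.lower
words = str.split

# ...composed into a larger one with no other machinery.
normalize = pipeline(strip, lower, words)
assert normalize("  Category Theory  ") == ["category", "theory"]
```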

alfiedotwtf | 5 months ago

> Category theory is what you get when you take mappings instead of sets as the primitive objects of your universe

Why have I never seen it explained like this before? Wow, thank you!

thomasahle | 5 months ago

> Category theory can be used to express some requirements in a very concise way.

Can you give an example?

tristramb | 5 months ago

Take a mapping a and precompose it with the identity mapping i. By the definition of the identity mapping the resulting composition is equal to a.

  i;a = a
(Here ';' represents forward composition. Mathematicians tend to use backward composition represented by '∘' but I find backward composition awkward and error-prone and so avoid using it.)

Now, if there is another mapping j that is different from i, such that

  j;a = a
then the mapping a loses information. By this I mean that if you are given the value of a(x) you cannot always determine what x was. To understand this properly you may need to work through a simple example by drawing circles, dots and arrows on a piece of paper.

If there is no such j then mapping a is said to be a monomorphism or injection (the set theoretic term) and it does not lose information.

This specification of the property 'loses information' only involves mapping equality and mapping composition. It does not involve sets or elements of sets.

An example of a mapping that loses information is the capitalization of strings of letters. An example of a mapping that you would not want to lose information is zip file compression.
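
A concrete check of the specification above, with string capitalization standing in for the mapping a. Taking j = swapcase (which differs from the identity) still satisfies j;a = a, so by the criterion above, a loses information.

```python
i = lambda s: s             # the identity mapping
j = str.swapcase            # a different mapping: j != i
a = str.upper               # the mapping under test

samples = ["Hello", "WORLD", "MiXeD", ""]

# i;a = a always holds (definition of the identity)...
assert all(a(i(s)) == a(s) for s in samples)
# ...but here j;a = a holds too, even though j disagrees with i:
assert all(a(j(s)) == a(s) for s in samples)
assert j("Hello") != i("Hello")

# And indeed a loses information: two different inputs, one output.
assert a("Hello") == a("HELLO")
```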

If you alter the above specification to use post-composition (a;i = a and a;j = a) instead of pre-composition, you get epimorphisms or surjections, which capture the idea that a mapping constrains all the values in its codomain. I like to think of this as the mapping not returning uninitialized values, or 'junk' as they are sometimes called.
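
The dual check, sketched on small finite sets (dicts standing in for mappings): a mapping a into {0, 1, 2} that never produces 2. A mapping j that moves only the unreached value 2 differs from the identity yet satisfies a;j = a, so a is not an epimorphism; the value 2 is exactly the "junk" that a never constrains.

```python
a = {0: 0, 1: 1}            # a : {0,1} -> {0,1,2}, image is {0,1}
i = {0: 0, 1: 1, 2: 2}      # identity on {0,1,2}
j = {0: 0, 1: 1, 2: 0}      # j != i, but agrees with i on the image of a

# a;i = a and a;j = a (forward composition: apply a first, then i or j)
assert all(i[a[x]] == a[x] for x in a)
assert all(j[a[x]] == a[x] for x in a)
assert j != i               # so a fails to be an epimorphism
```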

Bartosz Milewski works through this in more detail (including from the set-theoretic side) in the last 10 minutes of https://www.youtube.com/watch?v=O2lZkr-aAqk&list=PLbgaMIhjbm... and the first 10 minutes of https://www.youtube.com/watch?v=NcT7CGPICzo&list=PLbgaMIhjbm....