item 43451748


jmkr | 11 months ago

A symbol being arbitrary doesn't change the reality of the meaning behind it. I've always thought about `zero` while counting; it was never about the glyph `0`.

I observe zero.

I don't think zero is an absence of quantity. I don't think zero is the null set.

You can write types in a programming language, but plenty of type theory texts do include zero in the natural numbers, and type theory itself grew out of number/set theory. So it's fine if you decide to exclude it, but that choice is just as arbitrary.

In fact I'd be happy to write `>=0`, `>0`, or `=0` any day instead of mangling the idea of zero by having it represent both the number 0 and a tag like `None`, `null`, or anything of that sort. I don't think the natural world has anything like "nothing"; it just has logical fallacies.
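The distinction being argued for here (zero as a quantity vs. `None` as a tag for "no value") can be sketched in Python with `Optional`. The `count_widgets` function and its inventory data are made-up illustrations, not from the thread:

```python
from typing import Optional

def count_widgets(inventory: dict[str, int], name: str) -> Optional[int]:
    """Return the recorded count for `name`, or None if there is no record.

    None means "no record exists at all"; 0 means "a record exists, and
    the quantity it records is zero". These are different facts.
    """
    if name not in inventory:
        return None
    return inventory[name]

inventory = {"gears": 0, "bolts": 5}
assert count_widgets(inventory, "gears") == 0      # observed quantity: zero
assert count_widgets(inventory, "screws") is None  # nothing observed at all
```

Collapsing these two cases into a single sentinel (as `null` often does) is exactly the "mangling" the comment objects to.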


nivertech | 11 months ago

> I don't think zero is the null set.

Zero is the cardinality of the empty set, not the empty set itself.
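A quick Python sketch of that distinction; the `von_neumann` helper is my own illustrative name, not a standard function:

```python
# Zero is the *cardinality* of the empty set, not the set itself:
empty = frozenset()
assert len(empty) == 0

# Under the von Neumann encoding, each natural number n is represented by
# the set of all smaller naturals, so its cardinality is exactly n.
# 0 is encoded as the empty set, but the number and the set remain
# distinct ideas: one is a quantity, the other a container.
def von_neumann(n: int) -> frozenset:
    s: frozenset = frozenset()
    for _ in range(n):
        s = s | {s}  # successor step: n + 1 is encoded as n ∪ {n}
    return s

assert von_neumann(0) == frozenset()  # 0 encodes as the empty set
assert len(von_neumann(3)) == 3       # the encoding of 3 has 3 elements
```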

> I observe zero.

It can't be observed directly at any static point in time, but it can be observed as a dynamic process: some quantity draining down to empty and filling back up over time.

> In fact I'd be happy to write `>=0` or `>0` or `=0` any day instead of mangling the idea of zero representing 0 and zero representing something like `None`, `null` or any other tag of that sort. I don't think the natural world has anything like "nothing" it just has logical fallacies.

N, W, R, etc. are just well-known names for sets of numbers; nothing stops us from defining better or additional (self-describing) names for them.

We could discuss the Empty type[1] vs the Unit type[2], but I think that goes off-topic.

---

1. https://en.wikipedia.org/wiki/Empty_type

2. https://en.wikipedia.org/wiki/Unit_type
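The Empty/Unit distinction mentioned above can be sketched in Python's typing vocabulary. This is an assumption-laden illustration (the function names are mine): `None` plays the role of the Unit type's single value, while `NoReturn` (spelled `Never` in Python 3.11+) marks a function that can produce no value at all:

```python
from typing import NoReturn  # Python 3.11+ prefers the spelling typing.Never

# Unit type: a type with exactly one value. In Python, NoneType plays this
# role -- a function "returning nothing" actually returns the value None.
def log(msg: str) -> None:
    print(msg)

# Empty type: a type with no values at all. A function annotated NoReturn
# can never return normally; control leaves only by raising or looping.
def fail(msg: str) -> NoReturn:
    raise RuntimeError(msg)

assert log("hello") is None  # the unit value exists and can be inspected

produced_a_value = True
try:
    fail("boom")
except RuntimeError:
    produced_a_value = False  # fail() yielded no value, only an exception
assert not produced_a_value
```

This is why conflating zero with `None` is a type error in spirit: `0` is an ordinary inhabitant of `int`, while `None` is the lone inhabitant of a different type entirely.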