Yes, it's more of a convention: statements like this are understood with an implicit "...ignoring the trivial case of 1 being an obvious factor of every integer." That case isn't interesting or meaningful, so we ignore it in most contexts.
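To make the convention concrete, here is a minimal Python sketch (the function names are made up for this example, not from the thread):

    def nontrivial_factors(n: int) -> list[int]:
        """Factors of n other than the trivial ones, 1 and n itself."""
        return [d for d in range(2, abs(n)) if n % d == 0]

    def is_prime(n: int) -> bool:
        """n is prime exactly when n > 1 and it has no nontrivial factors."""
        return n > 1 and not nontrivial_factors(n)

    print(nontrivial_factors(12))  # [2, 3, 4, 6] -- 1 and 12 are omitted as trivial
    print(is_prime(7))             # True

Definitions of "prime" quietly rely on excluding that trivial divisor, which is all the quoted phrase is doing.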
"...ignoring the trivial case of 1 being an obvious factor of every integer."
I remember quite a big chunk of GEB being devoted to formally defining the integers and showing they are really not trivial! The main problem seems to be that you soon end up with circular reasoning if you are not razor sharp with your definitions. And that's just in an explainer book 8)
Correct, it's impossible to pin down the natural numbers with a first-order axiomatization in which addition and multiplication behave as expected. Any such set of axioms also admits nonstandard models: structures that look very similar to the natural numbers but contain elements that are not actually natural numbers.
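To give a flavour of what "formally define" looks like, here is a minimal Peano-style sketch in Lean 4 (MyNat, add, and mul are names chosen for this example; note this is a definition inside type theory, whereas the nonstandard-model issue above is about first-order axiom systems):

    -- Natural numbers generated by zero and successor.
    inductive MyNat where
      | zero : MyNat
      | succ : MyNat → MyNat

    -- Addition by recursion on the second argument: n + 0 = n, n + (m+1) = (n + m) + 1.
    def add : MyNat → MyNat → MyNat
      | n, MyNat.zero   => n
      | n, MyNat.succ m => MyNat.succ (add n m)

    -- Multiplication in terms of addition: n * 0 = 0, n * (m+1) = n * m + n.
    def mul : MyNat → MyNat → MyNat
      | _, MyNat.zero   => MyNat.zero
      | n, MyNat.succ m => add (mul n m) n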
Then you have to define what factor means ...
Specifically, the case of the divisor being 1.
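For what it's worth, the usual definition of "factor" can be written down directly; a minimal Lean 4 sketch (the name `divides` is mine, not from any standard library):

    -- "d is a factor of n": there exists an integer k with n = d * k.
    -- Under this definition, 1 is a factor of every n (take k = n, since n = 1 * n),
    -- which is exactly the trivial case being set aside above.
    def divides (d n : Int) : Prop := ∃ k : Int, n = d * k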