120 is divisible by a lot of numbers, which is convenient for many calculations. There's a similar reasoning behind 24 hour days, 60 minute hours, and 360 degrees in a circle.
I'm developing a big-pixel retro platformer game. Rather than the traditional power-of-two tile sizes, I ended up going for 36x36 tiles. It's much easier to draw repeating patterns when tiles can be evenly divided into halves, thirds, quarters, sixths, and so on.
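The appeal is easy to make concrete. A throwaway sketch (my own, not from the post): count the divisors of 36 against a power-of-two tile size like 32.

```python
def divisors(n):
    """Return all positive divisors of n in ascending order."""
    return [d for d in range(1, n + 1) if n % d == 0]

# 36 splits evenly into halves, thirds, quarters, sixths, and ninths...
print(divisors(36))  # [1, 2, 3, 4, 6, 9, 12, 18, 36]
# ...while 32 only ever splits into powers of two.
print(divisors(32))  # [1, 2, 4, 8, 16, 32]
```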
Seems like the best of all worlds would be a duodecimal (base-12) system of units along with duodecimal notation and two extra digits to notate it. But I can't see us switching any time soon!
This is the kind of thing I love to use in conversation with my girlfriend because it annoys her. Basically anything that sounds off, or odd but is technically correct.
The last thing I learned before this was the word coolth.
The native base of our computing systems is base-2. Engineering prefixes for this base should thus also be based on natively divisible break points.
It just so happens that some break points are usefully close (but not quite the same as) base 10 break points.
2^10 = 1024 ≈ 1000 (1 KB)
Naturally, given this 2.4% bonus over base 10, and since the larger unit can always cleanly store the corresponding base-10 quantity, the natural thing to do is to favor the larger, native unit.
For me, KB will _always_ be 1024 bytes. In the context of computers this is what makes sense.
Similarly, I only ever want the binary (base-2, as implied by bytes) sizes for all other units of storage, data transfer speed, etc. The real error is the marketing fixation on the smaller decimal sizes, which are non-native to the system and provide less value.
A counter-argument for anyone arguing otherwise: what is the size of the smallest addressable unit in any of the block-based storage devices you care about? Hard disks / flash? (512, 4096, or possibly some very large power of 2 for SMR.) CDs/DVDs? (2048 bytes.)
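The 2.4% gap also compounds with each prefix step. A quick sketch (my own, not from the comment) of how far the binary and decimal readings drift apart:

```python
# How far the binary reading of each prefix drifts from the decimal one.
for power, prefix in enumerate(["KB", "MB", "GB", "TB"], start=1):
    binary, decimal = 1024 ** power, 1000 ** power
    print(f"{prefix}: {binary} vs {decimal} -> {binary / decimal:.1%}")
# By the terabyte, the two readings differ by about 10%.
```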
In my opinion the most important factor by far is avoiding ambiguity. This means sucking it up and using the binary prefixes in every scenario possible.
A lot of people don't like how they sound, and would like for the standard prefixes to have one meaning, but the reality of the state of the world is such that you will create ambiguity if you don't use the binary prefixes.
Does anyone actually have a problem with kB meaning 1024 bytes and MB meaning 1<<20 bytes, etc?
Sure, the kilo-, mega- etc prefixes might literally mean 10^x but as long as one is raised with the understanding that to do so is overly literal, then it’s never really a problem. base = 2 if context == computer else 10
Kibi and mebi feel like this generation’s centripetal force. Another popular and unsolicited explanation I get a lot is about equality vs equity. The distinction might be real, but after being corrected for the nth time it feels like I am being corrected for the sake of it rather than for any real reason.
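The ambiguity both sides are arguing about is easy to demonstrate. A hypothetical formatter (my own sketch, not anyone's cited code) gives visibly different answers for the same byte count depending on which convention it assumes:

```python
def format_bytes(n, binary=True):
    """Render a byte count using binary (KiB) or decimal (kB) prefixes."""
    base = 1024 if binary else 1000
    units = ["KiB", "MiB", "GiB", "TiB"] if binary else ["kB", "MB", "GB", "TB"]
    value, unit = float(n), "B"
    for next_unit in units:
        if value < base:
            break
        value /= base
        unit = next_unit
    return f"{value:.1f} {unit}"

# The same capacity, two honest-looking labels:
print(format_bytes(536870912))         # 512.0 MiB
print(format_bytes(536870912, False))  # 536.9 MB
```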
What’s bad about the binary prefixes? Does it help if you think about how it’s designed to be “MEga BInary” and “GIga BInary”? Or is that the problem?
The talk of weird sounding binary prefixes and long scales reminds me of the British long scale for large numbers that has different names for what we know as ‘million’ and ‘billion’: “milliard” and “billiard” (not to be confused with pool.) https://en.wikipedia.org/wiki/Long_and_short_scales
This is the basis for customary nail sizes, as it's (for example) how many 6d nails you could get for 6 pennies in medieval England.
https://en.wikipedia.org/wiki/Penny_(unit)
I'm of the opinion - though not 100% confident - that you can tell the difference between houses built with metric and imperial materials.
I think you can perceive the different factor relationship between the dimensions of timber, fittings and spacing. Although I'm not entirely certain that I'm not picking up other cultural habits in Australian and American building styles. Should I ever ascend the throne, I shall have an imperial palace (naturally) and a metric one built to the same plan to verify my intuition
This is relevant in my field (Waste Management) as there's the long ton, the short ton, and long and short hundredweight (CWT). All different and confusing!
> Within the original Latin text, the numeral c. is used for a value of 120: Et quodlibet c. continet vi. xx. ("And each such 'hundred' contains six twenties.")[2]
This implies those Roman Numerals we learned in school are wrong.
We have a lot of place names in Scandinavia and the former Danish Viking colonies, and apparently in Pennsylvania and New Jersey, reminiscent of this. A "hundred" is a geographical entity that can deliver a long hundred men to the armed forces.
Given the barriers to communication back then it feels like it didn't take them all that long to make their way to Europe. The numeral system seems to have been codified in India by 700, then extended by the Arabs to decimals and fractions by 900, then began making its way into Europe by 1000. The callout to the 14th century might just be because that's about when the printing press was developed, which caused the dissemination of knowledge to really start kicking off in general.
So wait. The English once used C to mean what we now notate as "120" and the Romans used C to mean "100"? Wouldn't that have been super confusing when the English sent trade ships to mainland Europe? Especially in the days when England was a Roman colony?
And at one time the British spoke German? Fascinating. I always thought the single biggest language influence on British was French (because William the Conqueror), and the pre-Conquest indigenous British was a mix of Celtic, Roman, and Norse from the Vikings that's more-or-less unique. How'd they end up speaking German?
As I understand it, the various bits of medieval England could barely agree on units among each other.
"England" is named after the Angles, one of the Germanic tribes that showed up and settled down in England after Roman rule ended. They brought their own proto-Germanic language with them and displaced the local languages- and to some extent people too. French was a comparatively small addition, it contributed a lot of vocabulary and spelling and so on, less so grammar. The Norman invasion was not a lot of actual people- enough to staff up a new aristocratic class speaking French, not enough to displace the local language or people so much.
English is a Germanic language (as are German, Swedish, Dutch, etc). The largest components of its grammar are similar to other Germanic languages, as is a lot of vocabulary (butter, swine, cow, deer....).
A bunch of French words were grafted on recently (not in 1066 but over the following couple of centuries), but none of the grammar came with it.
As for German-speaking people in England specifically: well, George I and George II spoke German (and French), not English. George III was the first Hanoverian king to speak English. Not surprisingly, he was mad :-)
You've probably heard of Anglo Saxons? After the Roman withdrawal came the Anglo Saxon invasion from Germany and Old German became the predominant language of England, even the name comes from these invaders (Angles -> Angle Land -> England). After that the Vikings and French added their layers to varying degrees, Celtic languages remained dominant in Ireland, Scotland and Wales until much later.
Probably the same way Americans and English can trade despite their gallons being different. Or the Australians and the Americans despite their dollar being different. It was only last century that for the English, a billion was 1e12 while it was 1e9 for Americans. I'm sure it was a pain but humans can manage details like that.
https://en.wikipedia.org/wiki/Highly_composite_number
https://en.wikipedia.org/wiki/Sexagesimal
1. https://en.wiktionary.org/wiki/hundred#Etymology
2. https://en.wiktionary.org/wiki/centum#Latin
Until the 1400s or the 1800s?
It is only recently I learned that the spread of Arabic Numerals was so recent
German grammar is pretty easy to pick up for English speakers. Lots of sentences are constructed the same way in German and English.