Flooring makes more sense in every case, from years to milliseconds and more. A few reasons:
Say you want to send a message at exactly 13:00:00, and you have a digital clock that doesn't show seconds. What you do is watch your clock, and as soon as it goes from 12:59 to 13:00, you press the button. That's also how you set the time precisely by hand. With rounding, that transition would happen at 12:59:30, and who wants to send a message at precisely 12:59:30?
You have a meeting at 13:00:00, you watch the clock to see if you are early or late. With flooring, if you see 13:00, you know you are late. With rounding, you are not sure.
It is common for digital clocks to have big numbers for hours and minutes and small numbers for seconds. If you are not interested in seconds, you can glance quickly from afar and you have your usual, floored time. If you want to be precise, you look closer and get your seconds. Rounding the minutes would be wrong, because it wouldn't match the time with seconds. And you don't want clocks whose small seconds display you may not even see to show a different time than clocks without one.
And if you just want to know the approximate time and don't care about being precise to the second, then rounding up or down doesn't really matter.
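To make the difference concrete, here's a small Python sketch (my own illustration, not from the article) of the two display rules:

```python
from datetime import datetime

def floor_display(t: datetime) -> str:
    """What ordinary clocks do: show the minute we are currently in."""
    return t.strftime("%H:%M")

def round_display(t: datetime) -> str:
    """The article's proposal: show the nearest minute."""
    minute_of_day = t.hour * 60 + t.minute + (1 if t.second >= 30 else 0)
    minute_of_day %= 24 * 60  # 23:59:45 wraps around to 00:00
    return f"{minute_of_day // 60:02d}:{minute_of_day % 60:02d}"

t = datetime(2024, 1, 1, 12, 59, 30)
print(floor_display(t))  # 12:59
print(round_display(t))  # 13:00 -- already "13:00" half a minute early
```

The rounding clock's displayed transition to 13:00 happens at 12:59:30, which is exactly the problem described above.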
I think that’s the problem with the article - that it sticks to its guns.
It starts with an outrageous statement and goes on to show that it’s actually correct. Then it relates it to similar things, and instead of saying “yeah, just like we floor years and hours it makes sense to do it for minutes too, but it was fun to think about”, it goes on to say “but for minutes this is bad”.
If it had backtracked and said “flooring is actually the better choice” I would have appreciated the article as a whole much more
> With flooring, if you see 13:00 you know you are late
I always thought that you are late from 13:01. That's common these days with Teams meetings etc.; it seems most people join during the minute from 13:00 to 13:01.
Your second example contradicts the first. "Sending a message" doesn't happen in an instant. There's always some latency, which can be significant in email systems (the most common, universally accepted way to send a message) and SMS.
The goal in messaging is almost always not about the send time (who cares?), but the receive time, when it is available to be read. If the goal is to have the message received at 1:00, then, as you note in the second example, sending it precisely at 1:00:00.000 guarantees it will be received late.
In any case, if you're that focused on intra-minute precision, you should probably be relying on clocks that incorporate seconds anyway.
> This is especially apparent when you're trying to calculate "how much time until my next meeting?", and your next meeting is at noon. If it's 11:55, you would usually mentally subtract and conclude: the meeting is in 5 minutes. That's how I always do it myself anyway! But the most probable estimate given the available information is actually 4'30"!
Ok. But what does it mean for a meeting to start at 12:00 when people don't have clocks that will accurately show 12:00? They'll only know when the time is 11:59:30 or 12:00:30, anything between is just going to be a guess. So it seems to me that the start times would just align to the half-minute offsets instead, and we'd be back exactly where we started but with more complexity.
Both of your pic examples are wrong though. That digital clock does show seconds, and the London clock has its minute hand in between minute marks, showing progress between marks if you look closely. This is the same for all analogue clocks.
Analogue clocks like the face of Big Ben are not like digital displays, and whether they "show seconds" in the sense this article means is not, as with digital displays, down to whether there is a dedicated hand.
Unlike digital displays, the largest denomination hand on an analogue clock display contains all of the information that the smaller hands do (depending on the movement in some cases).
The easiest way to realise this is to imagine a clock without the minute hand. Can you tell when it's half-past the hour? You can. The hour hand is half way between the two hours.
Again, it depends on the movement, but it is not out of the question that your minute hand is moving once every second, and not every minute. It is down to the number of beats per unit time for an analogue display as to what the minimum display resolution is (regardless of if the movement is analogue or digital itself).
> The easiest way to realise this is to imagine a clock without the minute hand.

No need to imagine it; it was invented many years ago and it's called a perigraph. Meistersinger makes one of the nicest I've seen: https://www.relogios.pt/meistersinger-perigraph-relogio-auto...
> Unlike digital displays, the largest denomination hand on an analogue clock display contains all of the information that the smaller hands do (depending on the movement in some cases).
You would be surprised. When I was a kid, I sometimes used to stare at the clocks with an analog face at the train station while waiting for the train to school to arrive.
Interestingly enough, the seconds hand would go slightly faster than actual seconds, and at 60 seconds it would get stuck for a moment, as if it was pushing the minutes hand, and then the minutes hand would flip to the next minute.
Found a video here: https://www.youtube.com/watch?v=ruGggPYQqHI

The video description explains how they work, which seems like a mixture of digital and analog (due to the use of both cogs and relays, plus propagation of pulses from central to local clocks). Translated:

- The seconds hand makes a revolution in 57-58 seconds and is then stuck for 2-3 seconds.
- The seconds hand is driven using 230V.
- The minutes hand gets a 12V or 24V pulse once every 60 seconds. The polarity has to swap every 60 seconds. The swapping of the polarity can be done using a relay or specially-made components.
- The hours hand is driven by the minutes hand using cogs.

Edit: more information and references here: https://en.wikipedia.org/wiki/Swiss_railway_clock#Technology
> it depends on the movement, but it is not out of the question that your minute hand is moving once every second, and not every minute.
I think the only place where I've seen the minute hand move by the minute has been on TV, in those climactic moments where the camera zooms in on the clock and strikes a certain time. Maybe it's a trope, for emotional tension, like mines that don't explode until you step off.
> The easiest way to realise this is to imagine a clock without the minute hand. Can you tell when it's half-past the hour? You can. The hour hand is half way between the two hours.
Can I? Many analog clocks actually "tick" the second and minute hand. I've even seen some that tick the hour hand.
A digital clock is 1:01 or 1:02. An analog clock is at some tick of some range (depending on the resolution, as you abstracted), at all times.
All universal statements like this are wrong and stem from basic ignorance
> So when it's actually 14:15:45, they'll show 14:15. And when the actual time goes from 14:15:59 to 14:16:00, then that's when your clock changes from 14:15 to 14:16.
No, that's a silly mistake. Look at the picture (though a video would be much better) of the analogue clock to see it's not the case: the minute hand moves continuously, so it isn't pointing at 15 at 14:15:59.
> the meeting is in 5 minutes.
That's not the only question we ask about time. Has the meeting/game already started? You can't answer that with an average value
> for some context appropriate reason) you reply with just hours, you would say it's 11!
No, your reply would depend on the context, which you haven't specified.
> Please someone tell me I'm not crazy
Of course not; you're just trying to replace one ambiguity with another. Maybe instead come up with a more precise minutes display in digital clocks that adds more info, like two dots flashing when it's past 30 seconds and only one dot when it's before? (On phones you could use color, or a few more pixels within all the padding.)
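For what it's worth, that indicator could be as simple as changing the separator character. A quick sketch of the idea (my own hypothetical rendering, assuming a plain-text display where a dot marks the first half of the minute and a colon the second half):

```python
from datetime import datetime

def display_with_halfminute_dot(t: datetime) -> str:
    # A dot as separator during the first half of the minute, a colon
    # during the second half: "13.05" covers 13:05:00-13:05:29,
    # "13:05" covers 13:05:30-13:05:59.
    sep = ":" if t.second >= 30 else "."
    return f"{t.hour:02d}{sep}{t.minute:02d}"

print(display_with_halfminute_dot(datetime(2024, 1, 1, 13, 5, 10)))  # 13.05
print(display_with_halfminute_dot(datetime(2024, 1, 1, 13, 5, 40)))  # 13:05
```

That halves the worst-case ambiguity without changing which minute is shown.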
Yup, I think the "has the {thing} already started" is, for many people, the most useful function of precise time anyway. All sorts of work and personal meetings, transportation schedules, doctor's appointments, and so on.
Knowing the ballpark, as in "it's 15:30-ish", even when made more precise, is strictly less useful than "you're late to the 15:30 meeting with your manager".
Fun article nonetheless, and interesting perspectives on both sides!
Not all analog clocks continuously move the minute pointer. In train stations I often see the seconds pointer move continuously and when it reaches the 12, it stops there for a second or two, then the minute pointer moves, stopping at the next minute, briefly vibrating, then the seconds pointer moves on.
It's not really about flooring or rounding, but whether one thinks of time indices as ranges or moments.
Days, as the author points out, are thought of with "flooring", but more accurately, a date is thought of as the range of times belonging to that date.
Minutes one can consider either as ranges or as time indices. That's where the error comes in: switching from interpreting a value as the start of a range to treating it as an estimate of a point-like time index.
A minute is an insignificant period for most daily tasks, so the convention "show me when the minute changes" is simple and pragmatic. If your task needs precise count of seconds, you get a clock that shows when the second changes, and now you are half a second late on average.
You can keep playing with increasingly smaller time units until you conclude, like Zeno's arrow paradox, that you're always infinitely late.
Another way of thinking about this is that the author is confusing time as measurement (how much time) with time as rule (what time is it). If you wanted to measure the duration as a difference in clock times, yes, there would be a certain amount of measurement error incurred by the way clocks are displayed. But if you want to know the time, in the sense of whether a certain time has been reached, or a certain graduation has been crossed, it doesn't make sense to round to the nearest minute.
The question of "how much is this clock off?" is only meaningful with reference to a certain use or interpretation of the numbers being displayed. If you say it's "8:56" people know it could be anything up to but not including 8:57, but greater than or equal to 8:56. The number means a threshold in time, not a quantity.
I don't think this applies to Elizabeth Tower/Big Ben, as it's an analogue clock and, from footage I can find[0], its minute hand appears to move continuously as opposed to in steps (or at least, not in full-minute steps).

Also, I believe it's wrong to say "the average error would be 0" if rounding to the nearest minute. The average offset would be 0, but the average error would be 15 seconds, to my understanding.

[0]: https://www.youtube.com/watch?v=eUP3DsiqkzA
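The arithmetic backs this up. Over a uniformly random whole second within the minute, flooring is off by 29.5 s on average (30 s in the continuous limit), while rounding to the nearest minute is off by exactly 15 s on average:

```python
# Absolute display error for every possible whole second s within a minute.
floor_err = [s for s in range(60)]               # floored display lags by s seconds
round_err = [min(s, 60 - s) for s in range(60)]  # distance to the nearest minute

print(sum(floor_err) / 60)  # 29.5
print(sum(round_err) / 60)  # 15.0
```

So rounding shrinks the average magnitude of the error; it only zeroes out the signed offset.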
That's a good point, I was actually wondering about that. I've seen a lot of jumping analog clocks so I incorrectly assumed Big Ben was the same. I should have checked :)
> If clocks rounded to the nearest minute instead of truncating, the average error would be 0.

The negative and the positive error don't cancel each other out. They are both error. The absolute value needs to be used.
Huh?! A clock shows the period of time we are currently in. A clock only showing hours would, for example, indicate that we are in the 14th hour of the day, for the entire duration of that hour. That is not an error. Similarly, a hh:mm clock will show the hour and minute we are currently in for the duration of that minute.
No clock can display the exact current moment of time. That would require infinite digits, and even then those will be late, since lightspeed will ensure you receive the femtoseconds and below really late.
What time it is, is just made up, it's something we can decide freely through social power, as evidenced by timezones and daylight savings and leap seconds.
Commonly the resolution is something like minutes or a few of them, that's the margin we'll typically accept for starting meetings or precision in public transport like buses.
The utility of femtoseconds in deciding what time it is seems pretty slim.
Yeah, I think labelling it "error" is a bit of a strange way to look at it to be honest.
It's only an error if you're trying to measure time in seconds but are doing it with a clock that can only measure hours and minutes. If you want to know the current minute, then a clock that measures minutes is 100% correct.

It's an interesting thought experiment, but really all it's saying is that half of the time 10:00 is closer to 10:01:00 than to 10:00:00. That implies you care about measuring time to the second, which prompts the question of why it's being measured in minutes.
To be charitable, I suppose in the real world we might occasionally care about how close 10:00 is to 10:01 in seconds, but the time shown on phones can't tell us that so on average we will be about 30 seconds out.
The fallacy is the leap in logic from “the average error is x” to “clocks are x late”.
Seeing the exact transition is often as useful as, if not more useful than, minimizing the average displayed error.
This is nitpicking, but the transition from "the average error of a truncating clock is 30 seconds" to "therefore all clocks are 30 seconds late!" is seriously wrong. For one, the median is equal to the mean here, so about half of all clocks are less than 30 seconds late, which is a clear contradiction.
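A quick simulation makes the mean/median point explicit: treating the actual second within the minute as uniform, the mean staleness of a truncating display is about 30 s, yet half of all readings are under 30 s stale:

```python
import random

random.seed(42)
# Seconds since a truncating display last changed, sampled uniformly.
staleness = [random.uniform(0, 60) for _ in range(100_000)]

mean = sum(staleness) / len(staleness)
frac_under_30 = sum(s < 30 for s in staleness) / len(staleness)
print(f"mean: {mean:.1f}s, under 30s stale: {frac_under_30:.0%}")  # ~30.0s, ~50%
```

"The average reading is 30 s stale" and "every clock is 30 s late" are very different claims.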
> Basically I'm arguing that rounding for clocks would be more useful than flooring. This is especially apparent when you're trying to calculate "how much time until my next meeting?"
Yet a rounding clock provides no way at all for you to know whether the meeting has already started or not.
Not sure where I've heard this, but an idea that's been stuck in my head is this: We don't look at clocks to see what time it is, we do so to know what time it isn't yet:
Have I missed the bus yet? Can I already go home? Am I late for this meeting? Do I still have time to cancel this cronjob? All questions that a rounded clock cannot precisely answer.
> Hours: That's when mentally, I switch to rounding! At 15:48 I definitely feel like it's pretty much 16:00.
Disagree. I would never round to a full hour, only to nearest 5 minutes.
As far as minutes, for clocks that show discrete minutes, it'd be weird to see the minute-hand snap to the next number and think "oh, it's actually 29 seconds before that number". Seeing the snap motion means you're at :00 seconds.
Besides, for a clock that doesn't show seconds, it really doesn't matter. If you need more precision, you just use a timepiece with the extra precision.
While most analog clocks' minute hands sweep from minute to minute, jumping minute clocks have the issue the article brings up.
Depending on when / where you went to school, you may have had analog jumping minute clocks. The ones we had at one school would "give away" when the minute would change because the minute hand would move slightly counterclockwise before changing to the next minute.[0]
Per reddit, some Swiss Railway clocks had jumping minutes, but I have not seen one in person. [1]
Another school I attended had sweep second and minute hands, but would hold the clock at 59 seconds until it matched the master clock. Depending on the particular clock and how well it was maintained, these could be 5 - 10 seconds off. Seems like nothing as an adult, but as a kid wanting to go home, it seemed like an eternity, especially on the last day of class for the semester.
[0] This video shows how clocks worked at my school: https://www.youtube.com/watch?v=jpU_lG_TPP4

[1] https://www.reddit.com/r/clocks/comments/10714a1/hard_time_f...
Clocks show the time that has _passed_; that's the point. Hence flooring is the correct approach.
Even when you involve physics, you are seeing the nanoseconds past, not the actual time (the time it takes light to travel from the clock to your eyes, plus your brain's processing delay via neurons, etc.).
Even the thought of being late is the same way. If something starts at 13:00 and you are not already there when it is 13:00:00.000, you are -by definition- late.
Good point, when you look at your watch your thought should be “3:15 has passed” or “3:15:40 has passed”. One is more precise but if you think of it as a time that has already passed you can budget accordingly. 3:15 tells you that 3:15:00 has definitely passed and as much as 3:15:59 might have passed.
The Clock app in iOS works like this in timer mode. If you start a timer for 10 seconds, you'll see the 10 for half a second, then 9 through 1 for a second each, then 0 for half a second, then it beeps. In practice it's pretty stupid and not really useful, so I wish they'd fix it.
There's a reason why normal countdowns work the way they do. You care about the exact moment it hits zero, and with rounding you lose that. Clocks are not 30 seconds late.
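A minimal model of the usual countdown convention: the display is the ceiling of the remaining time, so each number n is shown while more than n-1 seconds remain, and 0 appears exactly at the deadline rather than half a second early:

```python
import math

def countdown_display(remaining_seconds: float) -> int:
    """Ceiling convention: "10" is shown while remaining time is in (9, 10],
    so the display reaches 0 exactly at the deadline."""
    return max(0, math.ceil(remaining_seconds))

print(countdown_display(10.0))  # 10
print(countdown_display(9.5))   # 10
print(countdown_display(0.5))   # 1
print(countdown_display(0.0))   # 0
```

Note this is ceiling, not flooring or rounding: for time *remaining* the natural convention flips.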
The writer makes an implicit assumption you just glance at the clock with no prior info.
Where seconds count, you can watch the clock until the moment it ticks. At that time you have greater precision. And due to your own innate sense of time, that precision decays for a little while (maybe even 30s?) rather than instantly disappearing.
I like to synchronize my clocks this way (when seconds are unavailable). Yes, it does mean I invest up to a minute more to set the time, and it's probably not worth it if you have to do it often (e.g. in an area prone to power outages).
Time intervals aren't floors or rounds but thresholds.
This is most evident above the level of hours, minutes, or seconds: we speak of a new day beginning (though the time at which this occurs has varied culturally and regionally), or a new year, decade, century, or millennium (with their own threshold arguments courtesy the AWOL Year Nil in the Julian/Gregorian calendrical system).
Birthdates are an interesting example. In Western/European cultures, it's typical to count by full years elapsed, such that an infant within its first 12 months is 0 years old. In Asian cultures, the age is given as "1", which is reflected in an occasionally-used Western notion of "being in my nth year". That is, a 49 year old is in their 50th year.
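The two conventions differ by a simple offset. A sketch, under the simplifying assumption that the East Asian count is just "current year of life" (the traditional reckoning also increments at the New Year, which is ignored here):

```python
from datetime import date

def western_age(born: date, today: date) -> int:
    """Completed years: an infant is 0 during its first twelve months."""
    had_birthday = (today.month, today.day) >= (born.month, born.day)
    return today.year - born.year - (0 if had_birthday else 1)

def year_of_life(born: date, today: date) -> int:
    """'In my nth year': one more than the completed years."""
    return western_age(born, today) + 1

b, t = date(1975, 6, 1), date(2024, 7, 1)
print(western_age(b, t), year_of_life(b, t))  # 49 50
```

Same threshold logic as clocks: one convention counts completed intervals, the other names the interval you are currently in.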
What a clock (or calendar) indicates generally is what time period you are presently in, that is, the second of a minute, the minute of an hour, the hour of a day, day of a month, etc., etc. And what's relevant from this point of view is that the threshold is what matters.
The other problem touched on in this essay is estimating the time remaining to a future threshold, and here, rounding is indeed useful. If you have a 10am meeting and the time presently reads 9:55am, you have fewer than five minutes to arrive punctually. But that is strictly a planning issue, not a timekeeping one.
> It would be weird if we rounded for years, months and days, that's for sure. I think most people think of those scales as intervals. In other words, July is a period of time, with a start and an end. So are years, centuries, seasons. We are inside of it or outside.
I feel like my sense of time is different from the author's. While it can be useful to round the current hour/minute on some occasions, the information about which exact segment of the day/hour you're in can also be very useful. I can certainly tell that I ask the question of "when exactly is it going to be 12:00?" far more often than "how many seconds have statistically likely elapsed in the current minute?"
The biggest issue for me is that the precise moment of when one minute/hour transitions into the next is important for people. Like, when coordinating an event or meeting, would you prefer it if your clock indicated the precise moment when 12:59:59 becomes 13:00:00 and told you to start the meeting, or would it be better if the clock instead told you that it was "13ish" and you'd have to wait out ~30 seconds by counting in your head?
This also causes a jarring discontinuity: now clocks with a ticking hour hand appear to run 30 seconds later than clocks without, turning on a digital clock's setting to show seconds offsets the display, and so on. Some people would celebrate New Year's, or other occasions that happen at a specific time, 30 seconds early because they no longer have a strong reference point.
[0] This video shows how clocks worked at my school: https://www.youtube.com/watch?v=jpU_lG_TPP4
[1] https://www.reddit.com/r/clocks/comments/10714a1/hard_time_f...
pvtmert | 1 year ago
Even when you involve physics, you are seeing the nanoseconds-old past, not the actual time (the time it takes light to travel from the clock to your eyes, plus your brain's processing delay via neurons, etc.).
Even the notion of being late works the same way. If something starts at 13:00 and you are not already there when it is 13:00:00.000, you are, by definition, late.
iainmerrick | 1 year ago
There's a reason why normal countdowns work the way they do. You care about the exact moment it hits zero, and with rounding you lose that.
Clocks are not 30 seconds late.
rkagerer | 1 year ago
The writer makes an implicit assumption that you just glance at the clock with no prior information.
Where seconds count, you can watch the clock until the moment it ticks. At that time you have greater precision. And due to your own innate sense of time, that precision decays for a little while (maybe even 30s?) rather than instantly disappearing.
I like to synchronize my clocks this way (when seconds are unavailable). Yes, it means investing up to a minute more to set the time, and it's probably not worth it if you have to do so often (e.g. in an area prone to power outages).
dredmorbius | 1 year ago
This is most evident above the level of hours, minutes, or seconds: we speak of a new day beginning (though the time at which this occurs has varied culturally and regionally), or a new year, decade, century, or millennium (with their own threshold arguments courtesy the AWOL Year Nil in the Julian/Gregorian calendrical system).
Birthdates are an interesting example. In Western/European cultures, it's typical to count by full years elapsed, such that an infant within its first 12 months is 0 years old. In some Asian cultures, the age is given as "1", which is reflected in an occasionally-used Western notion of "being in my nth year". That is, a 49-year-old is in their 50th year.
What a clock (or calendar) indicates generally is what time period you are presently in, that is, the second of a minute, the minute of an hour, the hour of a day, day of a month, etc., etc. And what's relevant from this point of view is that the threshold is what matters.
The other problem touched on in this essay is estimating the time remaining to a future threshold, and here, rounding is indeed useful. If you have a 10am meeting and the time presently reads 9:55am, you have fewer than five minutes to arrive punctually. But that is a planning issue, not a timekeeping one, strictly speaking.
tavavex | 1 year ago
I feel like my sense of time is different from the author's. While it can be useful to round the current hour/minute on some occasions, the information about which exact segment of the day/hour you're in can also be very useful. I can certainly tell that I ask the question of "when exactly is it going to be 12:00?" far more often than "how many seconds have statistically likely elapsed in the current minute?"
The biggest issue for me is that the precise moment of when one minute/hour transitions into the next is important for people. Like, when coordinating an event or meeting, would you prefer it if your clock indicated the precise moment when 12:59:59 becomes 13:00:00 and told you to start the meeting, or would it be better if the clock instead told you that it was "13ish" and you'd have to wait out ~30 seconds by counting in your head?
This also causes a jarring discontinuity: a rounding clock appears to run 30 seconds ahead of one that doesn't round, turning on a digital clock's seconds display shifts it by half a minute, and so on. Some people would end up celebrating New Year's, or other occasions that happen at a specific time, 30 seconds early because they no longer have a strong reference point.
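The half-minute discontinuity is concrete. A minimal sketch (my own illustration, with display functions I made up for this example): a rounding clock can be modeled as adding 30 seconds and then flooring, so at 12:59:30 it already reads 13:00 while a flooring clock still reads 12:59.

```python
from datetime import datetime, timedelta

def flooring_display(t: datetime) -> str:
    # Conventional clock: drop the seconds entirely.
    return t.strftime("%H:%M")

def rounding_display(t: datetime) -> str:
    # Round to the nearest minute: add 30 seconds, then floor.
    return (t + timedelta(seconds=30)).strftime("%H:%M")

t = datetime(2024, 1, 1, 12, 59, 30)
print(flooring_display(t))  # 12:59
print(rounding_display(t))  # 13:00
```

The two displays disagree for half of every minute, which is exactly the mismatch the comment describes between rounding clocks and clocks that show seconds.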