> This periodic adjustment mainly benefits scientists and astronomers as it allows them to observe celestial bodies using UTC for most purposes
This is incorrect: dealing with leap seconds is a major problem in astronomy, requiring a data-file update or even a recompile every time a new one is announced. Since they are only ever announced about six months in advance, this creates a lot of logistical problems.
Astronomy algorithms usually work in Barycentric Dynamical Time (TDB), Terrestrial Time, or UT1. In reality, the whole point of inventing UTC was so that people don't have to deal with those systems on a daily basis.
The process most astronomy programs go through is: take the UTC from the user and convert the Gregorian date to a Julian Day number, getting rid of the Gregorian calendar altogether. Then look up the number of leap seconds and add those to the JD to get International Atomic Time (TAI), then add 32.184 seconds to get Terrestrial Time. If Barycentric Dynamical Time is needed, you must first compute the velocity of the Earth relative to the solar system barycenter (which itself requires TDB), then compute the relativistic effects of that motion on Terrestrial Time. If you need UT1, it can only be obtained from observations published by the International Earth Rotation Service, which requires daily updates and interpolation between observed values.
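A rough Go sketch of the UTC-to-TT leg of that pipeline (the 37 s TAI-UTC offset, i.e. the 27 leap seconds since 1972 plus the initial 10 s offset, is hard-coded here for illustration; a real program reads it from the published leap-second table, and the JDN routine is the standard Fliegel & Van Flandern formula):

```go
package main

import "fmt"

// jdn computes the Julian Day Number for a Gregorian calendar date
// (Fliegel & Van Flandern; relies on division truncating toward
// zero, which Go's integer division does).
func jdn(y, m, d int) int {
	a := (m - 14) / 12
	return (1461*(y+4800+a))/4 +
		(367*(m-2-12*a))/12 -
		(3*((y+4900+a)/100))/4 +
		d - 32075
}

func main() {
	// Example date; 37 is TAI-UTC as in force since 2017-01-01 and
	// must come from a leap-second table in practice.
	const taiMinusUTC = 37.0
	jd := float64(jdn(2022, 7, 26)) // noon-based Julian Day Number
	tai := jd + taiMinusUTC/86400.0 // UTC -> TAI
	tt := tai + 32.184/86400.0      // TAI -> Terrestrial Time
	fmt.Printf("JD(TT) = %.6f\n", tt)
}
```

For the example date this prints JD(TT) = 2459787.000801; the 32.184 s constant is the fixed TT-TAI offset.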
So, as you can see, leap seconds do nothing to make an astronomer's life easier. In fact, we jump through a lot of hoops to make the average person's life easier. It sounds like Meta wants to change the definition of time for everyone just to make their programming a little easier. I find the very premise just outright ridiculous.
It's true that your average person will not be affected by the Sun setting a few seconds earlier. But the error will eventually build up enough that it has to be addressed, and Meta is just trying to make their problem someone else's problem.
Everyone is treating this as some idiosyncratic proposal of Facebook's, but removing leap seconds is a mainstream position. Representatives of the US, China, France, and a majority of other countries were in favor of this when it was discussed at the 2015 ITU meeting [1], though the UK and Russia were not. This has been under discussion since at least the 2003 ITU meeting in Turin.
[1] https://www.nature.com/articles/nature.2015.18855
Can't help but agree with 80% of this post but strongly disagree with the solution. This feels like a hack that punts the problems to the future.
UTC has the leap second because it's not "real time", and so now we're just going to never sync UTC up to real time at all? How is that the solution? Either we deal with leap seconds or we implement something that can't go backwards and properly models time. Leap seconds seem much simpler...
In the end we didn't get rid of SQL because of SQL injection. We fixed the frameworks and promoted the solutions. We may simply need to push for languages and libraries to properly support time, and promote how to do things correctly. It honestly seems easier.
A million problems started when system clocks were changed to follow UTC (as opposed to local time) and then UTC was conflated with Unix time - a fixed monotonic reference, which UTC is not!
Though the ship has sailed, I think it would have been much better if computers had been set to follow TAI (atomic clock time, unaffected by leap seconds) rather than UTC. UTC is as variable as local time and should have been treated as such.
If fb wants to - they can (and should) use TAI for their system reference.
For systems that leap seconds actually cause problems on, the solution is simply to use International Atomic Time (TAI) internally, and convert it to UTC when you want to display information to a user.
Every time I see ditching leap seconds come up, they never try to explain why TAI won't work for them, leading me to believe they probably just don't know it exists, nor could they even imagine something like it being invented.
> remain at the current level of 27, which we believe will be enough for the next millennium.
I would have loved to read more justification about _why_ Meta thinks we no longer need the leap second beyond calling it a community push. They did a great job of complaining about how hard it is to solve from a technical perspective, and then explained how they solved it. Is the only problem really that Meta doesn't know how to test a negative leap second?
Meta has it backwards, because they could already have made another choice. UTC is a representation of the offset, which will always need to be calculated somewhere. Much like how GPS counts the time since its epoch and broadcasts the offset, it can be used or not by the user. Those times are a calibration adjustment. For UTC, that adjustment is a reference to the current status provided by the IERS. That's literally the job of UTC.
Meta can simply stop applying a time offset in their reference, use TAI for forensics, and then have a separately calculated time when they need to display in that representation.
...Which is what happens already. System time is seconds since The Unix Epoch. All of those times are calculated and available, and always will be. They chose one of them, didn't like it, and invented a crummy workaround. They could've logged it in TAI and appended the TAI TZ to all their timestamps.
This is being done to work around poor coding choices they made, instead of making the computing fit reality. Basically "We chose smearing, which is a poopstain of tech debt, so we'll fix it by telling the whole world what time it is." That request is loaded with colossal hubris.
They might as well have the second redefined. The real operators do their thing and leave the squawking to those who want to self-identify as poor coders. Because if they don't want to account for a clock that jumps, they're kicking the can down the road and want everyone else to join them.
Turning it into a Y2K or Y2038 problem is a sad choice of saddling bigger tech debt on the rest of the world.
Calling it mainstream is as much of a narrative as permanent daylight savings time was. The software solutions exist and were deployed (at least throughout Linux and main userspace) by the June 2015 leap.
This is a Facebook problem that they're sloppily handling by pushing it on literally everyone else.
The bigger problem is that UTC (and TAI) is defined in a gravity well so it's not going to be very useful in the long run. GPS has to correct its clocks to keep track of what we slow Cesium/Rubidium down to on the surface, and Voyager's clock is going even faster. We clearly want an Earth-centric time standard for wall clocks and that is UTC. The Earth is not a precision time-piece so we will always have to adjust our wall clocks to its rotation. Realistically, we should probably be deriving a time standard where every day has a slightly different length and we record the timestamps of the beginning of each day relative to a universal monotonic clock in a log (with rollups to years, centuries, etc.) that we keep around as long as anyone cares exactly how many Cesium vibrations have happened since $whence.
If we want to actually solve the problem then let's switch to an interstellar time standard in a rest frame relative to the CMBR as far outside of gravity wells as possible and make that the universal monotonically increasing standard. Then computers can run on that time standard and UTC and friends can be derivatives.
A lot of people in this thread are criticizing this move, but let me offer an opposite view.
One of the largest electronic health records systems has code that predates the UNIX epoch. Much of the time handling code is custom written to deal with this. However, the code was so poorly written that the system would lose data during the repeated 1 a.m. window that occurs when daylight saving time ends. Hospitals would just shut off all of their computers during this window to deal with it.
As the article notes, issues with leap seconds have also brought down Reddit and Cloudflare. Many people in this thread are treating this like some sort of display of incompetence, but if you've ever written code that deeply interacts with time, you'd know how difficult it is to get right. A sign of a good system is one where it is difficult to fuck up.
IMO it is better to guarantee that time always moves forward rather than trying to match computer time to human time.
start := time.Now()
// do something
spent := time.Now().Sub(start)
It's worth noting that the Go time library is specifically designed so that computer clocks running backwards won't cause `spent` to be a negative duration: a monotonic clock that only ticks forward is used for time comparisons and subtractions. (Source: https://pkg.go.dev/time#hdr-Monotonic_Clocks)
It's frustrating that programmers want to redefine civil time just because it is "hard". This article glosses over the real world problems that detaching from UTC will cause.
This article details some of the problems: https://www.ucolick.org/~sla/leapsecs/dutc.html (you may want to scroll down to "Implementing the plan outlined at Torino").
If we end leap seconds, it doesn't take long - only until 2028 - until "midnight" is sufficiently far from "the middle of the night" that you will have to consider the legal issues caused by events that happen just before or after 0000 hours.
By 2055, the "minute" displayed on a clock may be incorrect, which again may cause issues with legal timestamps.
And by 2083, sundials are measurably wrong.
All because programmers wanted to save some lines of code.
> It's frustrating that programmers want to redefine civil time just because it is "hard". This article glosses over the real world problems that detaching from UTC will cause.
I agree, but I'm also - sad to say - less than surprised to find engineers at a Big Tech firm taking a high-handed, not to mention narrow and ill-informed, approach over the issue and trying to impose their will on a global scale. My worry here is that, Meta being Meta, they carry quite a lot of influence and may actually gain some traction.
EDIT: I'll add a bit more colour here. At the core of our platform we manage a database containing billions of legacy timestamped records (or events, if you prefer), adding more every day. Without even giving it a great deal of thought, I guarantee you that this proposal, should it be implemented, would cause us more problems than it solves and would distract us from more valuable investments of time and effort that would benefit our business. Sure, we could no doubt fix all these problems, but we've got better things to do. I imagine that many other businesses would be similarly affected and would take a similar view.
If you stop thinking about time as being wrong relative to what is officially correct, and instead see this whole exercise as error minimization, I think it is far easier to make the case for ending leap seconds than for keeping them.
This isn't just about lines of missing code. This is about forcing subterranean or submerged computers to surface. This is about out of sync clocks across information propagation networks across planets. This is about real lives that are ruined because time stamps didn't quite line up, causing delays, deaths, and needless headaches.
It doesn't need to be this way. We could just accept a minute of the clock being off from "true" midnight, which doesn't even make sense to me given that few people are right at the astronomic point where midnight is "true" midnight for their timezone. Heck, China is one big giant timezone so who is this actually for, really? The people that care about sundials? Most people don't even grow their own food.
We're no longer a sun-driven economy. Well coordinated timekeeping across devices that may not always be able to transfer data is far, far more important. If it's sufficiently wrong by the year 3422 then we'll deal with the fifteen minutes of annoyance then. This is a crazy premature optimization.
> It's frustrating that programmers want to redefine civil time just because it is "hard".
Yes. Problems with delay time going negative usually come from not using CLOCK_MONOTONIC for delay time. CLOCK_MONOTONIC is usually just the time since system startup. It comes from QNX (which, being hard real time, had to deal with this first), made it into the POSIX spec around 1996, and is now available on all major OSs. But there's still software that uses time of day where CLOCK_MONOTONIC is needed.
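Go's time package, for instance, wraps the monotonic clock transparently; a sketch of the distinction:

```go
package main

import (
	"fmt"
	"time"
)

// measure times fn using the monotonic clock: a time.Time from
// time.Now carries both a wall-clock and a monotonic reading, and
// Sub uses the monotonic one when both operands have it, so a
// wall-clock step (NTP adjustment, leap second) cannot make the
// result negative.
func measure(fn func()) time.Duration {
	start := time.Now()
	fn()
	return time.Now().Sub(start)
}

func main() {
	d := measure(func() { time.Sleep(10 * time.Millisecond) })
	fmt.Println(d > 0) // true

	// Round(0) strips the monotonic reading; differencing two
	// stripped Times is pure wall-clock arithmetic, which CAN go
	// backwards across a clock adjustment.
	wallOnly := time.Now().Round(0)
	_ = wallOnly
}
```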
Then there's the smearing approach. This document describes Facebook's smearing approach, which uses a different smearing period than Google's.
* Facebook/Meta: "We smear the leap second throughout 17 hours, starting at 00:00:00 UTC based on the time zone data (tzdata) package content." This is puzzling. What does the time zone package have to do with UTC?
* Google: 24-hour linear smear from noon to noon UTC.[1]
* AWS: Follows Google.
* US power grid: Starts at midnight UTC and takes a few hours while all the rotating machinery takes 60 extra turns to catch up.
>It's frustrating that programmers want to redefine civil time just because it is "hard". This article glosses over the real world problems that detaching from UTC will cause.
Yes, the actual problem exists, and ignoring/discarding reality (i.e. the "science" in computer science) will just cause further problems. If you and your modern stack of code can't handle the leap second, it's simply not production code.
> If we end leap seconds, it doesn't take long - only until 2028 - until "midnight" is sufficiently far from "the middle of the night" that you will have to consider the legal issues caused by events that happen just before or after 0000 hours.
I'm not sure what you're getting at here. If we stopped introducing leap seconds, then why would the legal world still care about them?
I'm honestly amazed to see so many people agree with this.
Timestamps are exactly what we define them to be. There is no correct and incorrect.
One option is to have a system with arbitrary unpredictable leaps to keep it synchronized to within 1 second of the mean solar time over Greenwich, England. Every computer system that has to deal with time accurately needs a lookup table for leap seconds that is occasionally amended, with only a couple months warning in advance.
Another option is to just let the clock run at a constant rate. In this case only astronomers have to keep track of the difference between solar time and clock time (which they already do anyway).
The fact that the difference will increase to an hour after several hundred years is utterly irrelevant. If people in the future care, they can simply adjust the timezone definitions to compensate, since timezones are already adjusted all the time.
When the sun is directly overhead it's meant to be 12:00 - IN THEORY!
However, as timezones are pretty wide, most of the time you'll be at least 15 minutes out. Sometimes you'll be out by as much as 3 hours - and you've probably never even noticed!
Telescopes already have to compensate for this (as well as for summer time).
Leap seconds make a shambles of bookkeeping too. What is "2022-07-17T12:00:00" + (60 x 60 x 24 x 365 x 5) seconds? No one knows! And the answer to that question will change depending on when you calculate it and which updates you have installed!
So I say ditch the leap second and let it drift. In a few hundred years we could update our timezones if we _really_ want to (timezone changing is actually pretty common, so code should already be handling this edge-case).
> In about 600 years TI will be ahead of UT1 by half an hour, and in about 1000 years the difference will be a full hour.
That's nothing. Time zones alone already create significantly larger errors. Belgrade and Sevilla share a time zone, but the solar meridian ("noon" on a sundial) is 12:44 in Belgrade and 14:30 in Sevilla. Obviously, the same error is present in the astronomical "middle of the night". This does not, in fact, create "legal issues" for Serbs or Spaniards.
In 600-1000 years, around the time that it would actually matter, we're going to have to reform the time system anyway to account for relativistic drift between the surface of the Earth and human settlements elsewhere in the solar system.
There's no need to "detach" from UTC. Just ensure that TAI (which is consistently free of leap seconds) is also supported on an equal status to UTC, for applications where it makes the most sense. Conflating the two would only increase confusion further.
We live in a world where civil time moves by an hour 2x a year for no good reason.
You FAR overstate the impact on civil society of failing to change it by a second every so often.
Ironically, even astronomers, whom leap seconds were originally for, don't benefit, because they need to know the Earth's rotation accurately to sub-second levels.
> By 2055, the "minute" displayed on a clock may be incorrect, which again may cause issues with legal timestamps.
I'm not following here. What defines "legal timestamps" in our current system? I'm unaware of any laws in the US that use the actual position of the sun to determine the time.
"Noon" when the sun is at the highest point, can vary over an hour across a timezone.
“Civil time” is also a construction that is flexible in many ways, so an influential group redefining it isn’t out of the norm. Note that timezones were introduced for railway purposes, and some countries play around with theirs a lot.
For “midnight” being far from “the middle of the night”, that’s already a reality for many Chinese living far enough from Beijing, or god forbid regions where “night” doesn’t mean much for half of the year.
For all intents and purposes, if a formal definition of time isn’t practical people come up with their own ways.
> it doesn't take long - only until 2028 - until "midnight" is sufficiently far from "the middle of the night"
Honestly from my perspective, 3am is the middle of the night (night-morning-afternoon-evening starts at 0-6-12-18 for me) and somewhere between 4 and 5 most people are probably asleep and the date change should occur. I can't count how often I've heard people clarify what 'tomorrow' means when the word is spoken after "midnight" but before going to sleep.
But yeah gotta pick something for the date change, it won't be worth the cost of change now. If we do end up ever switching to something like decimal time, this should be on the todo list though.
And I know "midnight" is historically supposed to be about the sun being the furthest from its zenith rather than in the middle between when you go to sleep and get up, however that occurs somewhere around 1am here (01:41 at its extreme, from July 17 till August 5th). If that's not enough to warrant a redefinition, 27 seconds accumulated since we started counting leap seconds are also not enough to warrant an update yet (following Facebook's logic here).
* "Most telescope pointing systems fail" (by 2027, with 5 s deviation from Earth's rotation). Pointing systems cannot blindly rely on UTC anyway, since (a) even with leap seconds UTC is up to 1 second off Earth's rotation, and (b) pointing a telescope depends on where the telescope is on Earth, so some offset must be added to UTC by a human.
* Hypothesized legal issues... give me a break.
It would be much less trouble for humanity to deal with this once every 100 years or so.
These "problems" are trivial. The day changes at midnight which is 12:00 AM by the clock. There is no ambiguity. Midnight is not literally the middle of the night. The minute on the clock will be correct by definition, nothing will change. Sundials are already wrong. You'll need to try a lot harder to convince me that this is a bad idea.
All these arguments based on sun position make no sense in a world where people already live in places where the sun literally never sets or never rises for months, and people already live in time zones offset many, many hours from "correct" time. The sky doesn't fall!
I don't see how you run into legal problems. The break from one day to the next still occurs at a well defined time, 23:59:59 + 1 second, or 00:00:00. Midnight isn't the middle of the night (or noon exactly at solar zenith), except on 15-degree meridians anyway. What will happen is that over time, those "golden" meridians will shift slightly. The only people who will notice are those using time for celestial navigation. Terrestrial navigation, which is almost entirely done with GPS these days, won't be affected at all (GPS already doesn't use leap seconds). And, yes, sundials will gradually get out of sync and eventually have to be rotated on their axes to stay right.
> only until 2028 - until "midnight" is sufficiently far from "the middle of the night" that you will have to consider the legal issues caused by events that happen just before or after 0000 hours.
I can't follow your logic here. In any relevant context midnight has a definition, typically UTC midnight in the applicable timezone. Eliminating leap-seconds would make the instant midnight occurs less ambiguous in 2028, because precise timing with leap-seconds is strictly harder than without. (and one can independently realize a time that closely follows TAI but one cannot independently realize UT1 without a VLBI radio telescope array, and one can't realize UT1-TAI without a datafeed because the decisions are subjective).
This isn't just a question of 'some lines of code'. Leap seconds cause widespread disruptions even when they don't occur, and they cause security vulnerabilities (slower and less secure systems, because they make synchronization unreliable). People are widely deploying "leap smeared" NTP servers to try to prevent some of the worst synchronization faults, but doing so makes it impractical to back out leap seconds to derive TAI (or a more accurate TT) from the system's UTC, particularly because systems don't know whether they're leap-smeared or not (and different smear sources use different smearing parameters).
Please consider that none of this actually matters if we ditched UTC for TAI. For one, time zones still exist and local solar time is already decoupled from clock time.
Why does ntpd lose the smear on a restart? I would have thought that the current smear could be calculated purely based off current non-smear time, plus the config to say when to smear, which is presumably available upon restart.
Also, why were non-linear smears thought to be desirable? Googling just turns up hand-wavy phrases like "easier on clients".
I can't quite reconcile the FB attitude of "we only hire the best and brightest after making them demonstrate their technical prowess" vs "computers are a bit hard please can everyone change everything to make it easier?"
Why not just agitate for a move to French Revolutionary Decimal Time as well?
This is not a case of "computers are a bit hard"; it's a system that we've imposed upon ourselves that has been repeatedly demonstrated to be unsafe. Because of this, we've ended up with many (many) solutions baked into software that attempt to abstract away the sharp edges of this problem from individual engineers, which leads to inconsistent assumptions about what you need to consider when writing code.
Even with the best and brightest engineers there is a non-trivial probability that someone will make an invalid assumption based on their understanding of what can happen (e.g. "leap seconds never go negative!") or of the library they're using ("this ensures monotonic time!"), which could lead to disastrous results. And especially at Meta scale, that probability is no longer "will someone make this mistake in our code?" but "how many times will people make this mistake in our code?", so systemic solutions that eliminate this as a class of problem an individual can create are something we should consider.
My exact first thought as well, which I will readily admit comes mostly from my bottomless contempt for the company and its employees. The thing that really needs to be left in the past is Facebook.
A large part of the problem is that in software there is a traditional conflation of (a) time marks to measure elapsed time and (b) events where we want to know what their wall-clock date/time is. Those should in principle be kept separate. One can use a monotonic elapsed-time clock for the former and a calendar/wall-clock based clock for the latter. Conversion between the two shouldn’t be done gratuitously, and have to be done with the awareness that the mapping to future wall-clock dates (and sometimes also to past ones) is subject to change. APIs and data types reifying that distinction would be helpful.
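One way that distinction could be reified in an API, sketched in Go (the type names here are invented for illustration):

```go
package main

import (
	"fmt"
	"time"
)

// Mark is an opaque monotonic reading: good only for measuring
// elapsed time, with deliberately no calendar accessors.
type Mark struct{ t time.Time }

// Stamp is a wall-clock event time: "when did this happen",
// subject to clock adjustments and time zone rules.
type Stamp struct{ t time.Time }

func NowMark() Mark { return Mark{time.Now()} }

// Round(0) drops the monotonic reading, leaving pure wall-clock time.
func NowStamp() Stamp { return Stamp{time.Now().Round(0)} }

// Elapsed is the only arithmetic Mark supports.
func (m Mark) Elapsed() time.Duration { return time.Since(m.t) }

// String renders the event time; conversion between Mark and Stamp
// is deliberately not provided.
func (s Stamp) String() string { return s.t.UTC().Format(time.RFC3339) }

func main() {
	m := NowMark()
	// ... work ...
	fmt.Println(m.Elapsed() >= 0) // true
	fmt.Println(NowStamp())       // e.g. 2022-07-26T09:00:00Z
}
```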
This article makes the claim that the calculation of elapsed time is impacted by leap seconds, to the extent that the following value of “elapsed” may even be negative:
This is not correct. Leap seconds are only the concern of how to render time in a human readable way. System time itself always passes one second at a time, and calls to time.now() interspersed with sleep(1) will always go up by one second at a time. It’s just that at some times of year we render that as 23:59:60 and some times we don’t.
But perhaps clock smearing is genuinely changing the way systems measure time, as opposed to smearing how the system renders t=1658818054.0 as hours, minutes and seconds? That seems implausibly incorrect. Lining up the minute hand with the noonday sun on one particular part of the planet should not possibly have any impact on measuring how long it takes to compile my code.
Fun thought: we have only ever had positive leap seconds so far, due to the slowing of the Earth's rotation. However, we could have a negative leap second; the rotation has just been consistently slowing since we started caring, but it could speed up again in the future. We can predict rotation speed to a certain degree, but not completely. That is why leap seconds are announced only a few months in advance, instead of years.
I don't think we will ever need a negative leap second. We can just wait longer until dispatching the next positive one. The only situation in which we'd need a negative leap second is if Earth's rotation were consistently speeding up over many years. But we can tolerate some wiggle room (as in, several seconds) between UTC and TAI (since it's not in lockstep anyway).
> Google, Microsoft, Meta and Amazon launched a public effort Monday to scrap the leap second, an occasional extra tick that keeps clocks in sync with the Earth's actual rotation. US and French timekeeping authorities concur.
> ... The tech giants and two key agencies agree that it's time to ditch the leap second. Those are the US National Institute of Standards and Technology (NIST) and its French equivalent, the Bureau International des Poids et Mesures (BIPM).
Weird, I came to a different conclusion after reading the article. There's already a graceful solution to non-monotonic time, which mitigates most of the problems: smear, don't leap. Only, it's not a universal solution, so various systems are out of sync during the smear. Solution: petition for a standardized smearing strategy. But yeah, leave "leap" seconds in the past.
And, maybe, don't run sub-second benchmarks with a wallclock.
If smearing were adopted as standard, the "seconds," "minutes," and "hours" appearing in timestamps would no longer correspond to literal seconds, minutes, and hours of duration, even in principle. That seems very misleading and bad.
Smearing is absolutely the worst of all possible choices. Instead of one second, you are out of sync with the whole world for all of 24 hours. And you are fooling with things for the whole period.
Of course it was Google who picked the worst of all possible choices.
[+] [-] gmiller123456|3 years ago|reply
Every time I see ditching leap seconds come up, they never try to explain why TAI won't work for them, leading me to believe they probably just don't know it exists, nor could they even imagine something like it being invented.
[+] [-] treesknees|3 years ago|reply
I would have loved to read more justification about _why_ Meta thinks we no longer need the leap second beyond calling it a community push. They did a great job of complaining about how hard it is to solve from a technical perspective, and then explained how they solved it. Is the only problem really that Meta doesn't know how to test a negative leap second?
[+] [-] poppafuze|3 years ago|reply
Meta can simply stop applying a time offset in their reference, use TAI for forensics, and then have a separately calculated time when they need to display in that representation.
...Which is what happens already. System time is seconds since The Unix Epoch. All of those times are calculated and available, and always will be. They chose one of them, didn't like it, and invented a crummy workaround. They could've logged it in TAI and appended the TAI TZ to all their timestamps.
This is being done to work around poor coding choices they made, instead of making the computing fit reality. Basically "We chose smearing, which is a poopstain of tech debt, so we'll fix it by telling the whole world what time it is." That request is loaded with colossal hubris.
They might as well have the second redefined. The real operators do their thing and leave the squawking to those who want to self-identify a poor coders. Because if they don't want to account for a clock that jumps, then they're kicking the can down the road and they want everyone else to join.
Turning it into a Y2K or Y2038 problem is a sad choice of saddling bigger tech debt on the rest of the world.
Calling it mainstream is as much of a narrative as permanent daylight savings time was. The software solutions exist and were deployed (at least throughout Linux and main userspace) by the June 2015 leap.
This is a Facebook problem that they're sloppily handling by pushing it on literally everyone else.
[+] [-] benlivengood|3 years ago|reply
If we want to actually solve the problem then let's switch to an interstellar time standard in a rest frame relative to the CMBR as far outside of gravity wells as possible and make that the universal monotonically increasing standard. Then computers can run on that time standard and UTC and friends can be derivatives.
[+] [-] lavishlatern|3 years ago|reply
One of the largest electronic health records systems has code that predates the UNIX epoch. Much of the time handling code is custom written to deal with this. However, the code was so poorly written that the system would lose data during the repeated 1 am window that occurs during the daylight saving time shift. Hospitals would just shut off all of their computers during this time to deal with it.
As the article notes, issues with leap seconds have also brought down reddit and cloudflare. Many people in this thread are treating this like some sort of display of incompetence, but if you've ever written code that deeply interacts with time, you'd know how difficult it is to get right. A sign of a good system is one where it is difficult to fuck up.
IMO it is better to guarantee that time always moves forward rather than trying to match computer time to human time.
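The "time always moves forward" guarantee is exactly what a monotonic clock provides. A minimal sketch in Python (assuming only the standard library): unlike the wall clock, `time.monotonic()` is immune to NTP steps and leap-second adjustments, so elapsed-time measurements can never go negative:

```python
import time

# Measure elapsed time with the monotonic clock, which cannot jump
# backwards even if the system's wall clock is stepped or smeared.
start = time.monotonic()
time.sleep(0.01)
elapsed = time.monotonic() - start

# This holds unconditionally; the same subtraction done with
# time.time() can go negative if the wall clock is set backwards.
assert elapsed >= 0
print(round(elapsed, 2))
```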
[+] [-] makeworld|3 years ago|reply
Source: https://pkg.go.dev/time#hdr-Monotonic_Clocks
[+] [-] LeoPanthera|3 years ago|reply
This article details some of the problems: https://www.ucolick.org/~sla/leapsecs/dutc.html
(You may want to scroll down to "Implementing the plan outlined at Torino".)
If we end leap seconds, it doesn't take long - only until 2028 - before "midnight" is sufficiently far from "the middle of the night" that you will have to consider the legal issues caused by events that happen just before or after 0000 hours.
By 2055, the "minute" displayed on a clock may be incorrect, which again may cause issues with legal timestamps.
And by 2083, sundials are measurably wrong.
All because programmers wanted to save some lines of code.
[+] [-] bartread|3 years ago|reply
I agree, but I'm also - sad to say - less than surprised to find engineers at a Big Tech firm taking a high-handed, not to mention narrow and ill-informed, approach over the issue and trying to impose their will on a global scale. My worry here is that, Meta being Meta, they carry quite a lot of influence and may actually gain some traction.
EDIT: I'll add a bit more colour here. At the core of our platform we manage a database containing billions of legacy timestamped records (or events, if you prefer), adding more and more every day. Without even giving it a great deal of thought, I guarantee you that this proposal, should it be implemented, will cause us more problems than it solves, and will distract us from making more valuable investments of time and effort that would benefit our business. Sure, we can no doubt fix all these problems, but we've got better things to do. I imagine that many other businesses would be similarly affected and would take a similar view.
I wholeheartedly oppose it.
[+] [-] 3pt14159|3 years ago|reply
This isn't just about lines of missing code. This is about forcing subterranean or submerged computers to surface. This is about out of sync clocks across information propagation networks across planets. This is about real lives that are ruined because time stamps didn't quite line up, causing delays, deaths, and needless headaches.
It doesn't need to be this way. We could just accept a minute of the clock being off from "true" midnight, which doesn't even make sense to me given that few people are right at the astronomic point where midnight is "true" midnight for their timezone. Heck, China is one big giant timezone so who is this actually for, really? The people that care about sundials? Most people don't even grow their own food.
We're no longer a sun-driven economy. Well coordinated timekeeping across devices that may not always be able to transfer data is far, far more important. If it's sufficiently wrong by the year 3422 then we'll deal with the fifteen minutes of annoyance then. This is a crazy premature optimization.
[+] [-] Animats|3 years ago|reply
Yes. Problems with delay time going negative usually come from not using CLOCK_MONOTONIC for delay time. CLOCK_MONOTONIC is usually just the time since system startup. It comes from QNX (which, being hard real time, had to deal with this first), made it into the POSIX spec around 1996, and is now available on all major OSs. But there's still software that uses time of day where CLOCK_MONOTONIC is needed.
Then there's the smoothing approach. This document describes Facebook's smoothing approach, which uses a different smoothing period than Google's.
* Facebook/Meta: "We smear the leap second throughout 17 hours, starting at 00:00:00 UTC based on the time zone data (tzdata) package content." This is puzzling. What does the time zone package have to do with UTC?
* Google: 24-hour linear smear from noon to noon UTC.[1]
* AWS: Follows Google.
* US power grid: Starts at midnight UTC and takes a few hours while all the rotating machinery takes 60 extra turns to catch up.
Not sure what telecom is doing.
[1] https://developers.google.com/time/smear
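For a concrete sense of what a linear smear does, here is a minimal sketch of the 24-hour noon-to-noon scheme Google documents: rather than inserting a 61st second, every smeared second runs slightly long, spreading the extra second evenly across the window. (The function and constant names here are illustrative, not any real NTP API.)

```python
SMEAR_WINDOW = 86_400  # seconds: noon UTC before the leap to noon UTC after

def smear_offset(elapsed: float) -> float:
    """Seconds the smeared clock lags true UTC, `elapsed` seconds into
    the smear window, for a positive leap second."""
    elapsed = max(0.0, min(elapsed, SMEAR_WINDOW))
    return elapsed / SMEAR_WINDOW

print(smear_offset(0))       # 0.0 - smear begins at noon UTC
print(smear_offset(43_200))  # 0.5 - half a second behind at midnight
print(smear_offset(86_400))  # 1.0 - the full leap second is absorbed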
[+] [-] jefftk|3 years ago|reply
Since the article was published we've gone from positive leap seconds every so often to looking like we may get the first ever negative leap second: https://upload.wikimedia.org/wikipedia/commons/f/fb/Leapseco...
Which means the article's estimates on how long it will be until we're off by a given amount of time are very much obsolete.
[1] https://web.archive.org/web/20030801000000*/https://www.ucol...
[+] [-] 1970-01-01|3 years ago|reply
Yes, the actual problem exists, and ignoring/discarding reality (i.e. the "science" in computer science) will just cause further problems. If you and your modern stack of code can't handle the leap second, it's simply not production code.
[+] [-] lesam|3 years ago|reply
Astronomy already ignores leap seconds in the same way that modern finance ignores half-crowns and shillings.
[+] [-] DSMan195276|3 years ago|reply
I'm not sure what you're getting at here. If we stopped introducing leap seconds, then why would the legal world still care about them?
[+] [-] zJlG|3 years ago|reply
Timestamps are exactly what we define them to be. There is no correct and incorrect.
One option is to have a system with arbitrary unpredictable leaps to keep it synchronized to within 1 second of the mean solar time over Greenwich, England. Every computer system that has to deal with time accurately needs a lookup table for leap seconds that is occasionally amended, with only a couple months warning in advance.
Another option is to just let the clock run at a constant rate. In this case only astronomers have to keep track of the difference between solar time and clock time (which they already do anyway).
The fact that the difference will increase to an hour after several hundred years is utterly irrelevant. If people in the future care, they can simply adjust the timezone definitions to compensate, since timezones are already adjusted all the time.
[+] [-] philwelch|3 years ago|reply
> In about 600 years TI will be ahead of UT1 by half an hour, and in about 1000 years the difference will be a full hour.
That's nothing. Time zones alone already create significantly larger errors. Belgrade and Sevilla share a time zone, but the solar meridian ("noon" on a sundial) is 12:44 in Belgrade and 14:30 in Sevilla. Obviously, the same error is present in the astronomical "middle of the night". This does not, in fact, create "legal issues" for Serbs or Spaniards.
In 600-1000 years, around the time that it would actually matter, we're going to have to reform the time system anyway to account for relativistic drift between the surface of the Earth and human settlements elsewhere in the solar system.
[+] [-] btilly|3 years ago|reply
You FAR overstate the impact on civil society of failing to change it by a second every so often.
Ironically, even astronomers, whom leap seconds were originally for, don't benefit, because they need to know the Earth's rotation accurately to subsecond levels anyway.
[+] [-] ZetaZero|3 years ago|reply
I'm not following here. What defines "legal timestamps" in our current system? I'm unaware of any laws in the US that use the actual position of the sun to determine the time.
"Noon" when the sun is at the highest point, can vary over an hour across a timezone.
[+] [-] makeitdouble|3 years ago|reply
For “midnight” being far from “the middle of the night”, that’s already a reality for many Chinese living far enough from Beijing, or god forbid regions where “night” doesn’t mean much for half of the year.
For all intents and purposes, if a formal definition of time isn’t practical people come up with their own ways.
[+] [-] lucb1e|3 years ago|reply
Honestly from my perspective, 3am is the middle of the night (night-morning-afternoon-evening starts at 0-6-12-18 for me) and somewhere between 4 and 5 most people are probably asleep and the date change should occur. I can't count how often I've heard people clarify what 'tomorrow' means when the word is spoken after "midnight" but before going to sleep.
But yeah gotta pick something for the date change, it won't be worth the cost of change now. If we do end up ever switching to something like decimal time, this should be on the todo list though.
And I know "midnight" is historically supposed to be about the sun being the furthest from its zenith rather than in the middle between when you go to sleep and get up, however that occurs somewhere around 1am here (01:41 at its extreme, from July 17 till August 5th). If that's not enough to warrant a redefinition, 27 seconds accumulated since we started counting leap seconds are also not enough to warrant an update yet (following Facebook's logic here).
[+] [-] bhk|3 years ago|reply
That article is ridiculous.
* "Most telescope pointing systems fail" (by 2027) (with 5s deviation from earth rotation). Pointing systems cannot blindly rely on UTC anyway, since (a) even with leap seconds UTC is up to 1 second off earth's rotation, and (b) pointing a telescope depends on where the telescope is on earth, so some offset must be added to UTC by some human.
* Hypothesized legal issues... give me a break.
It would be much less trouble for humanity to deal with this once every 100 years or so.
[+] [-] modeless|3 years ago|reply
All these arguments based on sun position make no sense in a world where people already live in places where the sun literally never sets or never rises for months, and people already live in time zones offset many, many hours from "correct" time. The sky doesn't fall!
[+] [-] BrainVirus|3 years ago|reply
No, it's because programmers don't want basic timekeeping in all devices to involve the worst issues of managing a distributed system.
[+] [-] metacritic12|3 years ago|reply
Sundials being "measurably" wrong only becomes an issue over long spans. For example, no one wants solar noon to be at 10 AM -- but that takes a while.
[+] [-] walnutclosefarm|3 years ago|reply
[+] [-] nullc|3 years ago|reply
I can't follow your logic here. In any relevant context midnight has a definition, typically UTC midnight in the applicable timezone. Eliminating leap-seconds would make the instant midnight occurs less ambiguous in 2028, because precise timing with leap-seconds is strictly harder than without. (and one can independently realize a time that closely follows TAI but one cannot independently realize UT1 without a VLBI radio telescope array, and one can't realize UT1-TAI without a datafeed because the decisions are subjective).
This isn't just a question of 'some lines of code'. Leapseconds cause widespread disruptions even when they don't occur, they cause security vulnerabilities (and slower and less secure systems because they make synchronization unreliable). People are widely deploying "leap smeared" NTP servers to try to prevent some of the worst synchronization faults, but doing so makes it impractical to back out leap seconds to derive TAI (or a more accurate TT) from the system's UTC, particularly because systems don't know if they're leapsmeared or not (and different smear sources use different smearing parameters).
[+] [-] unknown|3 years ago|reply
[deleted]
[+] [-] unknown|3 years ago|reply
[deleted]
[+] [-] prpl|3 years ago|reply
The worst case scenario is things are different.
[+] [-] jheitmann|3 years ago|reply
Also, why were non-linear smears thought to be desirable? Googling just turns up hand-wavy phrases like "easier on clients".
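One hedged guess at the "easier on clients" claim, based on the smears Google's documentation describes (an early cosine smear and the current 24-hour linear smear): a cosine-shaped smear ramps the frequency change in gradually, so clients see no abrupt rate step at the window edges. A small comparison sketch (illustrative functions, not any real API):

```python
import math

WINDOW = 86_400.0  # smear window in seconds

def linear_offset(t: float) -> float:
    """Linear smear: constant rate change across the whole window."""
    return t / WINDOW

def cosine_offset(t: float) -> float:
    """Cosine smear: rate (the derivative) is ~0 at both endpoints,
    so the clock frequency eases in and out instead of stepping."""
    return (1.0 - math.cos(math.pi * t / WINDOW)) / 2.0

# Both absorb exactly one second over the window; they differ only in
# how the rate change is distributed across it.
print(linear_offset(WINDOW), cosine_offset(WINDOW))
```

The trade-off is that the non-linear smear's peak rate error (mid-window) is higher than the linear smear's constant rate error, which may be why the linear form ended up as the de facto standard.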
[+] [-] edent|3 years ago|reply
Why not just agitate for a move to French Revolutionary Decimal Time as well?
[+] [-] dastbe|3 years ago|reply
even with the best and brightest engineers there is a non-trivial probability that someone will make an assumption that is invalid based on their understanding of what can happen (ex. leap seconds never go negative!) or the library they're using (this ensures monotonic time!) that could lead to disastrous results. and especially at meta scale, that probability is no longer "will someone make this mistake in our code?" but "how many times will people make this mistake in our code?", so systemic solutions that eliminate this as a class of problem an individual can create are something we should consider.
[+] [-] gorgoiler|3 years ago|reply
But perhaps clock smearing is genuinely changing the way systems measure time, as opposed to smearing how the system renders t=1658818054.0 as hours, minutes and seconds? That seems implausibly incorrect. Lining up the minute hand with the noonday sun on one particular part of the planet should not possibly have any impact on measuring how long it takes to compile my code.
[+] [-] dexwiz|3 years ago|reply
https://www.timeanddate.com/time/negative-leap-second.html#:....
[+] [-] kstrauser|3 years ago|reply
The hubris of “we’d rather not do this, so let’s make the entire rest of the world deal with it instead” is impressive.
[+] [-] neuronexmachina|3 years ago|reply
> Google, Microsoft, Meta and Amazon launched a public effort Monday to scrap the leap second, an occasional extra tick that keeps clocks in sync with the Earth's actual rotation. US and French timekeeping authorities concur.
> ... The tech giants and two key agencies agree that it's time to ditch the leap second. Those are the US National Institute of Standards and Technology (NIST) and its French equivalent, the Bureau International des Poids et Mesures (BIPM).
[+] [-] klyrs|3 years ago|reply
And, maybe, don't run sub-second benchmarks with a wallclock.
[+] [-] ncmncm|3 years ago|reply
Of course it was Google who picked the worst of all possible choices.