Given that 100 000 000 seconds is approximately 3 years 2 months, we are going to see an event like this every few years.
I believe the most spectacular event is going to be the Unix timestamp 2 000 000 000, which is still 9½ years away: 2033-05-18 03:33:20 UTC. Such an event occurs only once every 31 years 8 months, approximately!
> I believe the most spectacular event is going to be the Unix timestamp 2 000 000 000, which is still 9½ years away: 2033-05-18 03:33:20 UTC. Such an event occurs only once every 31 years 8 months, approximately!
Egads! 31 years! I spent my late '90s mudding[0] and for some reason we had a lot of save files named by their epoch timestamp. When I ended up responsible for parts of the code base, I spent a lot of time dealing with those files, and they were all in the 800- or 900-million range. At some point I was pretty much able to tell at a glance roughly what date any number in that range corresponded to, within perhaps a few weeks.
I remember staying up late to see the tick over from 999,999,999 to 1 billion, thinking "I'll remember this week my whole life". Little did I realise that 60 hours later the whole world would remember.
Timestamp 1000000000 (Sat 2001-09-08 18:46:40 PDT) triggered a bug in the bug reporting system (Remedy) we were using at the time.
The system stored timestamps as a string representing seconds since the epoch, but it assumed it would fit in 9 digits. At 1000000000, it started dropping the last digit, so it went back to Sat 1973-03-03 01:46:40 PST, advancing at 10% of real time. It was fixed fairly quickly.
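A minimal sketch of that failure mode, assuming the bug simply kept the first 9 characters of the decimal string (the helper name is mine, not Remedy's actual code):

```python
from datetime import datetime, timezone

def truncate_to_9_digits(ts: int) -> int:
    # Keep only the first 9 characters of the decimal string,
    # as the buggy field did once timestamps grew to 10 digits.
    return int(str(ts)[:9])

t = 1_000_000_000                 # Sun 2001-09-09 01:46:40 UTC
broken = truncate_to_9_digits(t)  # 100_000_000
print(datetime.fromtimestamp(broken, tz=timezone.utc))
# 1973-03-03 09:46:40+00:00, i.e. 01:46:40 PST -- and since dropping the
# last digit divides by ~10, the stored clock advances at 10% of real time.
```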
I didn't remember it was so close, but those were the days when I obsessively read Slashdot, which helped during 9/11 and certainly covered the epoch event.
I was doing the late shift on a trading floor at a big bank.
The head of the derivatives tech support team pointed out it was about to hit so we opened up a shell and did a "watch" command + outputting the "date" command in epoch seconds and watched it happen.
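That setup is easy to recreate; a rough Python equivalent of `watch -n 1 date +%s` (the iteration cap is mine, so the sketch terminates):

```python
import time

def watch_epoch_seconds(iterations: int = 3) -> None:
    # Reprint the current Unix timestamp once per second,
    # roughly what the watch/date combination showed.
    for _ in range(iterations):
        print(int(time.time()))
        time.sleep(1)

watch_epoch_seconds()
```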
I remember that moment! I was out at a bar or something at the time but I was prepared and had my laptop with me haha. I was mashing the up arrow and enter to make sure I didn’t miss it.
One of my favorite bits of Vinge's A Deepness in the Sky is the use of base-10 time: ksec, Msec, etc. There is a nice time log scale with Earth time to base-10 time conversions.
Yes! It is as a direct result of that book that I now know without having to look it up that a ksec is about a quarter hour and a Msec is on the order of a fortnight, which comes in handy when doing back-of-envelope estimation more often than you'd expect. (I'd already known that a Gsec was about a third of a century thanks to Tom Duff's observation.[0]) I don't see us moving to such a system anytime soon in general (tying to the circadian cycle is just too convenient) but I'm a little surprised I don't see it more often in discussions of humans in space.
[0] "How many seconds are there in a year? If I tell you there are 3.155 x 10^7, you won't even try to remember it. On the other hand, who could forget that, to within half a percent, pi seconds is a nanocentury." --Tom Duff
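Those rules of thumb check out; a quick sanity pass on the arithmetic, taking the Gregorian year of 365.2425 days:

```python
import math

MINUTE, DAY = 60, 86_400
YEAR = 365.2425 * DAY  # Gregorian year in seconds

print(1e3 / MINUTE)  # ksec ~= 16.7 min: "about a quarter hour"
print(1e6 / DAY)     # Msec ~= 11.6 days: on the order of a fortnight
print(1e9 / YEAR)    # Gsec ~= 31.7 years: about a third of a century

# Duff's nanocentury: a century is ~3.156e9 s, so 1e-9 of it is ~3.156 s,
# within half a percent of pi.
nanocentury = 100 * YEAR * 1e-9
print(abs(nanocentury - math.pi) / math.pi)  # ~0.0045
```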
I love that others get excited about this. UNIX Timeval Aficionados should try out this tf tool [1]. I used my buddy's C/Lex/Yacc one daily for 1.5 decades, then ported it to Golang + Homebrew to share the love:
Instant bookmark for me. I've always loved the idea of measuring time in computers by a single integer like the timestamp does, but it always seems like such a pain to work with outside of that.
Because the bases are all wrong. Common number bases are 10, 16, maybe 8 if you live in the 70s, and 2.
Except for the utterly unwieldy binary, none of those bases adapt well to the bases used in representing time, which are mostly the (partially related) bases 60, 12, and, annoyingly, thirty-ish.
So you always end up doing opaque arithmetic instead of “just looking at the digits” (which you still can do in decimal for century vs years for example, because we defined centuries to be exactly that).
>The Unix epoch is midnight on January 1, 1970. It's important to remember that this isn't Unix's "birthday" -- rough versions of the operating system were around in the 1960s. Instead, the date was programmed into the system sometime in the early 70s only because it was convenient to do so, according to Dennis Ritchie, one of the engineers who worked on Unix at Bell Labs at its inception.
>"At the time we didn't have tapes and we had a couple of file-systems running and we kept changing the origin of time," he said. "So finally we said, 'Let's pick one thing that's not going to overflow for a while.' 1970 seemed to be as good as any."
Yesterday, I was digging into some stuff in the database and saw some events scheduled for 17*. My initial reaction was that it was some far-off date. Then I realized ... nope, not far away at all.
There's a lot of epoch love in the comments. For me, it's never "clicked". I assumed that after seeing a ton of timestamps I'd have a Neo-seeing-the-Matrix moment with them, but it just hasn't happened. Can you all easily decode them?
Is there talk anywhere of using a human-readable timestamp instead? e.g. YYYYMMddHHmmssSSSSZ
Sure there is. But since it is not one continuous run of digits, there are fixed separators (-, T, :, .) between the parts. That is the JavaScript time format, which is a subset of the RFC 3339 and ISO 8601 time formats. The separators at least allow for a variable number of sub-second digits.
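A sketch of both renderings of the same instant, using Python's datetime (the compact pattern is the one suggested above, minus the sub-second digits):

```python
from datetime import datetime, timezone

t = datetime.fromtimestamp(1_700_000_000, tz=timezone.utc)

# RFC 3339 / ISO 8601, with the -, T, :, . separators:
print(t.isoformat())               # 2023-11-14T22:13:20+00:00
# The separator-free YYYYMMddHHmmss form:
print(t.strftime("%Y%m%d%H%M%S"))  # 20231114221320
```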
perihelions|2 years ago
Edit: here was the front page of the New York Times at 1600000034,
https://web.archive.org/web/20200913122714/https://www.nytim...
and here's 1500000301 and 1400000634, and 1300007806
https://web.archive.org/web/20170714024501/http://www.nytime...
https://web.archive.org/web/20140513170354/http://www.nytime...
https://web.archive.org/web/20110313091646/http://www.nytime...
benatkin|2 years ago
susam|2 years ago
My own blog post here commemorating the event: https://susam.net/maze/unix-timestamp-1600000000.html
By the way, here's 1700000000 on Python:
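A minimal sketch of that lookup, converting both this milestone and the next one:

```python
from datetime import datetime, timezone

for ts in (1_700_000_000, 2_000_000_000):
    print(ts, datetime.fromtimestamp(ts, tz=timezone.utc))
# 1700000000 2023-11-14 22:13:20+00:00
# 2000000000 2033-05-18 03:33:20+00:00
```

For the GNU and BSD `date` variants mentioned next, the equivalent one-liners are `date -u -d @1700000000` and `date -u -r 1700000000` respectively.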
GNU date (Linux): BSD date (macOS, FreeBSD, OpenBSD, etc.):
scbrg|2 years ago
Weird environments foster weird super powers.
[0] https://en.wikipedia.org/wiki/Multi-user_dungeon
midasuni|2 years ago
reliablereason|2 years ago
I might even get to experience 3333333333 if I am lucky. What a day, what a day, yes indeed!
RedCinnabar|2 years ago
xavdid|2 years ago
function_seven|2 years ago
“Boss! We’re being dee dossed!”
“No, son, it’s Tuesday”
SLWW|2 years ago
I wonder if you can find a shirt that would print that
helsinki|2 years ago
pests|2 years ago
mi_lk|2 years ago
ta1243|2 years ago
_kst_|2 years ago
mongol|2 years ago
petrikapu|2 years ago
shizcakes|2 years ago
alexpotato|2 years ago
Then we went back to work.
cryptoz|2 years ago
clarkmoody|2 years ago
blahedo|2 years ago
cpeterso|2 years ago
NoMoreNicksLeft|2 years ago
russellbeattie|2 years ago
Assuming I live that long, the next day will be my 65th birthday. Just in time for digital Armageddon.
diego_sandoval|2 years ago
jrockway|2 years ago
neomantra|2 years ago
[1] https://github.com/neomantra/tf
Printing out these round ones. `tf` auto-detects 10-digit timestamps, so I started there in the `seq`. Some funny dates. `-g` detects multiple on a line, `-d` includes the date. Enjoy... may it save you time figuring out time!
xyproto|2 years ago
It's a special day, since the next round UNIX day is 30000, at 2052-02-20.
https://github.com/xyproto/ud/
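That arithmetic checks out; "UNIX day" n is n × 86400 seconds after the epoch:

```python
from datetime import date, timedelta

epoch = date(1970, 1, 1)
# Day 30000 after the epoch:
print(epoch + timedelta(days=30_000))  # 2052-02-20
```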
mgdlbp|2 years ago
http://www.df7cb.de/projects/sdate/
one commit message for the QDBs:
Sep 17 2001 (1000684800) is a special date from git-format-patch. Its significance is lost to time.
ksaj|2 years ago
$ date -d '@1800000000'
Fri Jan 15 03:00:00 AM EST 2027
msavio|2 years ago
hiAndrewQuinn|2 years ago
bloopernova|2 years ago
3np|2 years ago
anyfoo|2 years ago
PrimeMcFly|2 years ago
Why?
SirMaster|2 years ago
Was it being used in 1970 and actually started at 0?
Or did they just pick a date to start it, and if so, what was the initial Unix time when it was first used?
leonidasv|2 years ago
https://www.wired.com/2001/09/unix-tick-tocks-to-a-billion/
neogodless|2 years ago
withinboredom|2 years ago
wolfi1|2 years ago
kevinbowman|2 years ago
gpvos|2 years ago
Ayesh|2 years ago
jefftk|2 years ago
asplake|2 years ago
hallman76|2 years ago
stkdump|2 years ago