Months starting at 0 are a legacy of POSIX's datetime API, as is the year field being a count from 1900. Java continued these mistakes, and then added the everything-defaults-to-local-tz behavior (POSIX had different functions for UTC versus local tz), and JS copied Java's API in this respect without modification.
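A quick illustrative sketch of how these inherited quirks surface in JavaScript's `Date` (values as you'd see them in Node or a browser console):

```javascript
// Months are 0-based, exactly as in POSIX struct tm:
const d = new Date(Date.UTC(1999, 0, 15)); // 0 = January
console.log(d.getUTCMonth());    // 0
console.log(d.getUTCFullYear()); // 1999

// The non-UTC accessors (getMonth, getHours, ...) report the same
// instant in the host's local time zone -- the Java-inherited default.
console.log(d.getHours()); // varies with the host time zone
```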
I think it's basically tradition at this point that languages make a hash of their date/time abstractions, maybe getting them right after a half dozen iterations.
[EDIT] Realised this is the behavior I would expect. Parse is doing what I would expect it to, taking into account that the given time is in UTC. It's then returning an object that's in the local timezone. Still goes to show just how confusing this datetime stuff can be when even the expected behavior looks wrong.
Of course this is avoided by using getFullYear(), but I've always wondered why a language that came out in 1995 had a function that returned two-digit years.
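The two-digit method in question is the legacy `getYear()`, which returns the year minus 1900 (i.e. `struct tm`'s `tm_year` field) and is only "two-digit" by coincidence of the era. A small illustration, assuming an engine that still ships the deprecated method (Node and browsers do, via Annex B):

```javascript
const d = new Date(1999, 0, 1);
console.log(d.getYear());     // 99 -- 1999 minus 1900, straight from struct tm
console.log(d.getFullYear()); // 1999

// Post-Y2K the legacy method is plainly not two digits anymore:
console.log(new Date(2021, 0, 1).getYear()); // 121
```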
jcranmer|4 years ago
Java eventually torched its date/time API not once, but twice: the JDK 1.1 addition of Calendar that deprecated most of the naïve methods on Date, and JDK 8 adding java.time.*, which is roughly the modern model of date/time APIs.
tzs|4 years ago
I think months from 0 in POSIX were intentional, not mistakes. The things you might want to do with an integer month number include:
1. Look up some attribute or property of the month in a table (number of days in the month in a non-leap year, name of the month, list of fixed holidays in the month, etc).
2. Determine if the month is before or after a different month.
3. Determine how many months one month is after or before another month.
4. Determine which month is N months before/after a given month.
5. Print the month number for a human.
In languages that use 0-based indexing:
#1 is more convenient with 0-based months.
#2-4 only need that the month numbers form an arithmetic progression, so are indifferent to where we start counting.
#5 is more convenient with 1-based months.
So it comes down to: is it better to have your low-level date functions require you to remember to subtract 1 whenever you do something from #1 (with 1-based months), or to add 1 whenever you do something from #5 (with 0-based months)?
I'd expect most programs do a lot more #1 than #5, so the code is going to be cleaner if you go with 0-based months.
k1t|4 years ago
This happens in C# too:
Outputs 5 instead of 12 (for me, US Pacific) due to a default conversion to local time. It does have an opt-in flag to "adjust to UTC" though. Not intuitive at all, at least to me...
dpwm|4 years ago
Wait… how is this not a bug according to both Microsoft's own spec[0] and ISO 8601[1]? The Z specifically means this time is in UTC.
The behavior is not at all what I would expect from reading the docs [0]:
> A string that includes time zone information and conforms to ISO 8601. In the following examples, the first string designates Coordinated Universal Time (UTC), and the second designates the time in a time zone that's seven hours earlier than UTC:
> "2008-11-01T19:35:00.0000000Z"
> "2008-11-01T19:35:00.0000000-07:00"
[EDIT] I get it now: it's parsing it right, it's just that it's then putting it into a datetime object that's in the local timezone – which is what I would expect. The alternative would be counterintuitive to me.
[0] https://docs.microsoft.com/en-us/dotnet/api/system.datetime....
[1] https://en.wikipedia.org/wiki/ISO_8601#Coordinated_Universal...
madeofpalk|4 years ago
> - Converting from string datetime to a datetime object will automatically convert the time to the local tz
I think this makes sense in the context of client-side JavaScript, whose sole point is to present UI to the user. The dates aren't really converted to the local TZ (the date always remains the same fixed point in time); rather, any output is rendered local to the timezone.
I think this is more good than bad, given that the intention of JS is to build (or augment) UIs. It's just not great that it's hard to do anything other than that.
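One way to see that the underlying value is a fixed point in time rather than something rewritten per time zone:

```javascript
const d = new Date("2008-11-01T19:35:00Z");

// getTime() is milliseconds since the Unix epoch -- identical in every time zone.
console.log(d.getTime() === Date.UTC(2008, 10, 1, 19, 35, 0)); // true

// Only the rendering differs:
console.log(d.toISOString());    // always UTC
console.log(d.toLocaleString()); // host-local presentation of the same instant
```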