(no title)
krick | 21 days ago
I mean, of course having a string when you mean "email" or "date" is only slightly better than having a pointer when you mean a string. And everyone's instinctive reaction to that should be that it's horrible. In practice, though, not only did I often treat some complex business objects and emails as strings, but (hold onto yourselves!) even dates as strings, and I'm ready to defend that as the correct choice.
Ultimately, it's about how much we are ready to assume about the data. I mean, that's what modelling is: making a set of assumptions about the real world and rejecting everything that doesn't fit our model. Making a neat little model is what every programmer wants. It's the "type-driven design" the OP praises. It's beautiful, and programmers must make beautiful models and write beautiful code, otherwise they are bad programmers.
Except, unfortunately, programming has nothing to do with beauty. It's about making some system that gets some data from here, displays it there, and makes it possible for people and robots to act on that data. A beautiful model is essentially only needed so that we can contain the complexity of that system in something we can understand and keep working on. The model doesn't truly need to be complete.
Moreover, as everyone with 5+ years of experience must know (I imagine), our models are never complete; it always turns out that the assumptions we make are naïve at best. It turns out there was time before 1970, there are leap seconds, there are time zones, there is DST, which can shift clocks by fractions of an hour rather than whole hours, and it doesn't necessarily happen on the same date every year (at least not in terms of the Gregorian calendar; it may be bound to Ramadan, for example). There are so many details about the real world that you, brave young 14- (or 40-) year-old programmer, don't know yet!
So, when you model data "correctly" and turn "2026-02-10 12:00" (or, better yet, "10/02/2026 12:00") into a "correct" DateTime object, you are making a hell of a lot of assumptions, and some of them, I assure you, are wrong. Hopefully it just so happens that they don't matter in your case; that is the only reason such modelling works at all.
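To make that concrete, here's a minimal Python sketch (my own illustration, not from the thread) of how much even a trivial parse assumes: the format string alone decides whether "10/02/2026" means February or October, and the timezone assumption is never recorded at all.

```python
from datetime import datetime

raw = "10/02/2026 12:00"

# The format string IS the assumption: the same string parses to two
# different dates depending on whether we assume day-first or month-first.
day_first = datetime.strptime(raw, "%d/%m/%Y %H:%M")
month_first = datetime.strptime(raw, "%m/%d/%Y %H:%M")

print(day_first.date())    # 2026-02-10
print(month_first.date())  # 2026-10-02

# And the resulting object is naive: the timezone assumption was
# never written down anywhere, it just vanished.
print(day_first.tzinfo)    # None
```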
But what if it does matter? What if it's the datetime on a ticket that a third party provided to you, and you are now passing it on to a customer? And you get sued if it ends up being the wrong date because of some transformation that happened inside your system? Well, it's best if that doesn't happen. Fortunately, no other computation in the system seems to rely on it being a datetime right now, so you can just treat it as a string. Is it UTC? The event city's timezone? The vendor HQ's timezone? I don't know! I don't care! That's what was on the ticket, and it's up to you, dear customer, to get it right.
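For what it's worth, the pass-through argument fits in a couple of lines (again my own Python sketch): even a "lossless" round-trip through a datetime object rewrites the text the vendor printed, while the untouched string cannot drift.

```python
from datetime import datetime

ticket_time = "2026-02-10 12:00"  # exactly what the vendor sent

# Round-tripping through a datetime object silently reformats the value:
round_tripped = datetime.fromisoformat(ticket_time).isoformat()
print(round_tripped)  # 2026-02-10T12:00:00 -- no longer what was on the ticket

# The string, passed through untouched, is guaranteed to match the ticket.
print(ticket_time)
```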
So, ultimately, it's about where you are willing to put the boundary between your model and the scary outer world, and, pragmatically, it's often better NOT to do any "type-driven design" unless you need to.
lelanthran | 20 days ago
I think that's the benefit of strong typing: when you find an assumption is wrong, you fix it in a single place (in this example, the DateTime object).
If your datetime values are stored as strings everywhere in your code:
a) You are going to have a bad day trying to fix a broken assumption in every place storing/using a datetime, and
b) Your wrong assumptions are still baked in, except now you don't have a single place to fix it.
krick | 19 days ago
Also, and this is more specific to this particular example: when we say "DateTime object", we usually mean "your programming language's stdlib DateTime object", or at least "some popular library's DateTime object", not your home-baked DateTime object. And I have yet to see a language where this object makes only correct assumptions about real-life datetimes (even only as far as my own current knowledge of datetimes goes, which almost certainly still isn't complete!). And you'd think datetimes are trivial compared to the rest of the objects in our systems.

I mean, seriously, it's annoying, but I have to make working software somehow, despite the backbone of all our software being just shit, and not relying on that shit more than I need to is a good rule to follow. Sure, I can totally use whatever broken DateTime objects when correctness is not that important (they still work for something like 99% of use cases), but when correctness is important, I'd rather rely on a string (maybe wrapped as NewType('SpecialDate', str)) that I know won't modify itself than on a stdlib DateTime object.
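The NewType idea mentioned above amounts to roughly this (a sketch; the function name is my own): the type checker treats the value as distinct from a bare str, while at runtime it stays a plain string that nothing will "helpfully" reinterpret.

```python
from typing import NewType

SpecialDate = NewType("SpecialDate", str)

def render_ticket(when: SpecialDate) -> str:
    # The value passes through byte-for-byte: no parsing, no
    # normalisation, no timezone guessing.
    return f"Departs: {when}"

ticket = SpecialDate("2026-02-10 12:00")
print(render_ticket(ticket))          # Departs: 2026-02-10 12:00
# render_ticket("10/02/2026 12:00")   # a bare str -- mypy would flag this
```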