top | item 46963858

d0liver | 19 days ago

I think, more generally, "push effects to the edges" which includes validation effects like reporting errors or crashing the program. If you, hypothetically, kept all of your runtime data in a big blob, but validated its structure right when you created it, then you could pass around that blob as an opaque representation. You could then later deserialize that blob and use it and everything would still be fine -- you'd just be carrying around the validation as a precondition rather than explicitly creating another representation for it. You could even use phantom types to carry around some of the semantics of your preconditions.

Point being: I think the rule is slightly more general, although this explanation is probably more intuitive.

jmull | 19 days ago

Systems tend to change over time (and distributed nodes of a system don’t cut over all at once). So what was valid when you serialized it may not be valid when you deserialize it later.

d0liver | 19 days ago

This issue exists with the parsed case, too. If you're using a database to store data, then the lifecycle of that data is in question as soon as it's used outside of a transaction.

We know that external systems provide certain guarantees, and we rely on them and reason about them, but we unfortunately cannot shove all of our reasoning into the type system.

Indeed, under the hood, everything _is_ just a big blob that gets passed around and referenced, and the compiler is also just a system that enforces preconditions about that data.