top | item 22228595

pushpop | 6 years ago

I’d go further than you and say they should be removed by default on all fields.

Want to know if a Boolean field is unset? Well it’s no longer Boolean because you now have 3 states for that field. So why not use a char, U/Y/N with the default being U?
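The two encodings side by side, as a minimal sketch using Python's stdlib sqlite3 driver (table and column names are made up for illustration):

```python
import sqlite3

# One nullable boolean column vs. the proposed CHAR(1) column with an
# explicit "unset" value and a non-NULL default.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE users (
        verified_bool BOOLEAN,                      -- NULL means "unset"
        verified_char CHAR(1) NOT NULL DEFAULT 'U'
            CHECK (verified_char IN ('U', 'Y', 'N'))
    )
""")
conn.execute("INSERT INTO users DEFAULT VALUES")

row = conn.execute("SELECT verified_bool, verified_char FROM users").fetchone()
print(row)  # (None, 'U'): the char column carries "unset" as an ordinary value
```

With the CHECK constraint, the char column admits exactly the three intended states, and "unset" is a value you can compare with plain `=` like anything else.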

NULL might have made more sense 30+ years ago when systems were more resource constrained but it doesn’t make sense now for most of the problems people are trying to solve day to day. If anything, it creates more problems.

Just to be clear, I’m not saying they should be removed entirely but rather that they shouldn’t have to be explicitly disabled on every CREATE.

I will say one use case for NULL that is hugely helpful is outer joins and nested queries. However these don’t generate high performance queries so if you’re having to rely on them then you might need to rethink your database schema anyway.
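For what it's worth, that outer-join case is exactly where NULLs appear whether you declared them or not — a toy example in SQLite (hypothetical table names):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users  (id INTEGER, name TEXT);
    CREATE TABLE orders (user_id INTEGER, total REAL);
    INSERT INTO users  VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO orders VALUES (1, 9.99);
""")

# bob has no orders, so the outer join fills his side with NULL.
rows = conn.execute("""
    SELECT u.name, o.total
    FROM users u LEFT JOIN orders o ON o.user_id = u.id
    ORDER BY u.id
""").fetchall()
print(rows)  # [('alice', 9.99), ('bob', None)] -- NULL marks the missing match
```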

So essentially I don’t disagree with you, I just think you’re being too nice limiting your complaint to string fields.

discuss

order

williamdclt | 6 years ago

> Well it’s no longer Boolean because you now have 3 states for that field. So why not use a char, U/Y/N with the default being U?

Well, because instead of using a type that exactly encodes the concept of "yes/no/unset" (nullable boolean), you'd be using a type that encodes "any 1-character text, with arbitrary meaning and most of the values being nonsensical".

pushpop | 6 years ago

The problem is you need a boat load of additional code to cover unset. Not just in SQL (the comparison syntax for NULL is different from that of every other type) but often also in the host language too (e.g. some languages cast NULL to a nil value that can raise exceptions or even crash your application if not handled correctly).
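A quick demonstration of both traps, using Python's stdlib sqlite3 driver (the SQL behavior is the same in most databases):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# In SQL, ordinary comparison never matches NULL; it needs its own syntax.
print(conn.execute("SELECT NULL = NULL").fetchone()[0])   # None, i.e. unknown
print(conn.execute("SELECT NULL IS NULL").fetchone()[0])  # 1, i.e. true

# A char column needs no special case: 'U' compares like any other value.
print(conn.execute("SELECT 'U' = 'U'").fetchone()[0])     # 1

# On the application side, SQL NULL arrives as Python's None, which breaks
# code that assumed a real string or boolean was coming back.
flag = conn.execute("SELECT NULL").fetchone()[0]
print(flag is None)  # True; calling flag.upper() here would raise AttributeError
```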

Capturing those edge cases is non-trivial compared to checking the value of a char.

In an ideal world your unit tests and CI pipelines would catch all of those, but that depends on well-written tests. So in my experience, a design with fewer hidden traps from the outset is automatically better than one that perfectly fits an academic theory but is harder to support in practice.

setr | 6 years ago

It'd probably be more sane than trying to stuff a 3VL value into a bunch of 2VL operations because you refuse to acknowledge that you don't actually have a 2VL type.
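The mismatch is easy to see in SQL's own AND/OR, which follow three-valued logic — sketched here with Python's stdlib sqlite3:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# SQL AND/OR are three-valued: NULL means "unknown", not false.
print(conn.execute("SELECT NULL AND 0").fetchone()[0])  # 0: false absorbs unknown
print(conn.execute("SELECT NULL AND 1").fetchone()[0])  # None: still unknown
print(conn.execute("SELECT NULL OR 1").fetchone()[0])   # 1: true absorbs unknown
print(conn.execute("SELECT NOT NULL").fetchone()[0])    # None: NOT unknown is unknown
```

Two-valued host-language code then has to pick a side every time one of these results crosses the boundary, which is exactly where the hidden traps live.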