top | item 43971156

thasso | 9 months ago

I agree that zero-initializing doesn't really help avoid incorrect values (which is what the author focuses on) but at least you don't have UB. This is the main selling point IMO.
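A minimal C sketch of that distinction (the function name is mine, for illustration): with `= {0}` the reads below are well-defined, whereas reading the same array without any initializer would be undefined behavior.

```c
/* Hypothetical example: zero-initialization may still be the
 * "wrong" value for the program's logic, but it is a defined
 * value, so there is no UB. */
int sum_flags_zeroed(void) {
    int flags[4] = {0};   /* every element is a defined 0 */
    int sum = 0;
    for (int i = 0; i < 4; i++)
        sum += flags[i];  /* well-defined reads */
    return sum;
    /* With `int flags[4];` and no writes, the loop above would
     * read indeterminate values: undefined behavior in C. */
}
```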

yusina | 9 months ago

Then why not just require explicit initialization? If "performance" is your answer, the compiler could instead be taught to detect explicit zero-initialization and skip the writes whenever the allocator already guarantees zeroed memory. A much safer alternative. Replacing one implicit behavior with another is hardly a huge success...

layer8 | 9 months ago

Operating systems usually initialize new memory pages to zero by default, for security reasons, so that a process can’t read another process’s old data. So this gives you zero-initialization “for free” in many cases. Even when the in-process allocator has to zero out a memory block upon allocation, this is generally more efficient than the corresponding custom data-type-specific default initialization.
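A small sketch of the "for free" case (the helper name is mine): calloc guarantees a zeroed block, and for large allocations it can often hand back freshly mapped OS pages that are already zero, so no explicit memset is needed.

```c
#include <stdlib.h>

/* Hypothetical helper: allocate n ints, relying on calloc's
 * guarantee that the returned block reads as all zeros. */
int *alloc_zeroed_ints(size_t n) {
    return calloc(n, sizeof(int));
}
```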

If you have a sparse array of values (it might be structs), then you can use a zero value to mark an entry that isn’t currently in use, without the overhead of having to (re-)initialize the whole array up-front. This is especially relevant when only one byte per array element would need to be set as the marker, yet the compiler would force you to initialize the complete array elements.
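The sparse-array idea could look like this (struct and names are hypothetical): a zeroed `used` byte marks a free slot, so a single calloc makes every slot "empty" with no per-entry initialization loop.

```c
#include <stdlib.h>

/* Hypothetical sparse table entry: used == 0 means the slot is
 * free; key and value are only meaningful when used != 0. */
struct entry {
    unsigned char used;   /* 0 = free, nonzero = occupied */
    int key;
    int value;
};

/* One zeroing allocation marks all n slots as free at once. */
struct entry *make_table(size_t n) {
    return calloc(n, sizeof(struct entry));
}
```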

Similarly, there are often cases where a significant part of a struct typically remains set to its default values. If those are zero, which is commonly the case (or commonly can be made the case), then you can save a significant amount of extra write operations.
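For instance (hypothetical struct of my own), C's designated initializers lean on exactly this: you write only the non-default fields, and the language guarantees the remaining members are zero-filled.

```c
/* Hypothetical config struct whose defaults are all zero. */
struct config {
    int verbose;          /* default 0 = quiet */
    int max_retries;      /* default 0 = no retries */
    const char *log_path; /* default NULL = no log file */
};

struct config make_config(void) {
    /* Only one field is written explicitly; C zero-fills
     * the rest of the initialized object. */
    struct config c = { .max_retries = 3 };
    return c;
}
```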

Furthermore, it also allows flexibility with algorithms that lazy-initialize the memory. An algorithm may be guaranteed to always end up initializing all of its memory, but the compiler would have no chance to determine this statically. So you’d have to perform a dummy initialization up-front just to silence the compiler.
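A sketch of that situation (function name is mine): the two passes together write every element before anything reads the array, but a compiler enforcing definite initialization generally cannot prove this statically, so it would demand a redundant dummy fill first.

```c
#include <stddef.h>

/* Hypothetical lazy-init pattern: even indices are written in the
 * first pass, odd indices in the second, so all of a[0..n-1] is
 * initialized -- but not in a way a compiler can easily verify. */
void fill_evens_then_odds(int *a, size_t n) {
    for (size_t i = 0; i < n; i += 2) a[i] = 1;  /* even indices */
    for (size_t i = 1; i < n; i += 2) a[i] = 2;  /* odd indices */
}
```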

90s_dev | 9 months ago

I'd guess it was because 0 init is desired often enough that this is a convenient implicit default?

bobbylarrybobby | 9 months ago

If you zero initialize a pointer and then dereference it as if it were properly initialized, isn't that UB?

jerf | 9 months ago

It is undefined behavior in C. In many languages it is defined behavior; for instance in Go, dereferencing a nil pointer explicitly panics, which is a well-defined operation. It may, of course, crash your program, and the whole topic of "should pointers even be able to be nil?" is a valid separate question, but given that they exist, the operation of dereferencing a nil pointer is not undefined behavior in Go.

To many people reading this it may be a "duh", but I find it worth pointing out, because there are still some programmers who believe that C is somehow the "default" or "real" language of a computer and that everything about C is true of other languages. That is not the case. Undefined behavior in C is undefined in C, specifically. Try to avoid carrying ideas about UB out of C and, to the extent that they are related (which slowly but surely decreases over time), C++. It's the language, not the hardware, that defines UB.