
atiedebee | 13 days ago

ISO C99 actually defines multiple types of deviating behaviour. What you're describing is closer to implementation-defined behaviour than anything else.

The three behaviours relevant in this discussion, from section 3.4:

  3.4.1 implementation-defined behavior
  unspecified behavior where each implementation documents how the choice is made
  EXAMPLE An example of implementation-defined behavior is the propagation of the high-order bit when a signed integer is shifted right.

  3.4.3 undefined behavior
  behavior, upon use of a nonportable or erroneous program construct or of erroneous data, for which this International Standard imposes no requirements
  Possible undefined behavior ranges from ignoring the situation completely with unpredictable results, to behaving during translation or program execution in a documented manner characteristic of the environment (with or without the issuance of a diagnostic message), to terminating a translation or execution (with the issuance of a diagnostic message).
  An example of undefined behavior is the behavior on integer overflow.

  3.4.4 unspecified behavior
  behavior where this International Standard provides two or more possibilities and imposes no further requirements on which is chosen in any instance
  An example of unspecified behavior is the order in which the arguments to a function are evaluated.

K&R also mentions "undefined" and "implementation-defined" behaviour on several occasions. It doesn't define what is meant by undefined behaviour, but the intent does indeed seem to be "whatever happens, happens" rather than "the compiler can do whatever it wants." ISO C99 is a lot looser with its definition.

Exploiting integer overflow, as in your example, for optimization was shown to be beneficial by Chandler Carruth in a talk he gave at CppCon 2016.[1] I think it would be best to have something similar to Zig's wrapping and saturating addition operators instead, but for that I think it is better to just use Zig (which I personally am very willing to do once they reach 1.0 and other compiler implementations are available).[2]
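For what it's worth, wrapping and saturating addition in the spirit of Zig's `+%` and `+|` can already be sketched in C. This sketch assumes a two's-complement target and uses the GCC/Clang `__builtin_add_overflow` extension, so it isn't portable standard C:

```c
#include <limits.h>

/* Wrapping add: unsigned arithmetic wraps by definition, so convert,
 * add, and convert back. The final conversion is implementation-defined
 * before C23, but on two's-complement targets it does what you expect. */
static int wrapping_add(int a, int b) {
    return (int)((unsigned)a + (unsigned)b);
}

/* Saturating add: clamp to INT_MAX / INT_MIN instead of overflowing.
 * __builtin_add_overflow is a GCC/Clang extension, not ISO C. */
static int saturating_add(int a, int b) {
    int r;
    if (__builtin_add_overflow(a, b, &r))
        return a > 0 ? INT_MAX : INT_MIN;
    return r;
}
```

With these, `wrapping_add(INT_MAX, 1)` gives INT_MIN and `saturating_add(INT_MAX, 1)` gives INT_MAX, with no UB on the way there.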

[1] https://youtu.be/yG1OZ69H_-o?si=x-9ALB8JGn5Qdjx_&t=2357

[2] https://ziglang.org/documentation/0.15.2/#Operators


nananana9 | 13 days ago

[1] is probably the best counterpoint I've seen, but there are other ways to enable this optimization - the most obvious being to use a register-sized index, which is what's passed to the function anyway. I'd be fine with an intrinsic for this as well (I don't think you'd use it often enough to justify dedicated `+%=` syntax).
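A minimal sketch of the register-sized-index idea (function and names are mine, not from the talk):

```c
#include <stddef.h>

/* With a 32-bit int index on a 64-bit target, the compiler can only
 * keep the index in a 64-bit register across iterations if it may
 * assume signed overflow never happens; otherwise it has to re-extend
 * it on every access. Using the register-sized size_t sidesteps the
 * question: no sign extension, and no reliance on overflow being UB. */
long sum(const long *a, size_t n) {
    long s = 0;
    for (size_t i = 0; i < n; i++)
        s += a[i];
    return s;
}
```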

It's also worth noting that even with the current very liberal handling of UB, the actual code sample in [1] was still missing this optimization; so liberal UB handling doesn't automatically lead to faster code, understanding the compiler was still needed.

The question is one of risk: if the compiler is conservative, you're risking slightly less optimized code. If the compiler is very liberal and assumes UB never happens, you're risking that it will wipe out your overflow check like in my godbolt example (I've seen actual CVEs due to that, although I don't remember the project).
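A sketch of the kind of check that a UB-assuming optimizer is entitled to delete, next to a rewrite that avoids the UB (function names are hypothetical):

```c
#include <limits.h>

/* Classic broken guard: since signed overflow is UB, the compiler may
 * assume x + 1 cannot wrap, fold `x + 1 < x` to false, and compile the
 * whole function to `return 0;` at -O2 on GCC/Clang. */
int will_overflow(int x) {
    return x + 1 < x;
}

/* Safe guard: compare against the limit before doing the addition, so
 * no overflow ever occurs and the check can't be optimized away. */
int will_overflow_safe(int x) {
    return x == INT_MAX;
}
```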