
Undefined behavior in C is a reading error (2021)

25 points | nequo | 1 year ago | yodaiken.com

67 comments


layer8|1 year ago

TFA is misunderstanding. As he cites from the C standard, “Undefined behavior gives the implementor license not to catch certain program errors that are difficult to diagnose.” Since it’s difficult (and even, in the general case of runtime conditions, impossible at compile time) to diagnose, the implementor (compiler writer) has two choices: (a) assume that the undefined behavior doesn’t occur, and implement optimizations under that assumption, or (b) nevertheless implement a defined behavior for it, which in many cases amounts to a pessimization. Given that competition between compilers is driven by benchmarks, guess which option compiler writers are choosing.
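
To make option (a) concrete, here is a minimal sketch of my own (not something the standard spells out): because signed overflow is undefined, an optimizing compiler may fold this whole comparison to a constant. GCC and Clang both do this folding at -O2, for example; under option (b) the compiler would instead keep the add and compare and give wraparound results.

    int always_true(int x)
    {
        /* x + 1 overflowing would be UB, so under option (a) the    */
        /* compiler may assume it never happens and return 1.        */
        return x + 1 > x;
    }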

The discussions in comp.lang.c (a Usenet newsgroup, not a mailing list) were educating C programmers that they can’t rely on (b) in portable C, and moreover, can’t make any assumptions about undefined behavior in portable C, because the C specification (the standard) explicitly refrains from imposing any requirements whatsoever on the C implementation in that case.

The additional thing to understand is that compiler writers are not malevolently detecting undefined behavior and then inserting optimizations for that case. Rather, applying optimizations is a process of logical deduction within the compiler, and because nothing in that deduction accounts for what should happen under undefined behavior, surprising consequences follow if undefined behavior actually occurs. This is also the reason why undefined behavior can affect code executing prior to the occurrence of the undefined condition, because logical deduction as performed by the compiler is not restricted to the forward direction of control flow (and also because compilers reorder code as a consequence of their analysis).
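
As a small illustration of deduction working "backwards" (my own sketch; how aggressively real compilers exploit this particular pattern varies):

    int set(int *p, int v)
    {
        int was_null = (p == NULL);  /* may be folded to 0 ...            */
        *p = v;                      /* ... because this unconditional    */
                                     /* store would be UB if p were NULL  */
        return was_null;
    }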

nothrabannosir|1 year ago

> This is also the reason why undefined behavior can affect code executing prior to the occurrence of the undefined condition, because logical deduction as performed by the compiler is not restricted to the forward direction of control flow (and also because compilers reorder code as a consequence of their analysis).

According to Martin Uecker, of the C standards committee, that is not true:

> In C, undefined behavior can not time travel. This was never supported by the wording and we clarified this in C23.

https://news.ycombinator.com/item?id=40790203

SAI_Peregrinus|1 year ago

> implementor (compiler writer) has two choices: (a) assume that the undefined behavior doesn’t occur, and implement optimizations under that assumption, or (b) nevertheless implement a defined behavior for it, which in many cases amounts to a pessimization.

No, they have two choices: (a) assume that the undefined behavior doesn't occur, and implement any output code generation whatsoever under that assumption, or (b) define a behavior for it, and implement output code generation based on that definition, which in many cases amounts to a pessimization.

Optimization isn't relevant. Assuming it can't happen and then continuing to generate code as though it can't happen is all that matters. You can't make any assumptions, including that disabling optimization will change the output code.

torstenvl|1 year ago

> the implementor (compiler writer) has two choices: (a) assume that the undefined behavior doesn’t occur, and implement optimizations under that assumption, or (b) nevertheless implement a defined behavior for it, which in many cases amounts to a pessimization.

No. The implementor has three choices: (1) Ignore the situation altogether; (2) behave according to documentation (with or without a warning); or (3) issue an error and stop compilation.

Consider

    for (int i=0; i>=0; i++);
(1) Doesn't attempt to detect UB; it just ignores it and generates the straightforward translation:

          mov 0, %l0
    loop: cmp %l0, 0
          bge loop
            add %l0, 1, %l0 ! Delay slot
          ! Continue with rest of program
(2) May detect that this would result in integer overflow and do something it documents (like emitting a trap instruction, running the whole loop, or eliding the whole loop).

(3) Detects that this would result in integer overflow and stops compilation with an error message.

An expressio unius interpretation—or simply following the well-worn principle of construing ambiguity against the drafter—would not permit crazy things with UB that many current compilers do.

Ericson2314|1 year ago

I really don't like reading these dimwitted screeds. We did not get here because layering over a standard or a document --- this is not the US supreme court or similar. We got here because

- There are legit issues trying to define everything without losing portability. This affects C and anything like it.

- Compiler writers do want to write optimizations regardless of whether the language is C or anything else --- witness that GCC / LLVM use the same optimizations regardless of the input language / compiler frontend

- Almost nobody in this space, neither the cranky programmers against nor the normie compiler writers for, has a good grasp of modern logic and proof theory, which is needed to make this stuff precise.

Ericson2314|1 year ago

*lawyering over a standard or document

kragen|1 year ago

> I really don't like reading these dimwitted screeds.

this 'dimwitted screed' is by the primary author of rtlinux, which was to my knowledge the first instance of running linux under a hypervisor, and the leader of the small team that ported linux to the powerpc in the 90s. he has also written a highly cited paper on priority inheritance. if you disagree with him, it is probably for some reason other than his dimwittedness

i can't specifically testify to his knowledge of modern proof theory, but his dissertation was on 'a modal arithmetic for reasoning about multilevel systems of finite state machines', and his recent preprints include 'standard automata theory and process algebra' https://arxiv.org/abs/2205.03515 (no citations), 'understanding paxos and other distributed consensus algorithms' https://arxiv.org/abs/2202.06348 (one citation), and 'the meaning of concurrent programs' https://arxiv.org/abs/0810.1316 (draft, no citations), so i wouldn't bet too much against it

i'm interested to hear what you've written on modern logic and proof theory to understand your perspective better

nickelpro|1 year ago

If you want slow code that behaves exactly the way you expect, turn off optimizations. Congrats, you have the loose assembly wrapper language you always wanted C to be.

For the rest of us, we're going to keep getting every last drop of performance that we can wring out of the compiler. I do not want my compiler to produce an "obvious" or "reasonable" interpretation of my code, I want it to produce the fastest possible "as if" behavior for what I described within the bounds of the standard.

If I went outside the bounds of the standard, that's my problem, not the compiler's.

krackers|1 year ago

Just like -ffast-math, that should be an opt-in flag. I'd bet most people want (and even expect) the compiler to do the sane thing, especially if it only costs them 2% performance. The quest for mythical "performance" over correctness is precisely why we've landed in this situation in the first place.
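
For what it's worth, some of those opt-in switches already exist in GCC and Clang (flag names from those two toolchains; other compilers differ):

    cc -O2 -fwrapv file.c                          # signed overflow wraps instead of being UB
    cc -O2 -fno-strict-aliasing file.c             # no type-based aliasing assumptions
    cc -O2 -fno-delete-null-pointer-checks file.c  # keep "redundant" null checks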

Filligree|1 year ago

For the rest of us, actually we will be using different languages that don't pretend UB is what we want. I'd be using C a lot more if I could possibly trust myself to write correct C.

brudgers|1 year ago

A consensus standard happens by multiple stakeholders sitting down and agreeing on what everyone will do the same way. And agreeing on what they won't all do the same way. The things they agree to do differently don't become part of the standard.

With compilers, different companies usually do things differently. That was the case with C89. The things they talked about but could not or would not agree to do the same way are listed as undefined behaviors. The things everyone agreed to do the same way are the standard.

The consensus process reflects stakeholder interests. Stakeholders can afford to rewrite some parts of their compilers to comply with the standards and cannot afford to rewrite other parts to comply with the standards because their customers rely on the existing implementation and/or because of core design decisions.

kragen|1 year ago

the main stakeholders are c programmers and the users of their programs, not c compiler vendors. the stake held by c compiler vendors is quite small by comparison. however, the c standards committee consists entirely of c compiler vendors, as you implicitly acknowledge by referring to 'their compilers' and 'their customers'. this largely happens through the same process through which drug regulations are written by drug companies and energy regulations are written by oil companies: the c compiler vendors have much deeper knowledge of the subject matter; the standard is put into practice by what the c compiler vendors choose to do; and, although the c compiler vendors' interests in the c standard are vastly less significant than those of c programmers and users of c programs, they are also vastly more concentrated

consequently, the consensus process systematically and reproducibly fails to reflect stakeholder interests

bitwize|1 year ago

Specifically, undefined behavior is when the compiler vendors couldn't agree whether a particular bit of code should legitimately compile to something or be considered erroneous. Ex.: null pointer access. Clearly an error in user-space programs running on a sophisticated operating system, but in kernel or embedded code sometimes you do want to read or write to memory location 0. So the standards committee just shrugged and said "it's undefined". Could be an error, could not be. It depends on your compiler, OS, and environment. Check your local docs for details.
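
A hypothetical embedded-style sketch of why that matters (target details assumed, e.g. a Cortex-M part whose vector table sits at address 0):

    unsigned long read_initial_sp(void)
    {
        /* ISO C calls a read through a null pointer undefined, but on  */
        /* this hardware address 0 really does hold the initial stack   */
        /* pointer, so embedded code wants this to "just work".         */
        volatile unsigned long *vector_table = (volatile unsigned long *)0;
        return vector_table[0];
    }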

dgfitz|1 year ago

> Stakeholders can afford to rewrite some parts of their compilers to comply with the standards and cannot afford to rewrite other parts to comply with the standards because their customers rely on the existing implementation and/or because of core design decisions.

I was nodding along until here. Wouldn’t one, given the option, always choose, if possible, a compiler that doesn’t differ from the standard? And if that isn’t an option, wouldn’t it be up to said stakeholders to own the inconsistency?

Tough problem to solve for sure.

saghm|1 year ago

The crux of this argument seems to be that the author interprets the "range of permissible behavior" they cite from the standard's treatment of undefined behavior as not allowing the sort of optimizations that potentially render anything else in the program moot. A large part of the argument depends on arguing that the earlier section defining the term undefined behavior has an "obvious" interpretation that's been ignored in favor of a differing one. I don't think their interpretation of the definition of undefined behavior is necessarily the strongest argument against the case they're making, though; to me, the second section they quote is if anything even more nebulous.

To be overly pedantic (which seems to be the point of this exercise), the section cites a "range" of permissible behavior, not an exhaustive list; it doesn't sound to me like it requires that only those three behaviors are allowed. The potential first behavior it includes is "ignoring the situation completely with unpredictable results", followed by "behaving during translation or program execution in a documented manner characteristic of the environment". I'd argue that the behavior this article complains about is somewhere between "willfully ignoring the situation completely with unpredictable results" and "recognizing the situation with unpredictable results", and it's hard for me to read this as being obviously outside the range of permissible behavior. Otherwise, it would essentially mean that the exact behavior the author complains about is still totally allowed by the standard, but only if it's due to the compiler author being ignorant rather than willful. I think it would be a lot weirder if the intent of the standard was that deviant behavior due to bugs is somehow totally okay but purposely writing the same buggy code is a violation.

torstenvl|1 year ago

Expressio unius est exclusio alterius.

mst|1 year ago

The practical reality appears to be that compilers use the loose interpretation of UB and that every compiler that works hard to optimise things as much as possible takes advantage of that as much as it can.

I am very much sympathetic to the people who really wish that wasn't the case, and I appreciate the logic of arguments like this one that in theory it shouldn't be the case, but in practice, it is the case, and has been for some years now.

So it goes.

ajross|1 year ago

I think the problem is sort of a permutation of this argument: way way too much attention is being paid to warning about the dangers and inadequacies of the standard's UB corners, and basically none to a good faith effort to clean up the problem.

I mean, it wouldn't be that hard in a technical sense to bless a C dialect that did things like guarantee 8-bit bytes, signed char, NULL with a value of numerical zero, etc... The overwhelming majority of these areas are just spots where hardware historically varied (plus a few things that were simple mistakes), and modern hardware doesn't have that kind of diversity.
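
As a rough sketch of my own (not an actual proposal), several of those choices can already be pinned down with compile-time checks today, which hints at how little hardware diversity is actually left:

    #include <assert.h>   /* static_assert (C11) */
    #include <limits.h>

    static_assert(CHAR_BIT == 8, "8-bit bytes");
    static_assert((char)-1 < 0, "plain char is signed");
    static_assert((-1 & 3) == 3, "two's complement integers");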

Instead, we're writing, running and trying to understand tools like UBSan, which is IMHO a much, much harder problem.

tialaramex|1 year ago

The exercise you suggest is futile. You've assumed that all these C programmers are writing software with a clear meaning and we just need to properly translate it so that the meaning is delivered.

There were C programmers like that, most of them now write Rust. They write what they meant, in Rust it just does what they wrote, they're happy.

But a large number - by now a majority of the die-hard C programmers - don't want that. They want to write nonsense and have it magically work. They don't need a new C dialect or a better compiler, or anything like that, they need fairy tale magic.

nlewycky|1 year ago

I'm a huge proponent of UBSan and ASan. Genuine curiosity, what don't you like about them?

FWIW, there once was a real good-faith effort to clean up the problems, Friendly C by Prof Regehr, https://blog.regehr.org/archives/1180 and https://blog.regehr.org/archives/1287 .

It turns out it's really hard. Let's take an easy-to-understand example, signed integer overflow. C has unsigned types with guaranteed wraparound (modular arithmetic) rules, and signed types with UB on overflow, which leaves the compiler free to rewrite the expression using the field axioms, if it wants to. "a = b * c / c;" may emit the multiply and divide, or it can eliminate the pair and replace the expression with "a = b;".
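
To make that asymmetry concrete (my own sketch; actual codegen varies by compiler and flags):

    /* Signed: overflow is UB, so the compiler may fold b * c / c to b.  */
    int fold_signed(int b, int c) { return b * c / c; }

    /* Unsigned: wraparound is defined, so the fold would change results */
    /* (e.g. b = 3, c = UINT_MAX yields 0, not 3) and is not allowed.    */
    unsigned fold_unsigned(unsigned b, unsigned c) { return b * c / c; }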

Why do we connect interpreting the top bit as a sign bit with whether field axiom based rewriting should be allowed? It would make sense to have a language which splits those two choices apart, but if you do that, either the result isn't backwards compatible with C anyways or it is but doesn't add any safety to old C code even as it permits you to write new safe C code.

Sometimes the best way to rewrite an expression is not what you'd consider "simplified form" from school because of the availability of CPU instructions that don't match simple operations, and also because of register pressure limiting the number of temporaries. There's real world code out there that has UB in simple integer expressions and relies on it being run in the correct environment, either x86-64 CPU or ARM CPU. If you define one specific interpretation for the same expression, you are guaranteed to break somebody's real world "working" code.

I claim without evidence that trying to fix up C's underlying issues is all decisions like this. That leads to UBSan as the next best idea, or at least, something we can do right now. If nothing else it has pedagogical value in teaching what the existing rules are.

SAI_Peregrinus|1 year ago

The issue is that this would not be backwards-compatible with all existing code. People might have to actually fix their programs to work reliably on the hardware they're using. That's almost always considered unacceptable. Also there are still lots of projects using C89, where `gets()` still exists. It got removed in 2011, but if you compile with -std=c89 or -std=c99 it still works, 36 years after the Morris worm should have taught everyone better!
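
A quick sketch of that (assuming a typical glibc toolchain): this still compiles, with at most a warning, under `cc -std=c89`:

    #include <stdio.h>

    int main(void)
    {
        char buf[16];
        gets(buf);   /* no bounds check; removed in C11, still available in C89/C99 modes */
        return 0;
    }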

The C standard developers did guarantee 8-bit bytes for C23, so maybe in 50 years that'll be the default C version.

Animats|1 year ago

Well, where have we had trouble in C in the past? Usually, with de-referencing null pointers. The classic is

   char* p = 0;
   char c = *p;
   if (p) {
      ...
   }
Some compilers will observe that de-referencing p implies that p is non-null. Therefore, the test for (p) is unnecessary and can be optimized out. The if-clause is then executed unconditionally, leading to trouble.

The program is wrong. On some hardware, you can't de-reference address 0 and the program will abort at "*p". But many machines (e.g. x86) let you de-reference 0 without a trap. This one has caught the Linux kernel devs at least once.

From a compiler point of view, inferring that some pointers are valid is useful as an optimization. C lacks a notation for non-null pointers. In theory, C++ references should never be null, but there are some people who think they're cool and force a null into a reference.

Rust, of course, has

    Option<&Foo>
with unambiguous semantics. This is often implemented with a zero pointer indicating None, but the user doesn't see that.

So, what else? Use after free? In C++, the compiler knows that "delete" should make the memory go away. But that doesn't kill the variable in that scope. It's still possible to reference a gone object. This is common in some old C code, where something is accessed after "free". This is Common Weakness Enumeration entry 416, Use After Free.[1]
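
A minimal use-after-free sketch of my own, for reference:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        int *p = malloc(sizeof *p);
        if (!p) return 1;
        *p = 42;
        free(p);
        printf("%d\n", *p);   /* still compiles, may even appear to work, */
        return 0;             /* but this read after free() is UB         */
    }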

Not a problem in Rust, or any GC language.

Over-optimization in benchmarks can be amusing.

   for (i=0; i<100000000; i++) {}
will be removed by many compilers today. If the loop body is identical every time, it might only be done once. This is usually not a cause of bad program behavior. The program isn't wrong, just pointless.

What else is a legit problem?

[1] https://cwe.mitre.org/data/definitions/416.html

mianos|1 year ago

I can't see what is 'undefined' here. I would expect the program to read the first byte of memory and test if it is 0 or not. If I was writing this in assembly for an MCU, I would write exactly the same code in the target instructions.

There may be many environments where this would be invalid, but why would the compiler optimise this out based on, say, the operating system, if it is valid code?

tialaramex|1 year ago

> This is often implemented with a zero pointer indicating None, but the user doesn't see that.

The Guaranteed Niche Optimisation is, as its name suggests, guaranteed by the Rust language. That is, Option<&T> is guaranteed to be the same size as &T. The choice for the niche to be the all-zero bit representation is in some sense arbitrary but I believe it is a written promise too.

jcranmer|1 year ago

So a while back, I did some spelunking into the history of C99 to actually try to put the one-word-change-theory to bed, but I've never gotten around to writing anything that's public on the internet yet. I guess it's time for me to rectify it.

Tracking down the history of the changes at that time is a bit difficult, because there are clearly multiple drafts that didn't make it into the WG14 document log (these were the days when the document log was literal physical copies being mailed to people), and the drafts in question are also of a status that makes them not publicly available. Nevertheless, by reading N827 (the editor's report for one of the drafts), we do find this quote about the changes made:

> Definitions are only allowed to contain actual definitions in the normative text; anything else must be a note or an example. Things that were obviously requirements have been moved elsewhere (generally to Conformance, see above), the examples that used to be at the end of the clause have been distributed to the appropriate definitions, anything else has been made into a note. (Some of the notes appear to be requirements, but I haven't figured out a good place to put them yet.)

In other words, the change seems to have been made purely editorially. The original wording was not intended to be read as imposing requirements, and the change therefore made it a note instead of moving it to conformance. This is probably why "permissible" became "possible": the former is an awkward word choice for non-normative text.

Second, the committee had, before this change, discussed the distinctions between implementation-defined, unspecified, and undefined behavior in a way that makes it clear that the anything-goes interpretation is intentional. Specifically, in N732, one committee member introduces unspecified behavior as consisting of four properties: 1) multiple possible behaviors; 2) the choice need not be consistent; 3) the choice need not be documented; and 4) the choice needs to not have long-range impacts. Drop the third option, and you get implementation-defined behavior; drop the fourth option, and you get undefined behavior[1]. This results in a change to the definitions of unspecified and implementation-defined behavior, while undefined behavior retains the same definition. Notice how, given a chance to very explicitly repudiate the notion that undefined behavior has spooky action at a distance, the committee declined to, and it declined to before the supposed critical change in the standard.

Finally, the C committee even by C99 was explicitly endorsing optimizations permitted only by undefined behavior. In N802, a C rationale draft for C99 (that again predates the supposed critical change, which was part of the new draft in N828), there is this quote:

> The bitwise logical operators can be arbitrarily regrouped [converting `(a op b) op c` to `a op (b op c)`], since any regrouping gives the same result as if the expression had not been regrouped. This is also true of integer addition and multiplication in implementations with twos-complement arithmetic and silent wraparound on overflow. Indeed, in any implementation, regroupings which do not introduce overflows behave as if no regrouping had occurred. (Results may also differ in such an implementation if the expression as written results in overflows: in such a case the behavior is undefined, so any regrouping couldn’t be any worse.)

This is the C committee, in 1998, endorsing an optimization relying on the undefined nature of signed integer overflow. If the C committee is doing that way back then, then there is really no grounds one can stand on to claim that it was somehow an unintended interpretation of the standard.
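
To spell out the regrouping that passage is blessing (my example, not the committee's): with a = INT_MAX, b = 1, c = -2, the written grouping (a + b) + c overflows, while the regrouped a + (b + c) quietly yields INT_MAX - 1; since the written form was already undefined, the rationale's position is that the regrouped result "couldn't be any worse".

    int sum3(int a, int b, int c)
    {
        /* May be evaluated as a + (b + c); if (a + b) + c as written   */
        /* overflows, behavior was undefined anyway, so any regrouping  */
        /* "couldn't be any worse".                                     */
        return (a + b) + c;
    }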

[1] What happens if you want to drop both the third and fourth option is the point of the paper, with the consensus seeming to be "you don't want to do both at the same time."

kazinator|1 year ago

This drivel is posted in a private blog precisely in order to evade expert arguments.

mianos|1 year ago

It's quite ironic that a blog named 'keeping simple' is summarised by GPT as:

"In summary, the essay employs a hyperbolic tone to argue that the prevailing interpretation of undefined behavior has severely compromised the utility and stability of C. While it raises valid points about the implications of undefined behavior, the dramatic language and sweeping claims might make the situation appear more catastrophic than is universally agreed upon."

I know it's bad form to quote GPT, but I could not say this better.

As someone who writes C and C++ every day of the week I feel I just wasted 30 minutes of my life reading it and the arguments.