I have a question I've always wanted to ask but have been too embarrassed to (especially because I've used C extensively for well over a decade now and am intimately familiar with it):
Who exactly are these new C standards for?
I interact with and use C on an almost daily basis. But almost always ANSI C (and sometimes C99). This is because every platform, architecture, etc. has at least an ANSI C compiler in common, so it serves as the lowest common denominator for platform-independent code. As such it also serves as a good target for DSLs, as a sort of portable assembly. But when you don't need that, what's the motivation to use C then? If your team is up-to-date enough to quickly adopt C23, then why not just use Rust or (heaven forbid) C++23?
I'd love to hear from someone who does actively use "modern" C. I would love to be a "modern C" developer - I just can't see its purpose.
An example: The C11 memory model + <stdatomic.h> + many compilers supporting C11 has had a positive impact on language runtimes. Portable CAS!
> If your team is up-to-date enough to quickly adopt C23, then why not just use Rust or (heaven forbid, C++23)?
Another example: If you're programming e.g. non-internet-connected atomic clocks with weather sensors like those produced by La Crosse, then there's no real security model to define, so retraining an entire team to use Rust wouldn't make much sense. (And, yes, I know that Rust brings with it more than just memory safety, but the semantic overhead comes at a cost.)
Another example: Writing the firmware to drive an ADC and broker communication with an OS driver.
In my case: because writing C code (specifically C99 or later - designated initializers and compound literals!) gives me joy in a way that neither C++ nor Rust provides (C++ was my go-to language for nearly two decades, between ca. 1998 and 2017), and I tinkered with Rust a couple of years ago, enough to realize that I don't much enjoy it.
IMHO, both C++ and Rust feel too much like puzzle solving ("how do I solve this problem in *C++*?" or "how do I solve this problem in *Rust*?"). When writing C code, the programming language disappears and it simply becomes "how do I solve this problem?".
PS: I agree that the C standard isn't all that relevant in practice, though; you still need to build and test your code across the relevant compilers.
I'm on WG14, and I, like you, only use C89. So why does C23 matter? In terms of features it matters very little, but a big part of WG14's work is clarifying omissions from previous standards. So when C23 specifies something that has been unclear for 30+ years, compiler developers backport it into older versions of C where it was simply unclear. It matters a lot for things like the memory model.
Existing C code needs to be maintained, and can take advantage of the newer features when available in the compiler. The Linux kernel is moving to C11, and may move to C17/C23 later. Also not everyone wants to put up with the compilation times, object sizes, and aesthetics of Rust.
For me and many colleagues in my lab? C is quite big in scientific computing and signal processing. Fortran would be slightly better, and it is widely used, but not directly around me. The C99 standard, which added complex numbers and variable-length arrays, was truly a godsend in the field. I cannot imagine working without it.
If you write a numerical algorithm that needs to be run 15 years from now, then C and Fortran are possibly the sanest choices. If you do something in other, fancier, languages, you can be sure that your code will stop working in a few years.
The new C standards are really minor changes to the language, and they happen in the span of a decade. It is quite easy to be up to date. And in the rare case that your old code stops compiling, the previous (less than a handful) versions of the language are all readily available as compiler options in all compilers. You can be reasonably sure that a C program written today will still compile and run in 20 years. You can be 100% sure that a python+numpy program won't. If you care about this (for example, if you are writing a new linear algebra algorithm to factor matrices), then choosing C is a rational, natural choice.
Keep in mind that if you want to write probably-maybe-correct code, Rust is maturing to the point where it can get you there more easily than C. But if you want actually-correct code, you need to do the legwork regardless of language; and C has a much more mature ecosystem (things like CompCert C, etc.) that lets you do much of the analysis portion of that legwork on C code, instead of on generated assembly code as you'd have to do for Rust. Combine that with verification costs that don't vary much from language to language, and there's a long future where, for safety-critical applications, there's no downside to C: the cost of verification and analysis swamps the cost of writing the code, and the cost of qualifying a new language's toolchain would be absurd. For this reason, C has a long, long future as one of the few languages (along with Ada, where some folk are making a real investment in tool qualification) for critical code; and even if it takes a decade for C23 features to stabilize and make it to this population, well, we'll still be writing C code well beyond '33.
Are you asking about greenfield development only? One big obvious reason to use C23 instead of Rust or C++23 is if you already have a codebase written in C. Switching to C23 is a compiler flag; switching to Rust is a complete rewrite.
Places that are just now adopting C11 will probably adopt C23 in 12 years? C++ is (unfortunately, IMO) making inroads into embedded, but C is also still pretty widely used.
My usage is similar to yours, but new C standards still benefit me because I can opportunistically detect and make use of new features in a configure script.
To use my baby as an example: free_sized(void *ptr, size_t alloc_size) is new in C23. I can detect whether or not it's available and use it if so. If it's not available, I can just fall back to free() and get the same semantics, at some performance or safety cost.
Many, many, many teams writing C won't be using C23 the day it's out, but these changes have to go in now if the people who always use a 10-year-old standard are to have these features available 10 years from now.
I'm also a C developer, but I do use the more modern versions.
There are four big reasons why:
* Atomics. These are the biggest missing feature in older C.
* Static asserts. I can't tell you how much I love being able to put in a static assert to ensure that my code doesn't compile if I forget to update things. For example, I'll often have static constant arrays tied to the values in an enum. If I update the enum, I want my code to refuse to compile until I update the array. I have 20 instances of static asserts in my current project.
* `max_align_t`. It's super useful to have a type that has the maximum alignment possible on the architecture.
* `alignof()` and friends. It's super useful to get the alignment of various types. Combined with `max_align_t`, it is actually possible to safely write allocators in C. Previously, it wasn't really possible to do safely or portably. And I have at least three allocators in my current project.
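A sketch of two of those points: the enum-to-array static assert, and the kind of alignment helper an allocator needs. The names (`Color`, `align_up`) are hypothetical, and `align_up` assumes the maximum alignment is a power of two, which it is on common architectures:

```c
#include <assert.h>   /* static_assert (C11's _Static_assert) */
#include <stdalign.h> /* alignof (C11) */
#include <stddef.h>   /* max_align_t, size_t */

typedef enum { COLOR_RED, COLOR_GREEN, COLOR_BLUE, COLOR_COUNT } Color;

static const char *const color_names[] = { "red", "green", "blue" };

/* Refuse to compile if the enum grows but the array doesn't. */
static_assert(sizeof(color_names) / sizeof(color_names[0]) == COLOR_COUNT,
              "color_names is out of sync with Color");

/* Round a size up to the strictest alignment: the core move when
   carving typed allocations out of a raw byte buffer. */
static size_t align_up(size_t n)
{
    size_t a = alignof(max_align_t);
    return (n + a - 1) & ~(a - 1);
}
```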
You're right that C11 doesn't have nearly the reach that ANSI C does, but it does have slightly more than Rust, and much more if you consider Rust's tier 3 support to be iffy, which I do.
And it does have one HUGE advantage against Rust: compile times. On my 16-core machine, I can do a full rebuild in 2.5 seconds. If I changed one file in Rust, it might take that long just to compile that one file.
That's not to say Rust is without advantages; one of my allocators is designed to give me as much of Rust's borrow checker as possible, on top of APIs designed around that fact.
tl;dr: I use modern C for a few features not found in C89, for the slightly better platform support against Rust, and for the fast compiles.
Outside our bubble, there’s an _ocean_ of embedded software/firmware and lower level library stuff, on up-to-date platforms, written in C by people or teams that would find switching to Rust just a _massive_ chore. I’d guess there is at least an order of magnitude more of this than Rust.
And I certainly appreciated C11 when writing Objective-C, so I’m sure people with large codebases of ObjC will appreciate it (though most will be using Swift for new features nowadays).
One thing I'm really looking forward to is standardization of binary literals. Bitwise masking makes a lot more sense with binary literals than hex literals.
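For example, with a hypothetical 8-bit status register, the field boundaries are visible at a glance in binary, where the hex equivalents (0x01, 0x08, 0x70) have to be decoded mentally:

```c
/* C23 binary literals (long available as a GCC/Clang extension). */
#define STATUS_READY     0b00000001  /* bit 0 */
#define STATUS_ERROR     0b00001000  /* bit 3 */
#define STATUS_MODE_MASK 0b01110000  /* bits 4-6: a 3-bit mode field */

/* Extract the mode field from a raw register value. */
unsigned decode_mode(unsigned reg)
{
    return (reg & STATUS_MODE_MASK) >> 4;
}
```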
Old software is very slow and expensive to change. Adopting a new C version doesn't require a failure-prone, expensive, synchronized, collective-action rewrite throughout your sector's supply chain, new tooling, platform runtime ports, etc. Rust would.
C provides the only stable ABI for Rust, and changes to the C++ ABI may also occur in the future. So the implications of new C standards for library code are especially relevant.
There is no "modern C", just "C with additional niceties". And those additions are usually low-key enough to be adopted by a good portion of the compilers out there.
When you have a C code base or experience with C those features may be enough not to make a complex transition.
Having a simple tool evolve a bit may be what you need as opposed to making the change to a much more complex tool.
This text is still in force, it seems:
“13. Unlike for C99, the consensus at the London meeting was that there should be no invention, without exception. Only those features that have a history and are in common use by a commercial implementation should be considered. Also there must be care to standardize these features in a way that would make the Standard and the commercial implementation compatible.”
I read this as saying that anything that gets standardized should be available in one of the major implementations. In practice, most of the qualifying features will have been implemented in both GCC and Clang in the same way, so for most users, there is not much benefit from standardization. Some may feel compelled to support the “standard” way and the “GCC/Clang” way in the same sources, using a macro, but that isn't much of a win in most cases. Of course, there will be shops that say, “we can't use a feature until it's in the standard”, but that never really made sense to me.
Things are considerably murkier on the library side. In my experience, library features rarely get standardized in the same way they are already deployed: names change, types change, behavioral requirements are subtly different. (Maybe this is my bias from the library side because I see more such issues.) For programmers, the problem of course is that typical applications do not get exposed to different compiler versions at run time, but it's common for this to happen with the system libraries. This means that the renaming churn that appears to be inherent to standardization causes real problems.
Others have said that new standards are an opportunity to clarify old and ambiguous wording, but in many cases the ambiguity hides unresolved conflict (read: different behavior in existing implementations) in the standardization committee. It's really hard to revise the wording without making things worse, see realloc.
So I'm also not sure what value standardization brings to users of GCC and Clang. Maybe it's different for those who use other compilers. But if standardization is the only way these other vendors implement widely used GCC and Clang extensions (or interfaces common to the major non-embedded C libraries), then the development & support mode for these other implementations does not seem quite right.
Not new in C23, but I still think it's a glaring hole in the standard that there's still no standard way to ask the compiler which (if any) of the "J.5 Common extensions" are supported.
For the C version you have __STDC_VERSION__, but there's no similar facility to check whether e.g. J.5.7 is supported, which effectively makes the behavior that's explicitly omitted in 7.22.1.4 and 6.3.2.3 go from "undefined" to "supported by C23 + the extension".
I understand why C can't have some generic "is this undefined?" test, but it seems weird not to be able to ask if extensions defined in the standard itself are in effect, as they define certain otherwise undefined behavior. The effect is that anyone using these extensions must be intimately familiar with all the compilers they're targeting.
>> One major change in the language is that two’s complement is now mandatory as a representation for signed types.
This pleases me greatly. Two's complement won decades ago. This also means they could define integer overflow as 2's complement rollover, which is almost universal but is still considered undefined behavior.
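To make the distinction concrete: unsigned arithmetic has always wrapped, while signed overflow stays undefined in C23 even though the representation is now fixed. A small sketch:

```c
#include <limits.h>

/* Well-defined: unsigned arithmetic is modular, so
   wrap_add(UINT_MAX, 1) is 0. */
unsigned wrap_add(unsigned a, unsigned b)
{
    return a + b;
}

/* Signed overflow remains undefined behavior even in C23, despite the
   mandated two's complement representation. The compiler may assume
   INT_MAX + 1 never happens (e.g. when reasoning about loop bounds):
       int bad(void) { int i = INT_MAX; return i + 1; }  // UB
   With GCC/Clang, -fwrapv opts into wraparound semantics as a
   compiler extension. */
```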
The content seems to be hosted in a GitLab repo with the raw view serving the HTML but with the wrong content type. The JS is there to fix that. Weird setup.
"Extended integer types may be wider than intmax_t". I'm sure there's a good reason for this, but intmax_t was introduced in C99, which says (in 7.18.1.5): "[intmax_t] designates a signed integer type capable of representing any value of any signed integer type".
That was already portable between 16 bit, 32 bit, 64 bit etc. Why is it that just because the compiler supports 128 bit or 256 bit integers that compiling in such a mode doesn't correspondingly update "[u]intmax_t"?
The linked page says they 'cannot be "extended integer types" in the sense of C17', but that printf() and scanf() should still support these?
Presumably because the size of intmax_t was set to something specific on its introduction and changing it now would constitute an ABI break most everywhere.
Technically, it would break ABIs. I agree: they should have just broken any ABI relying on intmax_t and made it definitionally unstable, instead of making it useless, which is what they've chosen to do.
I suppose because they are an ELF feature rather than a language feature?
Anyway you can (and should!) use -fvisibility=hidden and add __attribute__((__visibility__("default"))) to public symbols when writing a C library. It will make calls between non-visible symbols faster because the compiler doesn't have to generate code to handle ELF symbol interposition.
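A typical shape for that, sketched with a hypothetical macro and library name; build with -fvisibility=hidden and only what's marked gets exported:

```c
/* mylib.h - hypothetical public header */
#if defined(__GNUC__)
#  define MYLIB_API __attribute__((__visibility__("default")))
#else
#  define MYLIB_API
#endif

MYLIB_API int mylib_add(int a, int b);

/* mylib.c - with -fvisibility=hidden, everything not marked
   MYLIB_API stays internal to the shared object. */
static int internal_twice(int x) { return 2 * x; }

MYLIB_API int mylib_add(int a, int b)
{
    /* Call to a hidden symbol: the compiler knows it can't be
       interposed, so no PLT indirection is generated. */
    return internal_twice(a) / 2 + b;
}
```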
Not surprising, though. The issues surrounding if/how a system provides libraries of any kind (static or shared) are completely implementation-defined. The Standard doesn't have anything to say about them at all except for the program-wide symbol scope rules.
acuozzo | 3 years ago
Another example: The next Furby!
layer8 | 3 years ago
As for new developments, see for example https://news.ycombinator.com/item?id=33675462 which uses C11.
LAC-Tech | 3 years ago
There are a lot of reasons to use C23 over Rust:
- multiple compiler implementations
- works on more platforms
- a defined standard
- the ability to create self-referential data structures without hacky workarounds
- immediate, easy access to a large number of C libraries
(For the record, I like Rust, but the evangelism over the past half decade has been pretty ridiculous. Consider this counter-propaganda.)
torstenvl | 3 years ago
Example: https://pasteboard.co/VkjrJIOZzaiR.jpg (sorry for pasting code as an image, I'm on my phone)
zajio1am | 3 years ago
> they could define integer overflow as 2's complement rollover
No, they should not. Integer overflow is in most cases a logic error, like division by zero or a NULL pointer dereference, so it should stay undefined.
rwmj | 3 years ago
Edit: Ugh, it needs JavaScript to render simple HTML.
ainar-g | 3 years ago
https://icube-forge.unistra.fr/icps/c23-library/-/blob/main/...
Raw Markdown, for no-JS readers:
https://icube-forge.unistra.fr/icps/c23-library/-/raw/main/R...
jwilk | 3 years ago
https://archive.today/Hz81c
GrumpySloth | 3 years ago
Finally! My joy knows no bounds.
MaxBarraclough | 3 years ago
> it needs JavaScript to render simple HTML
Much of the time, neither does a C compiler.
turminal | 3 years ago
They seem pretty harmless; are they difficult to implement?