None of the alternatives have stability. What was exemplary & idiomatic Rust pre-pandemic would be churlish & rejected now, and the toolchain use would be different anyway.
Carpenters, plumbers, masons & electricians work on houses 3-300 yrs old, navigate the range of legacy styles & tech they encounter, and predictably get good outcomes.
Only C has, yet, given us that level of serviceability. C99, baby, why pay more?
When there’s an alternative that can compete with that sort of real-world durability, C will get displaced.
Having just finished renovating a 140-year-old home with solid brick walls that was slowly collapsing and deteriorating due to the aforementioned professionals’ application of modern and chemically incompatible materials to it… I’m not sure I agree. It’s also why a lot of the UK’s building stock is slowly rotting with black mould right now. Literally none of the professionals I hired, before I trained them, knew how to properly repair a type of home that represents 30% of the UK building stock.
> Only C has, yet, given us that level of serviceability.
On the contrary, Lisp outshines C to a large degree here. Success has nothing to do with technical merit (if such a thing even exists), it's not a rational game.
FWIW, the crabi project within rust is trying to improve on some parts of it. But it is still built on (a subset of) the environments c ABI. And it doesn't fix all the problems.
C99, but with a million macros backporting features from newer language versions and compiler extensions. Lovely features you don't get with ordinary C99:
- free_sized
- #embed
- static_assert
- Types for enum
- alignof, alignas, aligned_alloc
- _Atomic
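For illustration, here is a sketch of how one such backport (static_assert) is commonly faked in plain C99. The macro name and scheme below are just one widespread pattern, not taken from any particular project:

```c
/* C99 backport of C11's static_assert: a false condition produces a
   negative array size in a typedef, which the compiler must reject
   at compile time. */
#define STATIC_ASSERT(cond, name) \
    typedef char static_assert_##name[(cond) ? 1 : -1]

/* Fails to compile on any platform where int is narrower than 16 bits. */
STATIC_ASSERT(sizeof(int) >= 2, int_is_at_least_16_bits);
```

The trick works because array sizes must be positive; a true condition yields a harmless one-element typedef, a false one yields a diagnostic.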
The replacement has already happened. It is HTTP and JSON for 99% of the software developed today. C stayed for multiple reasons, but the most obvious ones for me are:
- People just stopped caring about operating systems research and systems programming after ~2005. Actual engineering implementations of the concepts largely stopped after the second half of the 90s. Most developers moved on to making websites or applications in higher-level programming languages.
- C hit a perfect balance: a small enough language to grok, independent of the system manufacturers, reflecting the computer architecture of the 80s, actually small in syntax and code length, and quite easy to implement compilers for. This caused lots of legacy software to be built into the infrastructure that gave birth to today's popular OSes and, more importantly, the infrastructure of the Internet. Add in the .com bubble and other crises, and we basically have/had zero economic incentive to replace those systems.
- Culture changed. We used to care more about stability, repairability and reusability. Computers were expensive. So were programmers and software. Now computers are cheap. Our culture is more consumerist than ever. The mentality of "move fast and break things" permeated economic policy and the zeitgeist. With AI it will get worse. So trying to make a real alternative to C (as a generic low-level OS protocol) has reduced cultural value / optics. It doesn't fill CVs as well.
That doesn't mean people haven't tried or even succeeded. Android was successful on multiple fronts in replacing C. Its "intents" and low-level interface description language for hardware interfaces are a great replacement for the C ABI. Windows' COM is also a good replacement that gets rid of language dependence. There are still newer OSes trying, like Redox or Fuchsia.
It’s weird how whiny this post is. Like there’s zero intellectual curiosity about why C got this way, and why C gets to be the foundation for how systems software is written.
I could write a whole essay about why, but now isn’t the time. I’m just going to enjoy the fact that TFA and the author don’t get it.
> why C gets to be the foundation for how systems software is written.
Is there an answer here more interesting than "it's what Unix and Windows were written in, so that's how programs talked to the OS, and once you have an interface, it's impossible to change"?
The author is upfront about their goals and motivations and explicitly acknowledges that other concerns exist. Calling it whiny is ungracious -- the author is letting some very human frustration peek through in their narrative.
Not everything has to be written with all the warmth and humanity of a UN subcommittee interim report on widget standardisation.
- unspecified default type sizes. Should have had i8, u16, i32, u64, f32, f64 from the beginning.
- aliasing pointers being restricted by default (i.e. an alias keyword should have been added instead). Performance matters. All these benchmarks which show something beating C or C++ are mostly due to dealing with aliasing pointers. C++26 still doesn't have a standardised restrict keyword.
There are more but I understand the logic/usability/history behind them. The above points should have been addressed in the 80’s.
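To illustrate the aliasing point above, a minimal sketch (the function names are made up for the example): the only difference between the two loops is the restrict qualifier, which lets the compiler assume the arrays don't overlap.

```c
/* Without restrict, the compiler must assume dst and src may overlap,
   so it conservatively reloads src[i] after every store to dst[i]. */
void add_loops(float *dst, const float *src, int n) {
    for (int i = 0; i < n; i++)
        dst[i] += src[i];
}

/* With restrict, the caller promises no overlap, so the compiler can
   vectorize the loop without runtime alias checks. */
void add_loops_restrict(float *restrict dst, const float *restrict src, int n) {
    for (int i = 0; i < n; i++)
        dst[i] += src[i];
}
```

Comparing the generated assembly of the two (e.g. at -O2) is usually the quickest way to see the cost of assumed aliasing.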
Not the error "handling"? The array implementation? The weak type system? The barebones macro system? The nearly unusable standard library? The standard itself, a 750-page tome you have to memorize, lest C is allowed to erase your hard drive?
Which systems in the 70s or even 90s would have had good hardware support for all these lengths?
I think overall it's perhaps better that the thing named int works on both the Arduino and my amd64; otherwise we might needlessly need to rename all types for each platform to fit the most natural type lengths there. Imagine the 32-bit to 64-bit transition under those conditions.
> - unspecified default type sizes. Should have had i8, u16, i32, u64, f32, f64 from the beginning.
I strongly disagree. The programmer should rather prescribe intent and shouldn't constantly think about what size this should exactly have. Does this variable represent the size of an object, an address, a difference or just a random positive integer? Then use size_t, uintptr_t, ptrdiff_t and unsigned int respectively. Why should I care what exact sizes these are? I hate that modern (system) languages completely throw away that concept. "I want an unsigned integer" "oh, you mean u32" "No! I really mean an unsigned integer."
Also, when I do want e.g. 32 bits available, there is no reason I need to use suboptimal 32-bit wrapping behaviour when I don't need it. The correct type to use for computation there would be uint_fast32_t, and the implementation chooses what makes sense to use for it.
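A small sketch of that intent-first style in standard C (the functions here are hypothetical, for illustration only): each type names what the value means, and <stddef.h>/<stdint.h> pick the width per platform.

```c
#include <stddef.h>   /* size_t, ptrdiff_t */
#include <stdint.h>   /* uint_fast32_t */

/* Distance between two pointers into the same buffer, expressed with
   intent-based types rather than hard-coded widths. */
static size_t span_bytes(const void *start, const void *end) {
    ptrdiff_t d = (const char *)end - (const char *)start; /* a difference */
    return (size_t)(d < 0 ? -d : d);                       /* a size */
}

/* uint_fast32_t: "at least 32 bits, whatever width is fastest here". */
static uint_fast32_t byte_sum(const unsigned char *p, size_t n) {
    uint_fast32_t sum = 0;
    for (size_t i = 0; i < n; i++)
        sum += p[i];
    return sum;
}
```

Note that sizeof(uint_fast32_t) may legitimately differ between platforms; the guarantee is only the minimum width.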
I really don't understand why people keep misunderstanding this post so badly. It's not a complaint about C as a programming language. It's a complaint that, due to so much infrastructure being implemented in C, anyone who wants to interact with that infrastructure is forced to deal with some of the constraints of C. C has moved beyond merely being a programming language and become the most common interface for in-process interoperability between languages[1], and that means everyone working at that level needs to care about C even if they have no intention of writing C.
It's understandable how we got here, but it's an entirely legitimate question - could things be better if we had an explicitly designed interoperability interface? Given my experiences with cgo, I'd be pretty solidly on the "Fuck yes" side of things.
(Of course, any such interface would probably end up being designed by committee and end up embodying chunks of ALGOL ABI or something instead, so this may not be the worst possible world but that doesn't mean we have to like it)
[1] I absolutely buy the argument that HTTP probably wins out for out of process
I don't see that as a problem. C has been the bedrock of computing since the 1970s because it is the most minimal way of speaking to the hardware in a mostly portable way. Anything can be done in C, from writing hardware drivers, to GUI applications and scientific computing. In fact I deplore the day people stopped using C for desktop applications and moved to bloated, sluggish Web frameworks to program desktop apps. Today's desktop apps are slower than Windows 95 era GUI programs because of that.
> could things be better if we had an explicitly designed interoperability interface?
Yes, we could define a language-agnostic binary interoperability standard with its own interface definition language, or IDL. Maybe call it something neutral like the component object model, or just COM[1]. :)
[1] https://en.wikipedia.org/wiki/Component_Object_Model
Of course things could be better. That doesn’t mean that we can just ignore the constraints imposed by the existing software landscape.
It’s not just C. There are a lot of things that could be substantially better in an OS than Linux, for example, or in client-server software and UI frameworks than the web stack. It nevertheless is quite unrealistic to ditch Linux or the web stack for something else. You have to work with what’s there.
The trouble with C as an API format is that there's no size info. That's asking for buffer overflows.
There's an argument for full type info at an API, but that gets complicated across languages. Things that do that degenerate into CORBA. Size info, though, is meaningful at the machine level, and ought to be there.
Apple originally had Pascal APIs for the Mac, which did carry along size info. But they caved and went with C APIs.
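A sketch of the size-info point (the function here is hypothetical): a bare char * in a C API says nothing about capacity, while carrying the size makes the bound part of the interface and lets the callee limit its writes.

```c
#include <stdio.h>

/* void greet(char *out);  <- a typical C-style API: how big is out?
   The answer lives only in documentation, which is how overflows happen. */

/* Passing the capacity makes the contract explicit. */
static void greet(char *out, size_t cap) {
    snprintf(out, cap, "hello, %s", "world"); /* never writes past cap-1 chars + NUL */
}
```

This is exactly the information a Pascal-style string or bounded buffer carried in the interface itself.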
I mean… an ABI is more like an agreement. It isn’t a specification. It’d be nice if everything was neatly specified and sound. But as the author notes near the end… there’s a lot of platforms and they all have their own quirks.
There has to be an ABI that has to be agreed upon by everyone. Otherwise there wouldn’t be any interoperability. And if we didn’t have the SystemV ABI — what would we use instead? Prepare for a long debate as every language author, operating system designer, and platform under the sun argues for their respective demands and proposals. And as sure as the sun rises in the East someone, somewhere, would write an article such as this one decrying that blessed ABI.
SystemV shouldn’t be the be all and end all, IMO. But progress should be incremental. Because a lingua franca loses its primary feature and utility when we all return to our own fiefdoms and stop talking to one another in the common tongue.
It’s a pain in the metaphorical butt. But it’s better, IMO, than the alternatives. It’s kind of neat that SystemV works so well let alone at all.
This article isn't about languages. It's about the protocol for two or more languages to talk to each other. There is no specification for this.
The System V ABI is as close as we get to an actual specification but not everyone uses it and in any case it only covers a small part of the protocol.
We didn't do it to annoy you or to foist bad APIs on you. We did it because it was the best language for writing machine code at the time. By miles. Not understanding why this is true will lead you to make all the same mistakes the languages "bested" by C made.
I remember reading some crank on slashdot decades ago seriously claiming that Windows was not an operating system. This type of argument hasn’t gotten any more intelligent since.
Following up, I see that's not really what the author was trying to say: it's that for a growing number of purposes, like gluing together two languages that aren't C, the C ABI is still dominating things where the C language shouldn't even be involved.
Write clickbait, get kneejerk. I guess some other things are at least as old as C.
> Anyone who spends much time trying to parse C(++) headers very quickly says “ah, actually, fuck that” and asks a C(++) compiler to do it.
That's exactly my case. For my programming language I wrote a tool for converting C headers using libclang. Even with the help of this library it wasn't easy; I found a lot of caveats trying to convert headers like <windows.h>.
I always thought that C was a stepping stone to learn other languages. Like Pascal, it was educational to learn. My Comp Sci courses in 1986-1990 used Turbo Pascal and Turbo C.
I think so too; for most devs C is like Latin, or Roman Law: not something we develop and use, but rather something we learn for context and to understand future developments.
There are some people that still develop in C for sure, but it's limited to FOSS and embedded at this point, low-level proprietary systems having migrated to C++ or Rust mostly.
I agree with the main thesis that C isn't a language like the others, something that we practice, that it's mostly an ancient but highly influential language, and it's an API/ABI.
What I disagree with is that 'critiquing' C is productive in the same way that critiquing Roman Law or Latin or Plato is productive, the horse is dead, one might think they are being clever or novel for finding flaws in the dead horse, but it's more often a defense mechanism to justify having a hard time learning the decades of backwards compatibility, edge cases and warts that have been developed.
It's easier to think of the previous generation as being dumb and having made mistakes that could have been fixed, and that it all could be simpler, rather than recognize that engineering is super complex and that we might as well dedicate our full life to learning this craft and still not make a dent.
I applaud the new generation for taking on this challenge and giving their best shot at the revolution, but I'm personally thinking of bridging the next-next generation and the previous generation of devs. The historical complexity of the field will increase linearly with time, and I think if we pace ourselves we can keep the complexity down; the more times we hop onto a revolution that disregards the previous generation as dumb, the bigger the complexity is going to be.
C isn't really the protocol though, it's just a way every language has of exporting simple symbols. Otherwise how else does this work? You'd never get every language to understand every other language's symbol name encoding scheme, some of which are complete gibberish (I'm looking hard your way, C++).
The real protocol in action here is symbolic linking and hardware call ABIs.
You could always directly call Rust functions, but you'd have to know where to symbolically look for them and how to craft its parameters for example.
If this is well defined then it's possible. If it's poorly defined or implementation-specific (C++) then yeah, it's a shit show that is not solvable.
OP, I would strongly recommend you try to play devil's advocate against your own case before making it, so you have a chance to refine your thoughts and rebut at least the top 2-3 counterarguments. More than that, you would do well to understand the concept of Chesterton's Fence as it applies to engineering:
"In the matter of reforming things, as distinct from deforming them, there is one plain and simple principle; a principle which will probably be called a paradox. There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, "I don't see the use of this; let us clear it away." To which the more intelligent type of reformer will do well to answer: "If you don't see the use of it, I certainly won't let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.""
-G.K. Chesterton
#1: What you consider a "language" is one with its own official runtime. By your logic, JavaScript is not a language either, because the interpreter is usually written in C++ (e.g. V8, SpiderMonkey).
#2: Dogfooding a compiler and a runtime is common practice because most people believe that it will help the makers of a language identify bugs in their implementation or identify undefined behavior in the very definition of their language. However, every language must bootstrap from another language: most languages bootstrapped from a compiler written in C. If you trace the lineage from compiler to compiler to ... compiler, you'll most likely find that the second C compiler ever written was written in C and the first compiler was written in some architecture's assembly language. The same still applies for the new, new language of C99 which is less than half the age of C. C is so mature that rooting out new undefined behaviors is no longer a significant concern.
#3: libc is not just a "language runtime", it's the interface into the OS kernel for most operating systems. The runtime for every "real language" as you describe it ultimately uses libc. It doesn't make sense to separate the functions in the C99 standard from the OS's own syscalls and special-purpose userland functions: programs call C standard library functions, and C's standard library relies on the OS's special functions. If they were separate libraries, there would be a circular dependency.
Think of C not as "not a language" but as a language that serves the unusual purpose of being the foundation for nearly everything else.
You have a point on integer types being a mess, but we have stdint.h for that.
Does someone pay for anti-C propaganda?? All those logic-breaking accusations...
E.g. here, from memory:
> ...you want to read 32 bits from a file but OH NOOES long is 64 bits! The language! The impossibility!
But when you read something to deserialize some format, you just need to know based on the format schema or domain knowledge. Simple and straightforward like that! You do not do "reflection" on what the language standard provides and then expect someone to send you just that!!
So that anti-C "movement" is mostly based on brainless examples.
Not saying C is perfect.
But it is very good, and I bet IBM and other big corps will keep selling things written and actively developed in C/C++, plus adding hefty consulting fees.
In the meantime the proles have been advised to move to cpu-cycle-eating inferior languages and layers over layers of cycle-burning infra in cloud-level zero-privacy with guaranteed data leaks.
Oh, btw, that famous Java "bean" is just an object with, usually, language-delivered "basic types"... How should that poor programmer from the article know what to read from disk when he just has the types Java provides?? How? Or maybe he should use some domain knowledge or a schema for the problem he is trying to solve??
And in a "scripting language" with automatic ints - how to even know how many bits the runtime/VM actually uses? Maybe some reflection to check the type? But again, how does that even help if there is no knowledge in the brain of how many bits should be read?? But calling some cycle-burning reflection or virtual and as-indirect-as-possible things is what the fat tigers love the most :)
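The serialization point above is fair: the format, not the platform, dictates what you read. A minimal sketch (the helper name is made up for the example):

```c
#include <stdint.h>

/* The format says "32-bit little-endian integer", so read exactly four
   bytes and assemble them - no matter what size long is on this machine. */
static uint32_t read_u32_le(const unsigned char *p) {
    return (uint32_t)p[0]
         | ((uint32_t)p[1] << 8)
         | ((uint32_t)p[2] << 16)
         | ((uint32_t)p[3] << 24);
}
```

Assembling bytes by shifting also sidesteps host endianness entirely, which a raw memcpy into a uint32_t would not.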
This is just an ad hominem attack. Doesn't seem like the author is "in over their head"; they seem to have a pretty solid grasp of actual identifiable gaps between implementations and various specs, and the article was written with the same kind of "chastising" tone as you would see from any grey-bearded hacker who's unsatisfied with the way things are.
This is the same author who found and reported ABI incompatibilities between clang and gcc: https://faultlore.com/abi-cafe/book/trophies.html — Given these incompatibilities have since been addressed, I think it's fair to say there were indeed issues with C.
pjmlp|24 days ago
Even the most relevant C compilers are no longer written in C.
krior|25 days ago
C is sin incarnated.
abcd_f|25 days ago
As you said, it's easy to see where it came from, but it should've been fixed long ago.
SanjayMehta|25 days ago
Verilog is loosely based on C. Most designs are done in Verilog.
hacker_homie|25 days ago
The whole world shouldn't "need to be fixed" because you won't spend the time to learn something.
Rust doesn't even have a stable internal ABI; that's why you have to recompile everything all the time.
anthk|24 days ago
With C++ it's the same. Within the Haiku code it's half understandable; the whole spec is enough to drive you mad in days.
mayhemducks|24 days ago
Good read though. Thinking about C as not just a language but also a protocol is a different perspective that is useful for the mental model.
1vuio0pswjnm7|23 days ago
But Rust evangelists cannot stop people from using any particular language
Assembly is the only language that matters. It's the only language the computer "understands"
Every other language is just an abstraction over it
When it comes to abstractions not everyone has the same preferences