top | item 19420752

lordmauve | 7 years ago

Please don't use "word" to mean 16 bits. In an era when machine words are generally 64 bits, we're not talking about an anachronism from the previous generation, but the one before that - in a language that is completely insulated from the actual machine architecture.

One thing I love about Rust is that it uses u16, u32, u64 etc for unsigned and i16, i32, i64 etc for signed, which is about perfect - clear, concise and future-proof. That would be perfect for this library.

https://en.wikipedia.org/wiki/Word_(computer_architecture)

phkahler | 7 years ago

>> One thing I love about Rust is that it uses u16, u32, u64 etc for unsigned and i16, i32, i64 etc for signed, which is about perfect

Yes, even good old C has uint16_t and int16_t for this. I use these exclusively for embedded work because we care about the size of everything. Also agree that Rust gets it right by using a single character with the size: u16, i16.

It's funny because C opted to leave the number of bits machine dependent in the name of portability, but that turns out to have the opposite effect.

ISV_Damocles | 7 years ago

> It's funny because C opted to leave the number of bits machine dependent in the name of portability, but that turns out to have the opposite effect.

That depends on what you consider the portable part. In the era of proprietary mainframes operating on proprietary datasets, data portability probably didn't matter as much as code portability to perform the same sort of operation on another machine.

C's size-less `int`, `float`, etc. let the exact same code compile at the native (high-speed) sizes of each platform without any editing, `ifdef`s, etc.

(Side note: That's what bothers me a lot about the language wars -- the features of a language are based on the trade-offs between legibility, performance, and the environment from the era they were intended to be used in. Often both sides of those spats fail to remember that.)

kazinator | 7 years ago

> good old C has uint16_t

Firstly, no, good old C doesn't. These types are a relatively recent addition (C99). By 1999 there were already decades of good old C without int16_t.

It is implementation-defined whether there is an int16_t; so C doesn't really have int16_t in the sense that it has int.

> It's funny because C opted to leave the number of bits machine dependent in the name of portability, but that turns out to have the opposite effect.

Is that so? This code will work nicely on an ancient Unix box with a 16-bit int, or on a machine with a 32- or even 64-bit int:

  #include <stdio.h>
  int main(void)
  {
    int i;
    char a[] = "abc";
    for (i = 0; i < sizeof a - 1; i++) /* sizeof a counts the trailing '\0' */
      putchar(a[i]);
    putchar('\n');
    return 0;
  }
Write a convincing argument that we should change both ints here to int32_t or whatever for improved portability.

PorterDuff | 7 years ago

I've always wondered if someone wrote a close-to-the-metal C compiler for the CDC 6400.

60 bit words, 18 and 60 bit registers, 6 bit characters.

kazinator | 7 years ago

"word" for "16 bits" is not so much an anachronism as an Intel-Microsoftism.

In computing, "word" is understood to be machine dependent: "what is that machine's word size?"

vanderZwan | 7 years ago

I suspect this unfortunate naming has something to do with the fact that JavaScript strings are encoded in UTF-16. Having said that, I completely agree with you.