hashmash | 4 months ago
Having more type conversion headaches is a worse problem than having to use `& 0xff` masks when doing less-common, low-level operations.
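The `& 0xff` mask comes up because Java's `byte` is signed, so assigning it to a wider type sign-extends the value. A minimal illustration:

```java
public class UnsignedByteDemo {
    public static void main(String[] args) {
        byte b = (byte) 0xF0;    // bit pattern 1111_0000, which Java reads as -16
        int signExtended = b;    // widening sign-extends: -16
        int masked = b & 0xFF;   // masking keeps only the low 8 bits: 240
        System.out.println(signExtended + " vs " + masked); // -16 vs 240
    }
}
```

The mask is only needed in the low-level cases (byte-stream parsing, checksums, bit packing) the comment refers to; everyday arithmetic never sees it.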
fleventynine | 4 months ago
The same way you pass a 64-bit integer to a function that expects a 32-bit integer: a conversion function that raises an error if it's out of range.
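Java already ships such a checked narrowing function for the long-to-int case: `Math.toIntExact` (JDK 8+) throws `ArithmeticException` when the value is out of range. A short sketch of the pattern:

```java
public class NarrowingDemo {
    public static void main(String[] args) {
        // Fits in an int: conversion succeeds.
        int ok = Math.toIntExact(42L);
        System.out.println(ok); // 42

        // Out of int range: conversion fails loudly instead of silently truncating.
        try {
            Math.toIntExact(1L << 40);
        } catch (ArithmeticException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

A hypothetical uint/int conversion could follow the same shape: succeed when the value fits, raise otherwise.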
hashmash | 4 months ago
When you need to pass a long where an int is expected, the usual pattern is to overload the necessary methods to accept longs. Following the same pattern for uint/int conversions, the safe option is to work with longs, since that eliminates the possibility of any conversion errors.
Now if we're talking about signed and unsigned 64-bit values, there's no 128-bit type to upgrade to. Personally, I've never run into this issue, considering that 63 bits of integer precision is massive. Unsigned longs don't seem that critical.
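Both halves of this comment map onto real JDK 8+ APIs: widening an unsigned 32-bit value into a long, and, since there is no wider primitive for 64-bit, the unsigned helper methods on `Long`:

```java
public class UnsignedLongDemo {
    public static void main(String[] args) {
        // 32-bit case: "upgrade" to long via Integer.toUnsignedLong.
        int u = 0xFFFFFFFF;                        // bit pattern of unsigned 4294967295
        long widened = Integer.toUnsignedLong(u);
        System.out.println(widened);               // 4294967295

        // 64-bit case: no 128-bit primitive to widen into, so the JDK
        // reinterprets long bits via static helpers instead.
        long big = -1L;                            // bit pattern of unsigned 2^64 - 1
        System.out.println(Long.toUnsignedString(big));        // 18446744073709551615
        System.out.println(Long.compareUnsigned(big, 1L) > 0); // true: 2^64-1 > 1 unsigned
    }
}
```

`Long.divideUnsigned` and `Long.remainderUnsigned` round out the set, which is why unsigned longs are workable without a dedicated type, if somewhat verbose.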
MBCook | 4 months ago
I think the only answer would be that you can't interact directly with signed stuff: `new uint(42)`, `ulong.valueOf(795364)`, `myUValue.tryToInt()`, or something like that.
Of course if you’re gonna have that much friction it becomes questionable how useful the whole thing is.
It’s just my personal pain point. Like I said, I haven’t had to do it much, but when I have, it’s been about the most frustrating thing I’ve ever done in Java.
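A minimal sketch of the kind of wrapper the comment above imagines; `UInt`, `of`, and `tryToInt` are illustrative names taken from the comment, not an existing JDK API:

```java
// Hypothetical unsigned-int wrapper: stores raw bits, converts explicitly.
public final class UInt {
    private final int bits; // two's-complement bits, interpreted as unsigned

    private UInt(int bits) { this.bits = bits; }

    // Checked construction from a signed long.
    public static UInt of(long value) {
        if (value < 0 || value > 0xFFFFFFFFL) {
            throw new IllegalArgumentException("out of uint range: " + value);
        }
        return new UInt((int) value);
    }

    // Always safe: every uint fits in a long.
    public long longValue() { return Integer.toUnsignedLong(bits); }

    // Fails when the unsigned value exceeds Integer.MAX_VALUE —
    // exactly the friction point the comment describes.
    public int tryToInt() {
        if (bits < 0) {
            throw new ArithmeticException("does not fit in int");
        }
        return bits;
    }

    @Override public String toString() { return Integer.toUnsignedString(bits); }
}
```

Every boundary crossing with signed code goes through `of` or `tryToInt`, which makes the arithmetic safe but, as the comment says, adds enough ceremony that the value of the whole exercise becomes debatable.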