sjolsen's comments
sjolsen | 8 years ago | on: An Open Letter to the FCC
>Since the internet is instead a broadcast medium, it is much easier for big players to saturate the lines
The Internet is _not_ a broadcast medium. Data transfer over the Internet is by nature peer-to-peer.
>ISPs naturally believe that _someone_ should be paying for what they interpret as the "extra load" from these big players
Someone _is_ paying for the load for big players: the consumers who are accessing their sites.
>some households will use 30Mbit/s streaming content to 4 devices for 10 hours per day, and the next household, which pays the same monthly internet bill, uses 5Mbit/s or less to read predominantly-text based sites for 2 hours per day
30 Mb/s residential service already costs more than 5 Mb/s residential service.
>Net neutrality is fundamentally large internet companies looking to keep their costs down by using government force to prevent ISPs from charging them usage-based "carriage fees" or similar.
Net neutrality is fundamentally a consumer protection preventing ISPs from leveraging their natural monopolies on last-mile service to enact rent-seeking policies. Whether ISPs charge consumers directly for access to, say, competing content providers ("Watch Turner Classic Movies for free with Spectrum Basic 10 Mb/s, or upgrade to Spectrum Premium 10 Mb/s for only $19.99 per month to access Netflix, Hulu, and HBO!"), or they charge those content providers extra to peer their traffic and those content providers fold that into subscription fees, we're the ones getting bent over the barrel.
sjolsen | 9 years ago | on: How real are real numbers? (2004)
What do you mean by "arbitrary"? Do you mean uncomputable? How does analysis require the existence of uncomputable reals?
>All numbers ever written are rationals and thus countable
It is also possible to "write" computable irrational numbers, more or less by definition.
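For concreteness, a finite program is a perfectly good way to "write" an irrational number. A minimal sketch (the helper and bounds are mine): the first ten decimal digits of √2, produced with exact integer arithmetic only.

```cpp
#include <cstdint>

// floor(sqrt(n)) by binary search, using exact integer arithmetic only.
// Valid for n up to about 4e18, which is all this sketch needs.
std::uint64_t isqrt(std::uint64_t n) {
    std::uint64_t lo = 0, hi = 2000000000;  // sqrt(2e18) < 2e9
    while (lo < hi) {
        std::uint64_t mid = lo + (hi - lo + 1) / 2;
        if (mid * mid <= n) lo = mid;
        else hi = mid - 1;
    }
    return lo;
}

// floor(sqrt(2) * 10^9): the first ten decimal digits of an irrational
// number, emitted by a short, perfectly ordinary program.
std::uint64_t sqrt2_digits() {
    return isqrt(2000000000000000000ull);  // 2 * 10^18
}
```

The program is itself a finite description of the number: run it with larger powers of ten (and wider integers) and it enumerates as many digits as you like.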
sjolsen | 9 years ago | on: How real are real numbers? (2004)
It depends, as they say, on what the definition of is is. What does it mean for a number to exist if it cannot be described? What does it mean for a construction to exist if it cannot be constructed?
>if you confuse extant with useful you might end up believing that some random large integers aren't "there!"
The question isn't whether it's useful to claim the existence of uncomputable reals; the question is whether it's meaningful. A construction needn't be useful to be meaningful, but surely it must be meaningful to be useful.
sjolsen | 9 years ago | on: Wind and solar power are disrupting electricity systems
No. I refuse to accept a standard of discourse where one cannot even point out a problem without having already solved it. There are many, many hard problems with complex systems in the world, and no person, much less a complete outsider, can hope even to begin to address more than a very, very small number of them even if they dedicate their life to the task. Saying "fix it or shut up" is asinine.
sjolsen | 9 years ago | on: Wind and solar power are disrupting electricity systems
If I had a dollar for every time I saw a legitimate complaint about a complex system met with, "Well just reinvent it yourself from the ground up if you're so fucking smart," I'd be a rich man. It is not a constructive suggestion.
sjolsen | 9 years ago | on: C++ on Embedded Systems
There are exceptions (no pun intended) to this, namely exceptions, RTTI, global construction/destruction, and the intricacies of new/delete. As the article points out (as will anyone who advocates the use of C++ in constrained environments), none of these are really necessary to use C++ effectively and can be safely stubbed out.
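As a sketch of what "stubbing out" can look like in practice (the pool size and single-threaded assumption are mine, not the article's), one might build with the real GCC/Clang flags -fno-exceptions -fno-rtti and route global new/delete to a fixed pool:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdlib>

// Bump allocator standing in for heap-backed new. Assumptions: a
// single-threaded target, and allocations that are never reclaimed --
// both common in small firmware.
alignas(std::max_align_t) static std::uint8_t pool[65536];
static std::size_t pool_used = 0;

void *operator new(std::size_t n) {
    // Round up so every allocation stays maximally aligned.
    n = (n + alignof(std::max_align_t) - 1) & ~(alignof(std::max_align_t) - 1);
    if (n > sizeof pool - pool_used)
        std::abort();  // out of pool: trap rather than throw
    void *p = &pool[pool_used];
    pool_used += n;
    return p;
}

void operator delete(void *) noexcept {}  // no-op: the pool is never reclaimed
```

With exceptions disabled and new/delete pinned down like this, the remaining language (classes, templates, constexpr, RAII) costs nothing the equivalent C wouldn't.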
sjolsen | 9 years ago | on: Rust Sucks If I Fail to Write X
Data structures are exactly the abstractions built on top of raw memory. Rust is not really geared toward working with raw memory; it's geared toward working with abstractions that hide the raw memory (i.e., data structures). That's why writing data structures in Rust is hard, and it's also why that fact doesn't imply that "other things" (i.e., code that is not part of a data structure implementation) are hard.
In other words, the fundamental flaw with the assumption that there is a relationship between the difficulty of implementing data structures in Rust and the difficulty of writing applications in Rust is that the two involve very different programming models, and Rust has much more ergonomic support for one than the other.
sjolsen | 9 years ago | on: I Do Not Know C: Short quiz on undefined behavior (2015)
Depending on what you mean precisely by "x86," there is such a thing as an invalid address: the IA-32e architecture (or whatever you want to call Intel's flavour of 64-bit "x86") requires that the top <n> bits of an address all match (i.e., all be copies of the highest implemented address bit), where <n> is machine-dependent.
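The rule is easy to state in code. A minimal sketch, assuming the common 48-bit implementation (the width is the machine-dependent part): an address is in canonical form iff its top 17 bits are all zeros or all ones.

```cpp
#include <cstdint>

// Canonical-form check for a 48-bit x86-64 implementation: bits 47..63
// must all be copies of bit 47, i.e. the top 17 bits are all 0s or all 1s.
// The bit width (48 here) is the machine-dependent part.
bool is_canonical_48(std::uint64_t addr) {
    std::uint64_t high = addr >> 47;      // bits 47..63, 17 bits
    return high == 0 || high == 0x1FFFF;  // all zeros or all ones
}
```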
sjolsen | 9 years ago | on: I Do Not Know C: Short quiz on undefined behavior (2015)
The shortcoming of this interpretation is that programs are not (only) consumed by humans; they're consumed by computers as well. Computers are not at all like humans: there is no such thing as "understanding" or "obtuseness" or even "ideas." You cannot reasonably rely on a computer program, in general, to take arbitrary (Turing-complete!) input and do something reasonable with it, at least not without making compromises on what constitutes "reasonable."
Along this line of thinking, the purpose of the standard is not to generate assembly code; it's to pin down exactly what compromises the compiler is allowed to make with regard to what "reasonable" means. It happens that C allows an implementation to eschew "reasonable" guarantees about behavior for things like "reasonable" guarantees about performance or "reasonable" ease of implementation.
Now, an implementation may choose to provide stronger guarantees for the benefit of its users. It may even be reasonable to expect that in many cases. But at that point you're no longer dealing with C; you're dealing with a derivative language and non-portable programs. I think that for a lot of developers, this is just as bad as a compiler that takes every liberty allowed to it by the standard. The solution, then, is not for GCC and LLVM to make guarantees that the C language standard doesn't strictly require; the solution is for the C language standard to require that GCC and LLVM make those guarantees.
Of course, it doesn't even have to be the C language standard; it could be a "Safe C" standard. The point is that if you want to simultaneously satisfy the constraints that programs be portable and that compilers provide useful guarantees about behavior, then you need to codify those guarantees into some standard. If you just implicitly assume that GCC is going to do something more or less "reasonable" and blame the GCC developers when it doesn't, neither you nor they are going to be happy.
sjolsen | 9 years ago | on: I Do Not Know C: Short quiz on undefined behavior (2015)
That said, I would agree that, on the whole, C leans too heavily on under-specified behavior of every variety. It's just not an absolute.
sjolsen | 9 years ago | on: I Do Not Know C: Short quiz on undefined behavior (2015)
But that's exactly what undefined behavior means.
The actual problem is that programmers are surprised -- that is, programmers' expectations are not aligned with the actual behavior of the system. More precisely, the misalignment is not between the actual behavior and the specified behavior (any actual behavior is valid when the specified behavior is undefined, by definition), but between the specified behavior and the programmers' expectations.
In other words, the compiler is not at fault for doing surprising things in cases where the behavior is undefined; that's the entire point of undefined behavior. It's the language that's at fault for specifying the behavior as undefined.
In other other words, if programmers need to be able to rely on certain behaviors, then those behaviors should be part of the specification.
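A concrete illustration of the gap (a sketch; the function names are mine): signed overflow is undefined in C and C++, so the "obvious" wraparound check `x + 1 < x` is exactly the kind of expectation the specification does not back up -- a compiler may fold it to false. The same intent expressed through specified behavior holds up:

```cpp
#include <climits>
#include <cstdint>

// Undefined: `x + 1 < x` invokes signed overflow when x == INT_MAX, so the
// compiler may assume it is always false. Instead, say what you mean in
// terms the standard actually defines:
bool will_overflow(int x) {
    return x == INT_MAX;  // well-defined for every int
}

std::uint32_t wrapping_inc(std::uint32_t x) {
    return x + 1u;        // unsigned wraparound is fully specified
}
```

If programmers need wraparound, this is the shape the guarantee has to take: written into the specification (as it is for unsigned arithmetic), not assumed of the compiler.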
sjolsen | 9 years ago | on: Zero Cost Abstractions
for (i, window) in buffer.sliding_window(coefficients.len()).enumerate() {
    let prediction = coefficients.iter()
        .zip(window)
        .map(|(&c, &s)| c * s as i64)
        .sum::<i64>() >> qlp_shift;
    let delta = buffer[i + coefficients.len()];
    buffer[i + coefficients.len()] = prediction as i32 + delta;
}
with sliding_window returning an iterator over slices of the buffer?
sjolsen | 9 years ago | on: The Deconstructed Standard Model Equation
Because it is concise, precise, and widely accepted.
>Given the option, most development teams would choose to read and write against verbose source code, rather than scrape obfuscated variables and method signatures out of a minified, transpiled, compressed package.
What's your point? Source code is not the same thing as mathematics, not for the majority of software and not for the majority of mathematics as practiced by mathematicians. Verbosity makes sense when your domain involves concrete entities like "customers" and "widgets" and "thermal sensors." When your domain involves abstract entities like "ring homomorphisms" and "clopen sets" and "vector spaces," it doesn't. If you're writing a database, a name like "transaction_mutex" is more descriptive than "m." If you're universally quantifying over the domain of an arbitrary continuous function on the reals, "x" is about as descriptive -- and as conventional -- as any name you can come up with.
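The contrast shows up directly in code (a sketch; the names are illustrative): descriptive names for concrete domain entities, conventional one-letter names for arbitrary mathematical objects.

```cpp
#include <functional>
#include <mutex>

// Domain code: the name tells you which concrete thing is protected.
std::mutex transaction_mutex;

// Generic math: for an arbitrary function f on [a, b], the names "f", "a",
// "b", and "x" are as descriptive as any could be. Midpoint-rule quadrature:
double integrate(const std::function<double(double)> &f, double a, double b) {
    const int n = 1000;
    const double h = (b - a) / n;
    double sum = 0.0;
    for (int i = 0; i < n; ++i)
        sum += f(a + (i + 0.5) * h);  // sample each subinterval's midpoint
    return sum * h;
}
```

Renaming `x` to `arbitrary_real_number` would add characters without adding information; renaming `transaction_mutex` to `m` would discard information.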
As an aside, I really wish we would dispense with the non-word "transpile." We have a word for translating a program from one language to another: compile.
>So why do we continue this archaic practice of obscure, inscrutable symbols in mathematics?
Because it works very well. I'm not sure what else to say. You spend maybe fifteen minutes learning about, say, the symbol "∂" when you're introduced to multivariate calculus, and then for the rest of time you have an extremely concise way of expressing a variety of combinations of partial derivatives that can be understood by anyone who has also been introduced to multivariate calculus.
Don't get me wrong, there are actual problems with mathematical notation -- overloading, "abuse of notation," and as often as not just plain omitting information -- but the use of non-ASCII symbols and short variable names are not among them.
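For instance, the conciseness buys you statements like the multivariate chain rule, immediately readable to anyone who has met ∂:

```latex
\frac{df}{dt} = \frac{\partial f}{\partial x}\,\frac{dx}{dt}
              + \frac{\partial f}{\partial y}\,\frac{dy}{dt}
\qquad \text{for } f(x(t), y(t))
```

Spelled out in words ("the derivative of f with respect to t equals the partial derivative of f with respect to x times..."), the same statement is longer and harder to check at a glance.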
sjolsen | 9 years ago | on: Show HN: Accelerating SHA256 by 100x in Golang on ARM
sjolsen | 9 years ago | on: Introducing Vulkan-Hpp – Open-Source Vulkan C++ API
People have been aware of this problem for some time. It's one of the reasons the C++ community is trying to develop a proper module system (or was the last time I checked, which was a while back).
sjolsen | 9 years ago | on: What If I Don't Actually Like My Users? (2008)
I understand pointers. I understand integer-based encoding. I'm a firmware programmer, I started with FORTRAN and C-with-classes-style C++, I get how computers work. And do you know what makes computers work? Abstractions. Integers are just an abstraction on top of bit vectors are just an abstraction on top of memory are just an abstraction on top of flip-flops, on top of gates, transistors, digital circuitry, analog circuitry, the laws of electromagnetism -- the only reason we are able to construct something as complex as a pocket calculator and have it work at all is clear, mathematical abstraction. The fact that, in 2016, using such a fundamental abstraction as equality of non-numerical values requires anything beyond the two characters "==" is patently absurd.
My "frustration" is not a result of having trouble grasping first-year C.S. concepts. Nor is it the result of the mild inconvenience of specifically having to use "strcmp" instead of "==". It's the result of working in an industry where the value of anything beyond first-year C.S. concepts is completely unrecognized, because it's not "how computers fundamentally operate." Both low-level and high-level computational abstractions are useful, and trying to accomplish anything interesting in a system which completely eschews one class of abstraction in favor of the other, whether it's C or Haskell, is hell.
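The point in miniature (using C++ to show both sides of the line; a sketch): pointer comparison is the wrong abstraction for value equality, and a language with value semantics makes the right abstraction cost exactly two characters.

```cpp
#include <cstring>
#include <string>

// C-style strings: == would compare addresses, not contents, so value
// equality has to go through strcmp.
bool c_equal(const char *a, const char *b) {
    return std::strcmp(a, b) == 0;
}

// With a value abstraction (std::string), equality is just ==.
bool cxx_equal(const std::string &a, const std::string &b) {
    return a == b;
}
```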
But hey, it beats the hell out of Javascript :)
Or where there's perfectly valid memory mapped to address 0