ndeine's comments
ndeine | 12 years ago | on: The GHC Runtime System [pdf]
ndeine | 12 years ago | on: Common Python Mistakes
>>> x = 1
>>> def a():
...     print(x)
...
>>> def b():
...     x = 2
...     print(x)
...
>>> def c():
...     print(x)
...
>>> a(); b(); c()
1
2
1
>>> x = 3
>>> a(); b(); c()
3
2
3
I think there is an argument to be made that classes are special and "reaching upwards" into the superclass scope should not occur - a unique copy should be made - but I also think that Python's way of doing it makes enough sense that it is not confusing. The Python devs are at least consistent about having their own way of doing things.
ndeine | 12 years ago | on: Rust for C++ programmers – part 5: borrowed references
1. The pointer is mutable, but its contents are immutable.
2. The pointer is immutable, but its contents are mutable.
3. The pointer is immutable and the contents are immutable.
4. The pointer is mutable and its contents are mutable.
Right now for owned pointers Rust gives us (3) and (4), but no obvious way to achieve (1) and (2). You might argue that borrowing semantics give us these powers, just not directly with owned pointers - and if we're asking for that level of control, we shouldn't be using owned pointers directly anyway; we should be lending them out in a well-controlled manner.
ndeine | 12 years ago | on: Common LibreSSL porting mistakes
But let's quote the kernel source on the matter[1], just to be clear:
> The two other interfaces are two character devices /dev/random and /dev/urandom. /dev/random is suitable for use when very high quality randomness is desired (for example, for key generation or one-time pads), as it will only return a maximum of the number of bits of randomness (as estimated by the random number generator) contained in the entropy pool.
> The /dev/urandom device does not have this limit, and will return as many bytes as are requested. As more and more random bytes are requested without giving time for the entropy pool to recharge, this will result in random numbers that are merely cryptographically strong. For many applications, however, this is acceptable.
The real point to be made here is that, yes, /dev/random is theoretically better - but for many applications, letting /dev/random hang to wait for entropy is worse than having /dev/urandom use a CSPRNG in a way that is generally recognized to be secure.
[1]: http://repo.or.cz/w/davej-history.git/blob/d0562c8dc:/driver...
Edit:
I would like to add that the original article is talking about using /dev/urandom to generate long-lived keys, not session keys or similar. In this case, the blocking is sometimes acceptable to generate appropriate entropy, since the fact that the key is long-lived implies that you don't do this very often. The argument for /dev/urandom only holds clout when you are making a tradeoff for non-blocking behavior (which is 99% of the time). As such, there is nothing wrong with being slightly paranoid and using /dev/random if you can afford the time spent collecting entropy.
ndeine | 12 years ago | on: Meet Flappy48, The Clone Game To End Clone Games
ndeine | 12 years ago | on: My Ideas, My Boss’s Property
He is saying that more people on average will respond more directly and immediately to punishment than reinforcement.
The context is that influencing behavior via punishment is a short-term tactic for organizations: in the long run, we would like to believe that reinforcement yields a net gain. Cultural momentum, however, means short-term behavior-control tactics prevail in organizations, with little heed paid to the tradeoff.
One might also argue that it is cheaper in the short term to punish than to reward, which further perpetuates the downward cycle as a staple of organizational culture.
ndeine | 12 years ago | on: Low-level is easy
I loaded up the datasheet for the M3 while typing up this reply: 384 pages. Contrast that with the TI 74181 ALU datasheet from the days of old: 18 pages, most of which are detailed diagrams of the distances between the pinouts. The logic diagrams fit on a single page. You can build a simple CPU using one of these chips in a few hours in your basement.
Hardware is only going to get more complicated. At what point does it become so complicated that no one person can reasonably understand how a computer works "under the hood", even from an abstract level?
ndeine | 12 years ago | on: 0^0
For example: I have done a lot of work on some equation that is interesting to me, and finally I have reduced it to 5+yx=52x+5. Now obviously the rules of algebra let me subtract 5 from each side and be left with yx=52x, and this subtraction also has no effect on the domains for which our variables may be defined. All is well.
But dividing out the x is what we are concerned with now. Surely y=52 is a solution to the equation - why can this not be true for all values of x?
Well, for nonzero x we have y=52 and nobody will complain. For x=0, though, solving for y is problematic. Note that if x=0, y could be 1, or 33, or any number. If there is some function f such that y=f(x), then it follows that f(x) holds a unique value y for each input of x/=0, but for x=0 we cannot know what y might be; this is what we mean by undefined. Thus we say the domain of f(x) is the set of all real numbers x, such that x is not equal to zero.
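The case analysis above can be written out (restating the paragraph, with = and ≠ in place of the ASCII notation):

```latex
5 + yx = 52x + 5 \;\implies\; yx = 52x

\text{For } x \neq 0:\quad y = \frac{52x}{x} = 52

\text{For } x = 0:\quad yx = 52x \text{ becomes } 0 = 0,
\text{ which holds for every } y;\;
y \text{ is not determined, i.e.\ undefined.}
```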
If you have been told otherwise, or have even gotten away with doing algebra or calculus under the assumption that the domain of our function f may include zero, you are taking a mathematical shortcut rather than performing formal analysis. It is neither calculus nor algebra that is broken by saying f is undefined at x=0, but rather your (albeit practically useful) misconception of these systems.
I'll finish with some formal rules of algebra, to hammer this in:
- [P6] Existence of a multiplicative identity: a * 1 = 1 * a = a ; 1 /= 0.
- [P7] Existence of multiplicative inverses: a * a^(-1) = a^(-1) * a = 1, for a /= 0.
These are taken from page 9 of Spivak's Calculus, 3rd edition. He goes on to build the foundations of all of calculus from rules like these. Surely he would not present this as a fundamental axiom of his system, only to immediately (and silently) reject it and build a flawed calculus instead!
Indeed, on pg. 41, when defining functions, Spivak later writes (emphasis his):
> It is usually understood that a definition such as "k(x) = (1/x) + 1/(x-1), x /= 0, 1" can be shortened to "k(x) = (1/x) + 1/(x-1)"; in other words, unless the domain is explicitly restricted further, it is understood to consist of all numbers for which the definition makes any sense at all.
ndeine | 12 years ago | on: 0^0
In one of the below posts we have the suggestion
> But what if you're in a context where you're not reasoning about continuous functions at all? Why would you have to be subject to reasoning that doesn't apply to your situation?
In that case you could do the above, if you have to concern yourself with, e.g., a domain of real numbers arbitrarily close to the undefined location. Alternatively, you could just define our original function f only for real numbers greater than 0, in which case you escape the need to redefine functions to make them easier to work with.
ndeine | 12 years ago | on: Introductions to advanced Haskell topics
ndeine | 12 years ago | on: Microsoft makes source code for MS-DOS and Word for Windows available to public
ndeine | 12 years ago | on: Show HN: Trontium Reactor, the first USB Power Delivery battery
I've had a lot of experience with Li-Ion batteries and more advanced chemistries, and I have to say I'd be scared to try rigging up something that can handle the kind of amps the Reactor is promising.
The only lithium-ion-like batteries I know of that can handle any sort of reasonable amp draws (100W at 5-20V = 5-20A currents) are e.g. really modern LiMn2O4 (hybrid? I'm not sure) cells like [1], not standard LiFePO4 or whatever. So if they happen to have really nice high-draw cells in them, with protection circuitry added to each and all the stuff to make it safe, then they're definitely in the right value range. Keeping batteries from failing dramatically and exploding when subject to abuse, electrical or otherwise, is not trivial.
[1]: http://illuminationsupply.com/batteries-c-48_50/18650-sony-u...
ndeine | 12 years ago | on: Could keybase.io do for crypto what GitHub did for Git?
The other target userbase is people who want to verify GPG keys that don't have a web of trust. This is useful for, say, me - I'm not part of the Debian dev team or anything, so I don't have many people around who can sign my GPG key. It's nice to be able to publish a link to Keybase on my website and have people be able to be pretty sure it's me.
ndeine | 12 years ago | on: Git: how to use stash
> [...], apply the changes (patch) from the file and continue the work. While it is not something difficult, it can be done much easier with Git.
Since he's saying "it can be done much easier with Git", the implication is that he wasn't using Git earlier, and the workflow with `git stash` is far cleaner than that diff/patch workflow.
At least, I think that's the case. I really can't imagine someone doing that instead of just using Git and `git stash` in the first place.
ndeine | 12 years ago | on: Curse Of The Gifted (2000)
When the AP test rolled around, I was caught off guard. The problems were about the flow rate of water descending into a tank and what happens when you put a drain here with this rate of flow, etc. - completely different from our earlier problems, which could be solved by rote memorization. And from my perspective - that of a student who knew a bunch of calculus "rules" but lacked understanding of how calculus really worked - this was incredibly difficult.
I got a 3, which is passing, but opted to take beginner-level engineer's calculus when I got to college anyway. Even if you "learn" calculus in high school, it does not teach the level of mathematical maturity required to understand the higher-level manipulations. That kind of thinking is the foundation for any good engineer, and I have no regrets about retaking the course.