gitgudnubs's comments

gitgudnubs | 5 years ago | on: Paxos vs. Raft: Have we reached consensus on distributed consensus?

The part I found non-obvious is how to snapshot the unbounded vectors of immutable registers, which is a requirement for any real system. But I never gave it a whole lot of thought either.

To snapshot, I either need to find an epoch such that no older epoch could ever have relevant information for the current node, or I need the nodes to agree on a snapshot. Both seem complicated.
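To make the second option concrete, here's a toy sketch (Python; the names `snapshot_floor`, `compact`, and the watermark scheme are entirely my own illustration, not from any real Paxos implementation) of compacting once nodes agree no older epoch is still needed:

```python
# Hypothetical sketch: compact the unbounded vector of immutable registers
# by discarding epochs that no node can still need. Each node reports a
# "low watermark": the oldest epoch it might still read from.

def snapshot_floor(low_watermarks):
    """Registers below the minimum reported watermark are safe to compact."""
    return min(low_watermarks)

def compact(registers, low_watermarks):
    """registers: dict mapping epoch -> value. Fold everything below the
    floor into a single snapshot value and drop the old epochs."""
    floor = snapshot_floor(low_watermarks)
    old = {e: v for e, v in registers.items() if e < floor}
    kept = {e: v for e, v in registers.items() if e >= floor}
    # Registers are immutable once written, so the newest compacted epoch
    # supersedes the earlier ones and stands in as the snapshot.
    snapshot = old[max(old)] if old else None
    return snapshot, kept
```

The hard part the sketch glosses over is exactly the part I called complicated: getting every node to report (and agree on) a truthful watermark in the first place.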

gitgudnubs | 5 years ago | on: Wi-Fi over Coax

Only an idiot would design a device to assume no packet loss, unless they were _also_ allowed to design the network. If you buy such a device and put it on a janky power network, then you have only yourself to blame.

Standards decisions should never be made to accommodate designers that want to assume there's effectively no packet loss.

Some degree of error correction is totally reasonable, but your first point is irrelevant to the discussion.

gitgudnubs | 5 years ago | on: Wi-Fi over Coax

I can't believe you had to pay a fine to comply with the government's own laws. No reasonable person would expect an operation to make no mistakes here.

How big were the fines? Were they ever challenged in court?

gitgudnubs | 6 years ago | on: Senate Stock Watcher

He's suggesting we pay them better to make up for depriving them of vehicles private citizens can use. It's not unreasonable to strip them of those vehicles to prevent insider trading. It's not unreasonable to compensate them for losing those vehicles.

gitgudnubs | 6 years ago | on: Covid-19 twice as contagious as previously thought – CDC study

Health systems deal with scarcity on a daily basis. There's no room for the emotional hand-wringing you're describing. Moving scarce resources from diagnosing individual patients to studying the population during a pandemic will be one event in a causal chain that results in deaths. It is also what keeps the outbreak from spreading beyond control.

Several orders of magnitude more people have been killed by uninformed policy than would have been killed by redirecting a portion of tests. What kind of MONSTER chooses for so many more people to die?!?

gitgudnubs | 6 years ago | on: Covid-19 twice as contagious as previously thought – CDC study

Do you know what's even worse for the hospital's ability to treat patients? If the city continues to operate normally until the number of cases is so large that it's obvious the outbreak is unmanageable even without testing.

Randomized tests would have told policy makers exactly how fast this spreads, even in Western cities. Lockdowns would have happened earlier. The total number of infected would have been 1-2 orders of magnitude lower. Hospitals would have been better off.

>Would you want one of your loved ones in that position?

I wouldn't want one of my loved ones to die because my civilization was so short-sighted that it let a disease run rampant. Your appeal to emotion is garbage, and it doesn't even make sense, since all of our chances would be better if we'd known what was going on.

gitgudnubs | 6 years ago | on: Amazon fires worker who led strike over virus

Fallacious. A headache is evidence of a brain tumor, but there's not a 51% chance you have a brain tumor. You've satisfied some necessary conditions for retaliatory action, but haven't converted that into a probability.

You have a reasonable indication, but no preponderance of evidence. You probably have enough for discovery.
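The headache/tumor point is just base rates. A toy Bayes calculation (all the probabilities below are made up for illustration) shows evidence raising a probability without getting anywhere near 51%:

```python
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' rule: P(H | E) from the prior and the two likelihoods."""
    p_e = p_evidence_given_h * prior + p_evidence_given_not_h * (1 - prior)
    return p_evidence_given_h * prior / p_e

# Illustrative numbers: tumors are rare (0.1%), most tumors cause
# headaches (90%), but headaches are common regardless (10%).
p = posterior(0.001, 0.9, 0.1)
# The headache is evidence (posterior > prior), yet the posterior is
# still under 1%, nowhere near a preponderance.
```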

gitgudnubs | 6 years ago | on: Amazon fires worker who led strike over virus

Preponderance of evidence is the bar that must be met. But the plaintiff must provide the evidence to the courts. The discovery process makes some of the defendant's records available to the plaintiff, in case there is relevant evidence.

But if the plaintiff produces no evidence, Amazon does not need to make a defense. Thus OP is correct.

gitgudnubs | 6 years ago | on: CDC coronavirus testing decision likely to haunt nation for months to come

Brick. Wall.

>Re-read the last line of my original post.

Still wrong. Reread the second to last line of your original post.

>There's a reason why screening tests need to be highly accurate.

They don't. That's the point. It doesn't need to be 99.9% accurate unless the disease is so aggressive that you're all dead anyway.

That's the contention. The goal is exponential decay, and the only scalable method is reducing the reproduction rate. Even if you don't drop below 1, a smaller base still grows exponentially slower.

gitgudnubs | 6 years ago | on: CDC coronavirus testing decision likely to haunt nation for months to come

It's like talking to a brick wall. Medical technology is approximately monotonically increasing, so it's absurd to think the test we cobble together in the first month won't be improved upon. The CDC FUCKED UP. I expect careers to be finished. I want a criminal investigation, just in case of corruption, and I hope some people lose their license.

But even in the current environment, testing is important. It guides public policy, and it helps doctors triage patients in serious condition. It still helps to limit the spread, and will do so even more when testing overtakes the infection rate (aided by lockdowns). If testing improves, then it could cut weeks off a global lockdown, because the long tail will die that much faster.

Not everyone is a neckbeard with a programming job who can live in a basement for two weeks without a single human contact, so anything that helps extinguish this disease helps. Now if you'll excuse me, I should go shave.

gitgudnubs | 6 years ago | on: CDC coronavirus testing decision likely to haunt nation for months to come

My favorite game.

Are you aware that the test can be improved by training medical personnel on sampling technique?

Are you aware that tests will, as a general rule, improve?

Are you aware that the majority of people understand the difference between 70% and 100% accurate?

Are you aware that public policy was affected by poor data on the spread?

Are you aware that it's far easier to successfully test [nearly] everyone in the chain when the number of cases is small, like it would have been when this test was initially available?

The _only_ way this can be contained is through massive lockdowns. If tests were performed _as soon as possible_, then there's a good chance we could have done a good job containing it through contact tracing. Even if that weren't the case, it would have given our medical system several extra weeks to prepare for the case load. It would have given our politicians better data to enact policies.

gitgudnubs | 6 years ago | on: CDC coronavirus testing decision likely to haunt nation for months to come

If the reproduction rate is 10 and you catch 70% of cases, then the effective reproduction rate will be closer to 3. Since even the highest estimates of the basic reproduction number are only about 4, a test that catches 75% of cases means an effective reproduction rate closer to 1.

Are you aware of the difference between 4^x and 1^x?
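The arithmetic above, as a sketch (assuming, as the argument does, that detected cases are isolated and stop transmitting entirely):

```python
def effective_r(r0, detection_rate):
    """If a fraction `detection_rate` of cases are caught and isolated,
    only the undetected fraction keeps transmitting."""
    return r0 * (1 - detection_rate)

# R0 = 10 with 70% detection -> effective R around 3.
# R0 = 4 with 75% detection -> effective R around 1.
```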

gitgudnubs | 6 years ago | on: Secure by Design

Creating perfect types for every single input is also an anti-pattern. Most of our needs fall somewhere between "works pretty well" and "formally verified". Riding herd on the type system to prove ever more invariants about your system is usually a waste of time.

It's reasonable to put a little more effort into it along API lines. But there's a reason that the compiler doesn't make it easy to define an integer type that can hold values between -7 and 923091.
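For illustration, hand-rolling that bounded integer (Python here is my choice; the thread doesn't name a language, and `BoundedInt` is a made-up type) shows the ergonomic cost: every operation has to round-trip through the invariant check.

```python
class BoundedInt:
    """An integer constrained to [-7, 923091]. Construction re-checks the
    invariant, so arithmetic can fail at runtime."""
    LO, HI = -7, 923091

    def __init__(self, value):
        if not (self.LO <= value <= self.HI):
            raise ValueError(f"{value} outside [{self.LO}, {self.HI}]")
        self.value = value

    def __add__(self, other):
        # Even trivial arithmetic can throw. This is exactly the tedium
        # the compiler isn't hiding for you.
        return BoundedInt(self.value + other.value)
```

And it's still weaker than a real range type: the check happens at runtime, not in the type system.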

gitgudnubs | 6 years ago | on: Secure by Design

If you choose an arbitrary limit orders of magnitude above normal use, then you probably don't have any protection. Most systems are scaled to reasonable use, so an additional 1000x load in a dimension could bowl over the system.

Even defining "normal use" is intractable. For instance, most Docker layers are a few MB, but some people deploy third-party software packaged as a container with 10 GB in a single layer. You can't fix their container. They can't fix their container. Your definition of reasonable changes, and you bump your maximum to 1 TB. Then someone tries to deploy Docker containers that run VMs, which have 1.5 TB images, to interface with legacy systems that are infeasibly difficult to improve. But the VHD is a single file, so now you have a single-layer maximum size of 1.5 TB. And since a 10 GB body size is a possible attack vector in and of itself, what's the security benefit of having any maximum size limit at this point?

It's the wrong approach. Instead, your system should gracefully handle objects of arbitrary size, with security coming from cryptographically enforced access controls and quotas.
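A minimal sketch of what I mean (the function and the quota scheme are illustrative, not any real registry's API): stream the object in chunks and charge each chunk against the caller's quota, so there's no arbitrary global size cap at all.

```python
def store_stream(chunks, quota_remaining, chunk_handler):
    """Consume an iterable of byte chunks, failing as soon as the
    caller's quota is exhausted rather than at a fixed size limit."""
    used = 0
    for chunk in chunks:
        used += len(chunk)
        if used > quota_remaining:
            # Reject based on *this principal's* allowance, not a
            # one-size-fits-all maximum.
            raise PermissionError("quota exhausted")
        chunk_handler(chunk)
    return used
```

The 1.5 TB VM image and the few-MB layer go through the same code path; only the quotas differ.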

gitgudnubs | 6 years ago | on: New flaw in Intel chips lets attackers slip their own data into secure enclave

There will be more hardware exploits. New classes of attacks using physics and side channels are circumventing the formal models used to build CPUs.

CPU designers have made complex architectural decisions to speed up execution. In the case of Spectre, it's speculative execution for single-threaded performance; in this case, it's the machinery meant to optimize security features. An analogous case is AES, which was chosen in part because it's fast. But fast software implementations index S-box lookup tables with key-dependent data, so caching introduces a side channel: from the time to execute, you can infer the key.

gitgudnubs | 6 years ago | on: New flaw in Intel chips lets attackers slip their own data into secure enclave

Almost, but plenty of computer architects speculated about vulnerabilities in speculative execution. It just seemed infeasible. The attack can differ based on the particular architecture (cache hierarchy, associativity, latency per instruction, buffer sizes...), clock speed, microcode, workloads, temperature, and a thousand other variables.

Spectre was impressive less as a new idea than as a brilliant execution of an idea every architect eventually had. Rowhammer was similar. Everyone knew it was possible to get boned by physics, but it can happen at an arbitrary place in an arbitrary way that isn't captured by any model. Rowhammer wasn't impressive because of the idea, but because it was a simple, obvious-in-retrospect way to exploit physics to bypass the models.
