gitgudnubs's comments
gitgudnubs | 5 years ago | on: Paxos vs. Raft: Have we reached consensus on distributed consensus?
To snapshot, I either need to find an epoch such that no older epoch could ever have relevant information for the current node, or I need the nodes to agree on a snapshot. Both seem complicated.
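The first condition can be stated as a tiny sketch (everything here is made up for illustration): a snapshot epoch is safe only if every node has durably applied past it, so the safe bound is the minimum acknowledged epoch across the cluster — and agreeing on that minimum is exactly the coordination problem described above.

```python
def safe_snapshot_epoch(acked_epochs):
    """Highest epoch that can be snapshotted safely.

    A node can discard state up to epoch e only if no peer might
    still need information from an older epoch, i.e. e <= min of
    the epochs every node has durably applied. Sketch only: a real
    system must also persist the agreement itself, which is the
    complication described above.
    """
    return min(acked_epochs)

# Three nodes have durably applied up to these epochs:
print(safe_snapshot_epoch([42, 57, 40]))  # 40: anything <= 40 is safe
```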
gitgudnubs | 5 years ago | on: Wi-Fi over Coax
Standards decisions should never be made to accommodate designers who want to assume there's effectively no packet loss.
Some degree of error correction is totally reasonable, but your first point is irrelevant to the discussion.
gitgudnubs | 5 years ago | on: Wi-Fi over Coax
How big were the fines? Were they ever challenged in court?
gitgudnubs | 6 years ago | on: Why are Soviet math textbooks so hardcore in comparison to US textbooks? (2017)
gitgudnubs | 6 years ago | on: Senate Stock Watcher
gitgudnubs | 6 years ago | on: Covid-19 twice as contagious as previously thought – CDC study
Several orders of magnitude more people have been killed by uninformed policy than would have been killed by redirecting a portion of tests. What kind of MONSTER chooses for so many more people to die?!?
gitgudnubs | 6 years ago | on: Covid-19 twice as contagious as previously thought – CDC study
Randomized tests would have told policy makers exactly how fast this spreads, even in Western cities. Lockdowns would have happened earlier. The total number of infected would be 1-2 orders of magnitude lower. Hospitals would have been better off.
>Would you want one of your loved ones in that position?
I wouldn't want one of my loved ones to die because my civilization was so short-sighted that it let a disease run rampant. Your appeal to emotion is garbage, and it doesn't even make sense, since all of our chances would be better if we'd known what was going on.
gitgudnubs | 6 years ago | on: Covid-19 twice as contagious as previously thought – CDC study
If policy had been informed, the number of people saved would have outweighed the handful who died due to missing out on a test.
gitgudnubs | 6 years ago | on: Amazon fires worker who led strike over virus
You have a reasonable indication, but no preponderance of evidence. You probably have enough for discovery.
gitgudnubs | 6 years ago | on: Amazon fires worker who led strike over virus
But if the plaintiff produces no evidence, Amazon does not need to make a defense. Thus OP is correct.
gitgudnubs | 6 years ago | on: CDC coronavirus testing decision likely to haunt nation for months to come
>Re-read the last line of my original post.
Still wrong. Reread the second to last line of your original post.
>There's a reason why screening tests need to be highly accurate.
They don't. That's the point. It doesn't need to be 99.9% accurate unless the disease is so aggressive that you're all dead anyway.
That's the point of contention. The goal is exponential decay, and the only scalable method is to reduce the reproduction rate. Even if you don't drop it below 1, a smaller base grows exponentially more slowly.
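The arithmetic behind that claim, under a deliberately simple discrete-generation model (the rates and generation counts are illustrative, not epidemiological estimates):

```python
# Simple discrete generation model: each generation multiplies
# case counts by the reproduction rate R. Even when R stays above 1,
# a smaller R yields dramatically fewer cases after a few generations.

def cases_after(r: float, generations: int, initial: int = 1) -> float:
    """Cases in the final generation under reproduction rate r."""
    return initial * r ** generations

high = cases_after(r=4.0, generations=10)  # uncontrolled spread
low = cases_after(r=1.5, generations=10)   # reduced, but still > 1

print(f"R=4.0 after 10 generations: {high:.0f}")   # 1048576
print(f"R=1.5 after 10 generations: {low:.0f}")    # 58
print(f"ratio: {high / low:.0f}x")
```

Both curves are still exponential, but the smaller base buys orders of magnitude of breathing room.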
gitgudnubs | 6 years ago | on: CDC coronavirus testing decision likely to haunt nation for months to come
But even in the current environment, testing is important. It guides public policy, and it helps doctors triage patients in serious condition. It still helps to limit the spread, and will do so even more when testing overtakes the infection rate (aided by lockdowns). If testing improves, then it could cut weeks off a global lockdown, because the long tail will die that much faster.
Not everyone is a neckbeard with a programming job who can live in a basement for two weeks without a single human contact, so anything that helps extinguish this disease helps. Now if you'll excuse me, I should go shave.
gitgudnubs | 6 years ago | on: CDC coronavirus testing decision likely to haunt nation for months to come
Are you aware that the test can be improved by training medical personnel on sampling technique?
Are you aware that tests will, as a general rule, improve?
Are you aware that the majority of people understand the difference between 70% and 100% accurate?
Are you aware that public policy was affected by poor data on the spread?
Are you aware that it's far easier to successfully test [nearly] everyone in the chain when the number of cases is small, like it would have been when this test was initially available?
The _only_ way this can be contained is through massive lockdowns. If tests were performed _as soon as possible_, then there's a good chance we could have done a good job containing it through contact tracing. Even if that weren't the case, it would have given our medical system several extra weeks to prepare for the case load. It would have given our politicians better data to enact policies.
gitgudnubs | 6 years ago | on: CDC coronavirus testing decision likely to haunt nation for months to come
Are you aware of the difference between 4^x and 1^x?
gitgudnubs | 6 years ago | on: Secure by Design
It's reasonable to put a little more effort into validation along API boundaries. But there's a reason the compiler doesn't make it easy to define an integer type that can only hold values between -7 and 923091.
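Such a range-restricted type can of course be built by hand; this sketch (class name and bounds are illustrative, taken from the example above) shows why it's rarely worth building in: every operation needs its own bounds check, and the bounds are arbitrary.

```python
class RangedInt:
    """An integer constrained to an arbitrary inclusive range.

    Illustrative only: the bounds -7 and 923091 come from the
    example in the comment above. Note that every arithmetic
    operation the type supports must repeat the bounds check.
    """
    LO, HI = -7, 923091

    def __init__(self, value: int):
        if not (self.LO <= value <= self.HI):
            raise ValueError(f"{value} outside [{self.LO}, {self.HI}]")
        self.value = value

    def __add__(self, other: "RangedInt") -> "RangedInt":
        # The check happens again inside the constructor.
        return RangedInt(self.value + other.value)

a = RangedInt(5)
b = RangedInt(-7)
print((a + b).value)  # -2
```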
gitgudnubs | 6 years ago | on: Secure by Design
Even defining "normal use" is intractable. For instance, most docker layers are a few MB, but some people are deploying 3P software packaged as a container with 10 GB in a single layer. You can't fix their container. They can't fix their container. Your definition of reasonable changes, and you bump your maximum to 1 TB. Then someone is trying to deploy docker containers that run VMs with 1.5 TB images, to interface with legacy systems that are infeasibly difficult to improve. But the VHD is a single file, so now you have a single-layer maximum size of 1.5 TB. And since a 10 GB body size is a possible attack vector in and of itself, what's the security benefit of having any maximum size limit at this point?
It's the wrong approach. Instead, your system should gracefully handle objects of arbitrary size. Security should be enforced by cryptographically enforced access controls and quotas.
gitgudnubs | 6 years ago | on: US admits 'failing' on testing, says Fauci
gitgudnubs | 6 years ago | on: New flaw in Intel chips lets attackers slip their own data into secure enclave
CPU designers have made complex architectural decisions to speed up execution. In the case of Spectre, it's speculative execution for single-threaded performance. In this case, it's the design of the security features themselves. An analogous case is AES: Rijndael was chosen partly because it's fast, but fast software implementations use S-box lookup tables indexed by key-dependent values, so which parts of the table get cached depends on the key. That introduces a side channel: from timing measurements, you can infer the private key.
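A heavily simplified sketch of that cache side channel (the table below is a placeholder, not the real AES S-box, and real attacks measure timing rather than reading the index directly): in the first AES round, each table lookup is indexed by plaintext XOR key, so which cache line gets touched depends directly on a key byte.

```python
# Simplified model of the cache side channel in table-based AES.
# In the first round, lookups are indexed by (plaintext ^ key), so
# the cache line touched leaks the high bits of the key byte to an
# attacker who knows the plaintext. SBOX here is a stand-in table,
# NOT the real AES S-box.

SBOX = list(range(256))  # placeholder 256-entry lookup table

def first_round_lookup(plaintext_byte: int, key_byte: int) -> int:
    index = plaintext_byte ^ key_byte  # key-dependent index
    return SBOX[index]                 # key-dependent memory access

def cache_line(index: int, line_size: int = 64) -> int:
    # With 64-byte lines and 1-byte entries, the top bits of the
    # index select the cache line.
    return index // line_size

# Attacker knows the plaintext byte and observes the line touched:
pt, key = 0xAB, 0x3C
print(cache_line(pt ^ key))  # leaks the top bits of (pt ^ key)
```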
gitgudnubs | 6 years ago | on: New flaw in Intel chips lets attackers slip their own data into secure enclave
Spectre was impressive less as a new idea than as an execution: it was a brilliant realization of an idea every architect eventually has. Rowhammer was similar. Everyone knew it was possible to get boned by physics, in arbitrary places and arbitrary ways not captured by any model. Rowhammer wasn't impressive because of the idea, but because it was a simple, obvious-in-retrospect way to exploit physics to bypass the models.
I've implemented the consensus part of Paxos in 10 minutes. But it's a toy. It's totally useless without the other stuff.
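A toy of roughly that scope, for scale (names and structure are illustrative): a single-decree Paxos acceptor plus a proposer running both phases, minus everything that makes it useful in practice — persistence, networking, leader election, and the multi-decree log.

```python
class Acceptor:
    """Toy single-decree Paxos acceptor: the 'consensus part' only."""

    def __init__(self):
        self.promised = -1       # highest proposal number promised
        self.accepted_n = -1     # proposal number of accepted value
        self.accepted_v = None   # accepted value, if any

    def prepare(self, n):
        """Phase 1b: promise to ignore proposals numbered below n."""
        if n > self.promised:
            self.promised = n
            return True, self.accepted_n, self.accepted_v
        return False, None, None

    def accept(self, n, v):
        """Phase 2b: accept (n, v) unless a higher promise exists."""
        if n >= self.promised:
            self.promised = n
            self.accepted_n, self.accepted_v = n, v
            return True
        return False


def propose(acceptors, n, v):
    """Toy proposer: run both phases against a majority quorum."""
    # Phase 1: gather promises; adopt the highest previously
    # accepted value, which is what makes the chosen value stable.
    responses = [a.prepare(n) for a in acceptors]
    granted = [(an, av) for ok, an, av in responses if ok]
    if len(granted) <= len(acceptors) // 2:
        return None
    prior_n, prior_v = max(granted, key=lambda t: t[0])
    if prior_n >= 0:
        v = prior_v
    # Phase 2: ask the quorum to accept.
    acks = sum(a.accept(n, v) for a in acceptors)
    return v if acks > len(acceptors) // 2 else None


acceptors = [Acceptor() for _ in range(3)]
print(propose(acceptors, n=1, v="x"))  # "x" is chosen
print(propose(acceptors, n=2, v="y"))  # still "x": the choice sticks
```

Even at this size, the toy shows the point of the comment: the synod itself is small, and everything useful lives in the "other stuff."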