bumper_crop's comments

bumper_crop | 3 years ago | on: CNIL makes Google Analytics almost illegal in France

This is great news! For far too long, website owners have been collecting data on their users at no benefit to the users themselves. When website owners collect data on their users (for any and all reasons), it just violates the privacy of those people and needs to be put to an end. Those French website runners should really build their own CNIL- and GDPR-compliant anonymized data storage, rather than using off-the-shelf, low-cost alternatives. After all, things have been a bit too easy for them. (Running a website is pretty easy, I would know!) In fact, the fact that other, compliant data aggregators offer fewer features and lower reliability is actually a good thing. Trying to improve your website, or even pester me with whatever you made, is just irritating spam; I can't believe those independent owners would even dare. They should just be flushed out of existence.

HEY! Why is everything being centralized to just a few services? Why is the web dying?!

bumper_crop | 3 years ago | on: A brief history of nobody wants to work anymore

Do you talk with regular people, outside of the tech bubble often? If money were really the problem, Walmart employees would jump at an opportunity to work at Costco or any of the other higher paying places. OP is claiming that because salaries aren't high enough, people are deciding to stay home and watch Netflix.

bumper_crop | 3 years ago | on: A brief history of nobody wants to work anymore

This is a lazy explanation that's easy to agree with if you don't think very hard about it. Two reasons:

1. A shitty job at $15 an hour is equally unappealing at $16 an hour. Someone who doesn't want the crap hours, crap boss, and crap customer interactions isn't going to change their mind over an extra buck an hour. The money isn't the problem, it's that companies make it _not rewarding_ to work.

2. Raising pay ratchets up inflation; it isn't a response to it. Cause and effect are reversed. Unlike food, gas, houses, movie tickets, and restaurants, the amount of money transacted in wages/salary cannot go down. McDonald's can't tell its employees that, due to supply and demand, they are only going to make $14.83 an hour this week. Imagine if your company lowered wages due to the "expected recession"; you would start looking for a new role. This means pay increases can never come back down after a temporary bump. I won't say it's the cause, but it artificially limits deflation from ever pulling things back. Like an elevator that can only go up.

bumper_crop | 3 years ago | on: Learning Go as a Python Developer: The Good and the Bad

> I hate how it forces to you create a module everytime.

This affected me recently, so I have sympathy for the author. Trying to upgrade an older project of mine to the module system meant figuring out how to import modules which don't have reachable URLs and were only on the GOPATH. At some point programming in Go stopped being fun.
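For anyone else stuck mid-migration: one workaround is a `replace` directive in go.mod that points the unreachable import path at a local copy of the source. (The module path and directory below are made up for illustration.)

```
module example.com/legacyapp

go 1.17

require old.internal/widgets v0.0.0

replace old.internal/widgets => ./third_party/widgets
```

The local directory needs its own go.mod (run `go mod init old.internal/widgets` inside it); after that, the main module resolves the import locally and never tries to fetch the dead URL.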

bumper_crop | 3 years ago | on: Ask HN: Where and how do you find your early adoptors?

Step 1: have a lot of friends. You can usually spend a little political capital to get your first few users/customers by talking to them directly. For products that are targeted towards businesses, you'll need to call up your buddies from previous jobs and talk to them about it. After that, you'll probably need to hire a BDR or an AE to start looking for the next few. For B2C, I am not familiar.

bumper_crop | 3 years ago | on: History of lossless data compression algorithms (2014)

Very timely; I found out yesterday that the UPX program (EXE-specific compressor from many years ago) was made by the same guys who made LZO. I had this realization that there is a lineage among the people who write compression tools. In a similar vein, the Rob Pike post from yesterday mentioned Ken Thompson had made a custom audio compression for the Plan 9 release. He also co-created the UTF-8 encoding. I love seeing how these people keep refining the state of the art.

bumper_crop | 3 years ago | on: JVM Anatomy Quark #10: String.intern (2019)

You'll need to go back earlier than 15 years. ConcurrentHashMap and friends were added in Java 1.5, but String.intern has been there since the beginning. Since Java shipped with threads in the standard library (but no memory model), it would have been very difficult to do concurrent String deduplication yourself. If you agree string deduplication is needed, then String.intern() was a good implementation for a long while.

String.intern() also offers some other benefits for certain use cases. Earlier versions of Java did not cache the String hash code, which meant that using Strings as hash table keys meant hashing a lot more. But an interned String can be used as a key in an IdentityHashMap, which was faster for a long portion of Java's early life.

(I worked on a moderately popular Java library that targeted Java 1.5 as the minimum version. It does occasionally come in useful, but only in specific, and increasingly rare, circumstances.)

bumper_crop | 3 years ago | on: Apple's feedback mechanism is broken

Independent of Apple, I think we need an industry-wide way of saying "I'm not an idiot, this bug report is real". I've been on both sides (in a moderately used OSS project). The main problem is that whoever triages reports doesn't have a good way of filtering the signal from the noise. As a result, the likes of Apple (and the other FAANGs) implement these aggregate-and-discard black holes for bug reports. "Only a 0.1% increase in crashes? Ship it" is how the story goes, sadly.

bumper_crop | 3 years ago | on: Data Race Patterns in Go

Make sure to read between the lines. It only looks like a busy loop. Remember, the OS can pause and preempt your thread at any time. This is a real and likely event.

bumper_crop | 3 years ago | on: CockroachDB's Consistency Model

One of the things that made linearizability click for me was thinking in terms of happens-before:

    volatile int a;
    a = 1;
    print(a) // Could this print zero?

Even assuming only one program, one thread, one process, no interleaving and nothing fishy, could the final line print 0? Under serializable consistency, the answer is yes. Linearizability is framed in terms of clocks, but really that's just trying to establish that one thing happened before another. The "clock" in this example is the line number.

bumper_crop | 3 years ago | on: Data Race Patterns in Go

https://go.dev/play/p/xolc9oPwA0C

Interface values don't have a single concrete type, which means we can't have an atomic.Value that stores a Shape. atomic.Value would be much easier to reason about if it had store semantics similar to a regular `var foo Shape = ...`. One of the other comment threads talked about generics helping with this, so maybe there is hope.

bumper_crop | 3 years ago | on: Data Race Patterns in Go

That's why I opened with "Look at the implementation". Go is unable to store the type word and the pointer word at the same time, so it warps what "atomic" means. Pretty much every other language has atomic mean "one of these will win, one will lose". Go says "one will win, one will panic and destroy the goroutine".

In fact, it's even worse than that. If the Store() caller goes to sleep between setting the type and storing the pointer, it causes every Goroutine that calls Load() to block. They can't make forward progress if the store caller hangs.

bumper_crop | 3 years ago | on: Data Race Patterns in Go

Sorry to say, but these hit close to home for me. A lot of the synchronization paradigms in Go are easy to misuse, while leading the author into thinking everything is okay. The WaitGroup one is particularly poignant for me, since the race detector doesn't catch it.

I'll add one other data race goof: atomic.Value. Look at the implementation. Unlike pretty much every other language I've seen, atomic.Value isn't really atomic, since the concrete type can't ever change after being set. This stems from the fact that interface values are two words rather than one, and they can't be set atomically in hardware. To fix it, Go just documents "hey, don't do that", and then panics if you do.

bumper_crop | 3 years ago | on: Always Own Your Platform (2019)

From 2006 to 2014 I owned my own platform. Several, actually. But I shut them down after they became a lot of work to maintain. At the time, it wasn't so obvious the web was dying, but in hindsight I probably helped kill it.

In the beginning the web was so new, and growing so fast, with new things, amazing sites, and more people getting online. Like all things in life, competition arose, and better sites started taking much more of the market share. People's expectations for what a website could offer rose tremendously, and they would abandon a site if it wasn't up to snuff. Sites needed to have ever-improving visuals, better content, better people, better interactivity, better everything.

And I couldn't keep up with it. Users went from being happy to try out something new to dismissive and bitter. More and more, it felt like work to keep them happy, to keep building more and better things. And that's exactly what happened: the Internet became work. It's why we all have to be paid to come to work and build the Internet. No one does it for free, because it's a thankless, grueling job. The only websites that survived were the ones that made money and could afford to use that money to hire people. Google, Facebook, Myspace, StumbleUpon, 9gag, and even Something Awful became money-oriented rather than community-oriented. They had to, or else.

The advice to "Always Own Your Own Platform" is a euphemistic way of saying: make a whole company out of your site and underpay the only employee (you) forever. The reason we don't own our own platforms anymore is that it's soooo annoying to do so. It wasn't an accident.

bumper_crop | 3 years ago | on: What’s the best lossless image format?

The best lossless format is the one you can decode back to the original image. When evaluating them, there is an implicit assumption that the decode step will happen a short time later, but that's not always true. Will there be JPEG XL decoders commonly available in 40 years? Will the code even compile then? As a thought experiment, try to find the lossless encodings from 40 years ago and see if they can be faithfully reproduced. (or even 20 years ago, remember JPEG2000?)

Framing "best" in terms of file size or encoding speed is very use-specific, and not ideal for preservation.
