top | item 36115285

vctrnk | 2 years ago

(bit of a rant here, you've been warned)

As someone who could have built an IT/programming career but didn't, because I felt things were already bloating back in the '00s, I agree with the majority: "harvesting your own food" can be rewarding, but it is also a tedious and thankless job. It's certainly not for everyone, but if it works for some people then it is (let's put efficiency aside for a moment) perfectly valid. In fact, being more of a H/W guy, I find myself gravitating toward this approach more often than not. Leanness and reproducibility are key for my workflow (I went the RF-world path); I can't afford different end results when a dependency changes or breaks something.

IMHO, keeping up with the modern paradigms for S/W development looks like a never-ending nightmare. Yes it's the modern way, yeah it's the state of the art. Still, I didn't feel it was a wise investment of my time to learn all those "modern dev" ropes, and I still feel that 20 years later. I'm nowhere near antiquated and I'm on top of all things tech (wouldn't read HN otherwise), it's just...

I see former friends/classmates who went this way, and they're in a constant cat-and-mouse game: 50% of their time learning or setting up something dev-chain related, the other 50% doing actual work, and 98% of it feeling way too stressed. I see modern Android devices with their multi-MB apps, bloated to hell and beyond for a simple UI that takes ages to open on multi-core, multi-GHz SoCs. I see people advocating that unused RAM is wasted RAM, never satisfied until every byte is put to good use, reluctant to admit that said good use is just leaving the machine there "ready" to do something, while not actually doing anything _productive_.

And yet.

Without that bloat, without the convenience of pre-made libraries and assist tools for almost every function one could desire, we wouldn't be where we are now. Imagine for a moment doing AI work, 3D movie rendering, data science, etc. with a DB-less approach on single-core machines, with every resource micro-managed to eke out the most performance. It's simply not feasible; we would still be in the '90s... just a bit more hipster.

This article resonates so well with me. And at the same time, it feels so distant.

akkartik | 2 years ago

I was trying to square the same tension in my mind when I made OP. The compromise I arrived at was: "try to find people with complementary interests to organize with." That's really what "software with thousands of users" boils down to. If programmers take the lead while software is still small and approachable, and non-programmers coalesce around their forks rather than upstream, we might slowly evolve toward the hazy societal organization I'm vaguely pointing in the direction of.

But an essential component of this plan is for non-programmers to articulate early and often their desire to migrate away from the current monopoly they are forced to use.

ccorxi | 2 years ago

Of course we non-programmers want to move away from big-corp environments. Here's my two cents on what would be ideal for me: modular software, easy to assemble, no code. And if a module is not available, I'd be happy to pay a (reasonable) amount to get it made. All of this open source.

lambdaxymox | 2 years ago

Versioning dependencies and managing dependencies are surely among the hardest problems in software engineering; at least, those are the two I find most aggravating. It seems like almost nobody gets them right, even though component-based software engineering, SOA, and the like are, I think, generally very good ideas. The execution is pretty crummy pretty much everywhere.

With all that said, my sense is that hardware engineering has its own heap of Sisyphean problems and complexities. I definitely would not go back to working on hardware engineering problems like I did super early in my career (a mix of embedded firmware, device drivers, PCB design, and web development). I shudder at the thought of ever working with anything Verilog/VHDL, Xilinx, or SPICE ever again, or debugging PCB designs on the bench top in the lab with an oscilloscope and a logic probe. At least in school I ran more than a few bodge wires to patch a mistake in a PCB design iteration. Maybe in some sense, it's a blessing that those linear systems theory abstractions fall apart utterly in RF engineering problems, and one has to contend with the fact that all circuits radiate. At least circuits that still contain the magic smoke.

Aerbil313 | 2 years ago

I believe Nix is the logical solution to dependency management, and thus its future.
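To make the point concrete, here is a minimal sketch of what Nix-style dependency pinning looks like (the channel URL and package names are illustrative assumptions, not from the comment): the entire toolchain is derived from one pinned nixpkgs revision, so every machine that evaluates this file sees exactly the same dependency versions.

```nix
# shell.nix — hypothetical example: pin nixpkgs to a fixed release
# tarball so builds are reproducible across machines and over time.
let
  pkgs = import (fetchTarball {
    url = "https://github.com/NixOS/nixpkgs/archive/nixos-23.05.tar.gz";
  }) { };
in
pkgs.mkShell {
  # Every tool below comes from the pinned revision above,
  # not from whatever happens to be installed on the host.
  buildInputs = [ pkgs.python3 pkgs.gnumake ];
}
```

Running `nix-shell` in a directory with this file drops you into an environment built from that snapshot; a dependency can only change when you deliberately move the pin.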

microtonal | 2 years ago

> Imagine for a moment doing AI-work

In many ways it's still the same. Transformers use matrix multiplication as their main operation, and the underlying matrix-multiplication libraries have mostly seen incremental performance improvements over the last two decades or so. Most other ops in, e.g., core PyTorch are implemented with C++ templates and would be mostly familiar to a 2008 C++ programmer. Most of my work is still C++/Python/Cython, as it has been for the last decade or two. Sure, the machine learning models have changed, but those are relatively easy to pick up.
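The "it's mostly matrix multiplication" claim can be sketched in a few lines of numpy (a minimal illustration, not the commenter's code): scaled dot-product attention, the core of a transformer layer, is two matmuls wrapped around a softmax.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: matmul -> softmax -> matmul.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 tokens, 8-dim embeddings
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Everything novel about the model lives in how these matmuls are arranged and trained; the numerical kernels underneath are the same BLAS-style routines a 2008 C++ programmer would recognize.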

EVa5I7bHFq9mnYK | 2 years ago

But most software is not 3D movie rendering or AI. Look at the App Store: 99% of those apps could have been written in 1970s Pascal. 1996's ICQ had 90% of the functionality of modern messengers. People just love new things.