vctrnk | 2 years ago
As someone who could've developed an IT/programming career, but didn't because I felt things were already bloating back in the '00s, I agree with the majority: "harvesting your own food" can be rewarding but also a tedious and thankless job. It's certainly not for everyone, but if it works for some people then it is (let's put efficiency aside for a moment) perfectly valid. In fact, being more of a H/W guy I find myself gravitating towards this approach more often than not. Leanness and reproducibility are key for my workflow (I went the RF-world path); I can't afford different end results when a dependency changes or breaks something.
IMHO, keeping up with the modern paradigms for S/W development looks like a never-ending nightmare. Yes it's the modern way, yeah it's the state of the art. Still, I didn't feel it was a wise investment of my time to learn all those "modern dev" ropes, and I still feel that 20 years later. I'm nowhere near antiquated and I'm on top of all things tech (wouldn't read HN otherwise), it's just...
I see former friends/classmates that went this way, and they're in a constant cat-and-mouse game where 50% of the time they're learning/setting up something dev-chain related, the other 50% doing actual work, and 98% of it feeling way too stressed. I see modern Android devices with their multi-MB apps, bloated to hell and beyond for a simple UI that takes ages to open on multi-core, multi-GHz SoCs. I see people advocating that unused RAM is wasted RAM, never satisfied until every byte is put to good use, reluctant to admit that said good use is just leaving the machine there "ready" to do something, but not actually doing anything _productive_.
And yet.
Without that bloat, without the convenience of pre-made libraries and assist tools for almost every function one could desire, we wouldn't be where we are now. Imagine for a moment doing AI work, 3D movie rendering, data science etc. with a DBless approach on single-core machines with every resource micro-managed to eke out the most performance. It's simply not feasible; we would still be in the '90s... just a bit more hipster.
This article resonates so well with me. And at the same time, it feels so distant.
akkartik | 2 years ago
But an essential component of this plan is for non-programmers to articulate early and often their desire to migrate away from the current monopoly they are forced to use.
ccorxi | 2 years ago
lambdaxymox | 2 years ago
With all that said, my sense is that hardware engineering has its own heap of Sisyphean problems and complexities. I definitely would not go back to working on hardware engineering problems like I did super early in my career (a mix of embedded firmware, device drivers, PCB design, and web development). I shudder at the thought of working with anything Verilog/VHDL, Xilinx, or SPICE ever again, or debugging PCB designs on the benchtop in the lab with an oscilloscope and a logic probe. Back in school I ran more than a few bodge wires to patch a mistake in a PCB design iteration. Maybe in some sense it's a blessing that those linear systems theory abstractions fall apart utterly in RF engineering problems, and one has to contend with the fact that all circuits radiate. At least circuits that still contain the magic smoke.
Aerbil313 | 2 years ago
microtonal | 2 years ago
In many ways it's still the same. Transformers use matrix multiplication as their main operation, and the underlying matrix multiplication libraries have mostly seen incremental performance improvements over the last two decades or so. Most other ops in e.g. core PyTorch are implemented using C++ templates and would be mostly familiar to a 2008 C++ programmer. Most of my work is largely C++/Python/Cython, as it has been for the last 1-2 decades. Sure, the machine learning models have changed, but those are relatively easy to pick up.
EVa5I7bHFq9mnYK | 2 years ago