
Ask HN: Do you feel bad when devices aren't utilized to the extreme?

71 points | behnamoh | 3 years ago

E.g., I feel bad when people buy M1 (or M2) MacBook Pros just to check emails, browse the internet, and do presentations (very common among managers).

Or when an old iPhone or Android sits in a cabinet gathering dust, when it could be used as a webcam, a small home server, an automation device, a clock (I use an old iPad), etc.

I think so much compute power is being wasted and I'm not sure how to feel about that.

126 comments

[+] dmitrybrant | 3 years ago
Yes, definitely. I feel a profound sense of despair when we purchase a mobile device with the most cutting-edge CPU and insane amount of memory, and proceed to saddle it with a bloated VM that's designed to guardrail and protect the device from even more bloated, leaky, crashy, unresponsive apps.

And when we purchase a new desktop workstation, equipped with even more staggering horsepower, we're totally OK with installing software that makes itself comfortable by consuming large fractions of this horsepower without any apparent need, all because the amazing hardware is able to mask the awfulness of the software.

[+] SQueeeeeL | 3 years ago
Yeah, I feel an immense loss of "art" in modern software. A very specific example: look back at the sheer ingenuity it took to program the original Pokemon on a Game Boy, with its complex compression schemes and its systems for rendering the menus and battles. Compare that to today, when they release a $60 "next-gen" Pokemon game with massive performance issues and huge graphical bugs. It's just kinda sad to see software become so industrialized and commodified.
[+] Canada | 3 years ago
> and proceed to saddle it with a bloated VM that's designed to guardrail and protect the device from even more bloated, leaky, crashy, unresponsive apps

Blaaah. Ship it and enjoy the moment!!

[+] mk_stjames | 3 years ago
It would only be considered 'wasted' if energy were free. But energy is not free, and thus choosing how to use compute resources is still important.

E.g., I own a few high-powered workstations for fluid-dynamics computations. If energy were free, I would just be running 'what-if' solves all the time for the hell of it. But because energy isn't free, they stay asleep unless I need them for what is deemed 'necessary' for research.

[+] nrp | 3 years ago
For most consumer electronics devices, the energy that goes into manufacturing them vastly exceeds the total energy used during their lifetimes. In that sense (ignoring differences in emissions from different energy sources), from an environmental perspective it’s better to have fewer products running at a higher duty cycle (may not be applicable to high-performance workstations).
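
As a rough back-of-envelope (with assumed, ballpark figures): charging a smartphone daily draws on the order of 5 kWh per year, so maybe 25 kWh over a five-year life, while lifecycle analyses commonly put the energy embodied in manufacturing a phone at hundreds of kWh. Production dominates, so the marginal cost of actually using the device is small.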
[+] dirheist | 3 years ago
That's fair, but I have half a dozen servers sitting in a colo with networking/energy/cooling included in the lease agreement, so energy is not a concern.

I usually just run game servers with the extra RAM/CPU threads for my friends. If my elasticsearch cluster is particularly unused some days and I have extra bandwidth, I might turn on wireguard and let the 450 GB of torrents I have on disk seed. Anything that makes the btop graph look active and lively makes me feel nice.

[+] techsupporter | 3 years ago
No, because always running at maximum capacity is an indication that something has no room for failure, or for growth.

I have an M2 MacBook Air with the most RAM and storage that Apple would sell me. I'm sure its CPU sits at about 5% utilization almost the entire day. This was intentional on my part. I bought a laptop with enough "extra" capacity that it will be able to handle the increased inefficiencies, bloat, and new features of ever-expanding software for the next several years.

So by buying one "over-specced" laptop now, I avoid buying two or three of them in the future. This is how I like to buy devices, especially since some components, like the screen and keyboard, are mostly fungible.

[+] swagasaurus-rex | 3 years ago
I have a 2014 MacBook Air that I got with the best specs available at the time (8 GB!).

It still works great; the battery life isn't what it was, but I've had no issues in the nine years I've had it.

[+] gompertz | 3 years ago
Same reason I bought my Lamborghini... for that occasional time I need to get to the store real quick. But really, it is mostly driven 30 mph into potholes.
[+] TylerE | 3 years ago
Plus, performance on most laptops is an illusion anyway. Actually try to use it all and you'll just thermal throttle.
[+] 2OEH8eoCRo0 | 3 years ago
Just out of curiosity- which computer did you replace with the M2 Air?
[+] PaulHoule | 3 years ago
I would turn it the other way around.

So many underpowered devices are produced that sit in drawers, collect dust, and end up in the landfill. For instance, Intel makes "Atom" devices that, if they don't die early from design flaws, die early because they aren't really useful. There was a time when Intel was interested in having you buy a new computer because it was the best computer you ever had; now they are interested in putting every vendor of parts out of business (except seemingly Synaptics) so they can get more of the BOM, even if this means the new computer you buy will be the worst computer you ever bought.

If there were some minimum standard of quality for devices, people wouldn't need to buy so many.

[+] drewzero1 | 3 years ago
While I don't have experience with the specific devices you mention, I have seen a lot of crap laptops get replaced quickly by other crap laptops. I don't mind cheap low-powered computers (they have their use cases) but it really frustrates me to see them marketed as general-purpose computers to average consumers. Intel and AMD have both been guilty of selling low-powered, low-cost "general purpose" CPUs that couldn't compute their way out of a wet paper bag.

Software is to blame for this problem as well. While we have gained a lot of features, modern software often uses system resources in a way that's not helpful to the user (either by active things like telemetry or lazy things like iterating over every object in a database unnecessarily). I get the feeling that software providers take my compute power for granted, and if I don't have enough, it's just time for me to buy a new computer again.

[+] KronisLV | 3 years ago
> For instance, Intel makes "Atom" devices that, if they don't die early from design flaws, die early because they aren't really useful.

Setting aside the reliability concerns (see other comments) for a bit...

I think a number of years ago Scaleway (or someone else) offered VPSes with Intel Atom CPUs and had some ARM offerings as well. This discussion appears to be the best I can find at the moment: https://lowendtalk.com/discussion/77819/new-scalaway-c2-inte...

For my needs, really cheap VPS servers are essentially ideal, since most of my workloads are RAM-bottlenecked and can easily deal with a single-core CPU. This would also be perfect for hobby projects. Why live on free tiers that have lots of limitations when you can just get a new node per project (or group of projects) for 1-3 Euros a month?

Actually, my current homelab servers (which I also use as CI nodes) run AMD Athlon 200GE CPUs with a TDP of 35 W. If I could find something AM4-compatible, readily available, and cheap, you can be pretty sure I'd go for it!

Oh, and my notebook runs, I think, one of the Celerons with a <10 W TDP, and that's still good enough for browsing the web, chatting, writing blog posts, and so on, even some light development work. And the whole notebook cost around 200 Euros when I got it.

Just look at how popular the Raspberry Pi is as a platform for homelabs. Now imagine doing the same with the x86 platform: not some overpriced novelty hardware that comes in small runs, but something to rival the mass production of the Pi.

I want to live in that world.

If we weren't so heavily impacted by Wirth's law (https://en.wikipedia.org/wiki/Wirth%27s_law), both in regard to our OSes (just try running anything heavier than XFCE4 on any of those pieces of hardware) and our platforms (e.g. the footprint of Go vs something like Ruby or Java, though each has its uses; there's also the facet of lots of browser-based software nowadays, e.g. Visual Studio Code vs a native text editor like CudaText), far fewer people would consider hardware like that "underpowered" or e-waste.

[+] nicolaslem | 3 years ago
Are you talking about the Atom line of Intel CPUs? If so, I don't really understand this comment.

Atoms are used in all kinds of applications where low(ish)-power x86 is required. They even support ECC RAM, making them a great fit for NAS boxes and embedded servers deployed in the field.

They may not shine as general-purpose CPUs, but they are definitely not e-waste.

[+] thoughtFrame | 3 years ago
What bothers me is that I don't know how to squeeze all of the juice out of my CPU (or any programmable device, for that matter). I see instances of the Jevons effect everywhere (e.g. the use of Electron), and on the other side I see DOS-era programmers whose experience transfers really well to more modern machines: they know how to get the best throughput on their data and how to isolate the slow but necessary parts of their programs (like certain OS APIs). I feel like having started programming in a high-level language affected the way I program even in lower-level languages, so I don't know how to utilize resources better, or how to do I/O efficiently.

So I don't have a problem with, say, writing a script in Python even though it's not the most efficient use of my CPU, because I'm just looking to get things done. But when I have a problem that needs horsepower (and I know my 8-core 3.6 GHz machine can absolutely deliver it), I don't know how to tell it to do so (see the sketch below). It doesn't help that many programmers' first thought would be to go for the cloud, when a single computer can be much faster than a bunch of AWS VMs.

Another example is the .kkrieger demo, which seems like wizardry when you've only seen similar things done in Unity/Unreal.
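
For what it's worth, on an 8-core machine "telling it to do it" can be as little as a process pool over independent chunks of work. A minimal sketch in Python (the CPU-bound function and its inputs are illustrative stand-ins):

    import math
    from multiprocessing import Pool

    def cpu_bound(n):
        # Stand-in for real work: a pure, CPU-heavy function.
        return sum(math.sqrt(i) for i in range(n))

    if __name__ == "__main__":
        chunks = [10_000_000] * 8                   # one chunk of work per core
        with Pool(processes=8) as pool:             # eight worker processes
            results = pool.map(cpu_bound, chunks)   # chunks run in parallel
        print(sum(results))

With independent chunks like this, all eight cores pin to 100%; the hard part in real programs is restructuring the problem into independent chunks in the first place.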

[+] itake | 3 years ago
My issue is that concurrency is so hard. One-time tasks don't make sense to make concurrent, because migrating from sync to async takes 2-5x longer and brings higher bug risk and more complexity.

So most tasks remain inefficient...
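
A minimal sketch of where that migration cost comes from, using Python's asyncio with asyncio.sleep standing in for real I/O (the functions are illustrative): the async version finishes ten "calls" in roughly the time of one, but every caller up the stack has to change with it.

    import asyncio
    import time

    def fetch_sync(i):
        time.sleep(0.1)             # stand-in for a blocking I/O call
        return i

    async def fetch_async(i):
        await asyncio.sleep(0.1)    # the same I/O, now awaitable
        return i

    def run_sync():
        # Trivially readable; ~1.0 s for ten sequential calls.
        return [fetch_sync(i) for i in range(10)]

    async def run_async():
        # ~0.1 s for the same ten calls, but new syntax throughout,
        # and every caller of run_async must itself be async or own
        # an event loop; the 2-5x migration cost ripples outward.
        return await asyncio.gather(*(fetch_async(i) for i in range(10)))

    if __name__ == "__main__":
        run_sync()
        asyncio.run(run_async())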

[+] Netcob | 3 years ago
The compute power thing is becoming a pet peeve of mine.

Consuming too much power is already a big ecological issue, and here in Europe it's becoming a financial problem too. I still want my computer to perform as well as possible in the moment, be it running a game, compiling some code, or transcoding videos. But the rest of the time, it should consume as little power as possible. I've been looking into ways to upgrade, and most of the reviews focus 95% of their time/space on peak performance (basically torture tests) and peak power consumption (and even that more from a "what PSU/cooler do you need" perspective). Office-type work, which is 95% of what I actually use the machine for, is always an afterthought.

Back to your original question, I do understand where you're coming from. I'd love to use an old Android phone as a webcam; it's just a shame that most don't work without a battery, and keeping a device like that permanently in a charging state will eventually make the battery balloon.

[+] te_platt | 3 years ago
Yes, very much. Growing up, I used to scramble to get whatever computer equipment I could. My parents didn't have much money, but many of my friends' parents did. I hated asking to borrow stuff even though people didn't seem to mind. Now when I see my extra equipment not getting used, I imagine some kid really wanting to just use it for a bit. Maybe that's just my own mental issues here, but I know what you mean.
[+] LinuxBender | 3 years ago
Do you feel bad when devices aren't utilized to the extreme?

No. My devices are rarely utilizing over 40% CPU and tend to last for a very long time. I know it isn't much but this helps keep some gear out of the landfill. I can always find a use for older hardware. This Linux PC I am writing this post on is coming up on 12 years old and is more than adequate for my purposes. I expect it to last at least another 10 years, though I may have to replace the spinning rust with an SSD at some point.

I recently bought my first smartphone despite not really needing one. I have mixed thoughts on how long it will last, since I do not control the firmware yet. These devices are designed to be difficult to service. I will probably turn it into a glorified MP3 player, since it has a large battery, and just get the Caterpillar flip phone. I think the smartphone would make a great home audio entertainment system. I cannot find a logical reason to push it to the extreme.

[+] jcelerier | 3 years ago
> This Linux PC I am writing this post on is coming up on 12 years old and is more than adequate for my purposes

It is very, very likely wasting a ton of watts doing what a secondhand stick PC such as https://www.amazon.ca/dp/B08G1CCWN5 can do in less than 10 watts. So no, from a global point of view it is very much not adequate.

[+] thfuran | 3 years ago
You're still booting off an HDD? That's pretty wild.
[+] valar_m | 3 years ago
Reckless and wholly unqualified armchair psychologist here. Seems like this isn't really about wasted compute power at all and more about feeling like you aren't working on something that matters.
[+] prettyStandard | 3 years ago
As someone whom this also bothers: that might be right.
[+] JohnFen | 3 years ago
I feel the same!

My friends even tease me because in my hobby microcontroller-based electronics projects, I'll use the wimpiest microcontroller that will do the job instead of just slapping an R-Pi or Arduino board in them. I defend myself on the basis of minimizing power requirements, but really, I just can't bear to see capabilities go to waste.

[+] dbrueck | 3 years ago
Yes. A similar scenario is when I use a cheap microcontroller in some one-off project that needs only one or two GPIOs. I'll use an ESP32 because I've got a bunch lying around and it's only a few bucks, but the fact that it's loaded to the hilt with unused sensors & I/Os leaves me feeling guilty for some reason.
[+] digitalsushi | 3 years ago
no i really dont

it doesnt make me feel bad when my boiler isnt burning diesel as fast as it can

or that my car is literally not even running right now

or that i am not cramming for certifications and new languages

or that my siblings are not working out or at college

or that my laptop is waiting for instructions from me instead of me waiting for it

[+] dataflow | 3 years ago
I feel horrible that supposedly environmentally-conscious folks force perfectly good hardware to become obsolete through software (and often in the name of "security"), if that's what you mean. The whole "trade in your 1- or 2-year-old phone for a new one" thing is just absolutely insane to me.

But once the hardware is acquired, your goal should be to stretch its lifetime and spend the least energy on it, not the most.

[+] madsbuch | 3 years ago
To be honest, an M1 (M1 Max, 64 GB RAM) performs quite poorly for serious software development.

As an example, it takes 22 sec to run a test suite on the M1; in our CI, the same test suite takes 9 sec.

This might be due to software that has not yet been written for the platform.

To be honest, the computer seems like a high-end office computer, so I think that use case is spot on.

[+] smoldesu | 3 years ago
IMO the bigger problem is that MacOS is poorly equipped to handle modern development workloads. POSIX certification is near-meaningless in 2022 and essentially hides a rotting UNIX ecosystem at the heart of MacOS. Unless you modify the OS (add a package manager, force recent coreutils, eschew Xcode, etc.), Macs will feel like the black sheep of your infrastructure.

A good 80% of my Mac gripes could be solved by running Linux full-time. Unfortunately the other 20% are ARM-related issues, so I'm sorta stuck between a rock and a hard place.

[+] tharkun__ | 3 years ago
You say "a" test suite as if this were a universal truth. I can tell you that it is not, and your suspicion is correct: it depends. Our own test suites saw 25-50% performance increases (decreases in runtime) on our developer machines when we switched to M1s, for example.
[+] adhoc_slime | 3 years ago
Well, honestly, that's just wasted compute because of the design limitations of the M1 and its cooling. It runs full tilt to a thermal limit and then stays below it, thermally throttling the CPU. Makes me want to see a benchmark comparison of an M1 with some custom cooling solution vs stock. The whole laptop form factor for these CPUs feels so limiting to their potential.
[+] dijit | 3 years ago
Is your CI running on Docker for Mac, perchance?
[+] nyadesu | 3 years ago
I don't think that's a fair comparison, machines exclusively running those CI pipelines could be very powerful
[+] ahelwer | 3 years ago
There is such a staggering quantity of electronics (or just stuff, generally) wasted that if you're putting something to literally any use at all, like your iPad clock, then you're miles ahead of the standard.

I also encourage everyone to be aggressive about putting things on the used market, or even giving them away for free (commonly in those local Facebook "buy nothing" groups). For a lot of mass-produced things, you can view the used market as a storage space: sell or give something away one year, and if you need it the next year you'll probably be able to find someone else doing the same. On the flip side, avoid buying new stuff; try to buy only used electronics, especially.

[+] fifanut | 3 years ago
My oven spends most of its time off, and when I use it, it heats to only 70% of its potential.

My hair clippers are unused 99.9% of the time.

There are benefits to having the ability to opportunistically burst to 100%, and some benefits aren't easily measurable in performance terms (like having an up-to-date, secure MacBook).

We can find wasted potential in various places:

* the millions of people receiving poor education

* people working in jobs below their potential skillset

* galaxies with vast idle resources

* people spending time on logistics/bureaucracy

There's a world of opportunity out there for improvement.

[+] Barrin92 | 3 years ago
Not to the extreme, but what I don't like is when you see people spending thousands on hardware and all they do is watch YouTube, or rich folks who can't cook owning the most expensive handcrafted set of knives, or someone buying a great guitar for a kid who plays once a month. There's just something aesthetically wrong about seeing a great tool go unutilized.

On the other hand, I'm not super frugal to the point where I'm on an eight-year-old phone so as never to waste a CPU cycle or something.

[+] bravetraveler | 3 years ago
I used to concern myself with it, but then I realized I'm really bad at tracking time/utility

For example, I've always had 'more computer' than I truly need. It hasn't turned out to be a bad thing though, because sometimes usage does surge. Maybe the situation calls for being able to build that thing really ridiculously quickly. Whatever.

I think 'opportunity cost' applies, and attempts at that calculus are beyond me.

[+] maximus-decimus | 3 years ago
I'm sure software being slower than molasses wastes far more computing resources than unused outdated phones do.

Plus, if computing power isn't strictly necessary to achieve a task but makes the experience smoother, is it really wasted? Nobody wants their web browser to run like a PowerPoint slideshow (which it kinda does on my Pinebook Pro, a $200 ARM laptop). Most people could daily-drive it, but stuff running smoother is worth something.

[+] knaik94 | 3 years ago
I don't feel bad, because I enjoy using technology myself. I realized a long time ago that the bigger waste of energy is caring about how other people use their own devices. For people my age, I can safely say that the Apple Mac vs Windows PC ads, where they showed Apple being "better", are a big reason people care more than they should.

After gifting technology to some family members, I realized that what matters most at the end of the day is that they enjoy using technology stress-free. You could get away with non-M1 MacBook Pros for presentations, but if using an M1 makes you feel good, then you should use it.

If you're worried about e-waste, then pass down old technology. My sibling got a phone a lot younger than her peers did because I love technology, and when I switched to a new model, I passed mine down. She didn't care about the specs; like 99% of the population, she doesn't think about compute power.

Compute power isn't a limited resource in the world. If someone has a need for compute power, they will probably find a way to use old technology on their own. That's not something you should feel bad about or worry over. I don't.