At my former job at a FAANG, I did the math on allocating developer machines with 16GB vs 64GB of RAM: I looked at actual job tasks, estimated how much thumb-twiddling wait time the extra memory would save, and multiplied that out by the cost of the developer's time. The cost-benefit showed a reasonable ROI that was realized in weeks for senior dev salaries (months for juniors).
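The shape of that arithmetic is roughly this (a toy sketch; every number below is an illustrative assumption, not a figure from the original analysis):

    # Hypothetical break-even estimate for a RAM upgrade; all inputs assumed.
    upgrade_cost = 400.0        # extra cost of 64GB over 16GB, USD (assumed)
    minutes_saved_per_day = 15  # less swapping and waiting (assumed)
    dev_cost_per_hour = 150.0   # fully loaded senior dev cost, USD (assumed)

    daily_saving = (minutes_saved_per_day / 60) * dev_cost_per_hour
    breakeven_days = upgrade_cost / daily_saving
    print(f"Pays for itself in about {breakeven_days:.1f} working days")
    # -> about 10.7 working days, i.e. roughly two weeks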
Based on this, I strongly believe that if you're providing hardware for software engineers, it rarely if ever makes sense to buy anything but the top-spec MacBook Pro available, and to upgrade every 2-3 years. I can't comment on non-desktop / non-Mac scenarios or other job families. YMMV.
No doubt the math checks out, but I wonder if developer productivity can be quantified that easily. I believe there's a lot of research pointing to people having a somewhat fixed amount of cognitive capacity available per day, and that aligns well with my personal experience. A lot of times, waiting for the computer to finish feels like a micro-break that saves up energy for my next deep thought process.
Most of my friends at FAANG do their work on servers, remotely. Remote edit, remote build. The builds happen in giant networked cloud builders, hundreds to thousands of machines per build. Giving them a faster local machine would do almost nothing, because they don't do anything local.
When I worked at a FAANG, most developers could get a remote virtual machine for their development needs. They could pick the machine type and size. It was one of the first things you'd learn how to do in your emb^H^H^H onboarding :)
So it wasn't uncommon to see people with a measly old 13" MacBook Pro doing the hard work on a 64-CPU/256GB remote machine. Laptops were essentially machines for reading/writing emails, writing documents, and doing meetings. The IDEs had proprietary extensions to work with the remote machines and the custom tooling.
> it rarely if ever makes sense to buy anything but the top spec Macbook Pro available
God I wish my employers would stop buying me MacBook Pros and let me work on a proper Linux desktop. I'm sick of shitty thermally throttled slow-ass phone chips on serious work machines.
Gave developers 16GB RAM and 512GB storage. Spent way too much time worrying about available disk space and needlessly redownloading Docker images off the web.
But at least they saved money on hardware expenses!
Best money ever spent. Lasted years and years.
For CPUs - I wonder how the economics work out when you get into, say, 32- or 64-core Threadrippers? I think it still might be worth it.
FAANG manages the machines. Setting aside the ethics of this level of monitoring, I'd be curious to validate this by soft-limiting OS memory usage and tracking metrics like number of PRs and time someone is actively on the keyboard.
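A crude way to run that experiment on a Unix-like system (a sketch only: rlimits only approximate a memory cap, a cgroup memory.high limit would be the proper soft limit on Linux, and "code --wait" is just a hypothetical stand-in for the process being constrained):

    import resource
    import subprocess

    CAP = 16 * 1024**3  # pretend the machine only has 16GB (assumed)

    def apply_cap():
        # RLIMIT_AS caps virtual address space, which overestimates real
        # memory use; it's a rough stand-in for a true soft memory limit.
        resource.setrlimit(resource.RLIMIT_AS, (CAP, CAP))

    # Launch the process under the cap, then compare productivity metrics.
    subprocess.run(["code", "--wait"], preexec_fn=apply_cap)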
My personal experience using virtual desktops vs a MacBook aligns with your analysis. This despite the desktop virtual machines having better network connections. A VM with 16 GB of memory and 8 VCPUs can't compete with an M1 Max laptop.
To put a massive spanner in this: companies are going to be rolling out seemingly mandatory AI usage, which has huge compute requirements that are often fulfilled remotely, and which has varying, possibly negative, effects on productivity.
I think those working on user-facing apps could do well having a slow computer or phone, just so they can get a sense of what the actual user experience is like.
No doubt you mean well. In some cases it's obvious: a low-memory machine can't handle some Docker setup, etc.
In reality, you can’t even predict time to project completion accurately. Rarely is a fast computer a “time saver”.
Either it’s a binary “can this run that” or a work environment thing “will the dev get frustrated knowing he has to wait an extra 10 minutes a day when a measly $1k would make this go away”
This article skips an important step: showing how a faster CPU has a demonstrable effect on developer performance.
I would agree that faster compile times can significantly improve performance. 30s is long enough for a developer to get distracted and go off to check email, look at social media, etc. Turning 30s into 3s can keep a developer in flow.
The critical thing missing here is whether increasing CPU speed will actually decrease compile time. What if the compiler is IO-bound? Or memory-bound? Removing one bottleneck gets you to the next bottleneck, not necessarily to all the performance gains you want.
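One way to check which it is before spending money (a minimal sketch, assuming a Unix-like system and using "make -j" as a stand-in for your real build command):

    import os
    import subprocess
    import time

    # Compare CPU time consumed by the build's child processes against
    # wall-clock time to see how saturated the cores actually were.
    wall_start = time.monotonic()
    t0 = os.times()
    subprocess.run(["make", f"-j{os.cpu_count()}"], check=True)
    t1 = os.times()
    wall = time.monotonic() - wall_start

    child_cpu = ((t1.children_user - t0.children_user)
                 + (t1.children_system - t0.children_system))
    utilization = child_cpu / (wall * os.cpu_count())
    print(f"Core utilization: {utilization:.0%}")
    # Near 100%: CPU-bound, a faster CPU should help.
    # Well below that: likely IO-, memory-, or dependency-bound.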
I still run a 6600 (65W peak) from 2016 as my daily driver. I have replaced the SSD once (the MLC drive lasted 5 years; hopefully the SLC drive from 2011 lasts forever?), the 2x 32GB DDR4 sticks (the Kingston/Micron ones lasted 8 years; replaced with AliExpress "Samsung" sticks for $50 a pop), and the monitor (an Eizo FlexScan 1932 lasted 15! years, RIP; replaced with an Eizo RadiForce 191M, highly recommended with f.lux/redshift for exceptional image quality without blue light).
It's still powerful enough to play any game released this year that I throw at it at 60 FPS (with a low-profile 3050 from 2024), let alone compile any bloat.
Keep your old CPU until it breaks completely... or actually until the motherboard breaks; I have a 35W Kaby Lake replacement waiting for the 6600 to die.
There are peaks in long-term CPU value. That is, CPUs that are 1) performant enough to handle general purpose computing for a decade and 2) outperform later chips for a long time.
The i7-4770 was one. It reliably outperformed later Intel CPUs until near 10th gen or so. I know shops that are still plugging away on them. The first comparable replacement for it is the i7-12700 (though the i5-12400 is a good buy).
At 13th gen, Intel swaps E for P cores. They have their place but I still prefer 12th gen for new desktops.
Past all that, the author is right about the AMD Ryzen 9950x. It's a phenomenal chip. I used one in a friend's custom build (biz, local llm) and it'll be in use in 2035.
> The i7-4770 was one. It reliably outperformed later Intel CPUs until near 10th gen or so.
Per which benchmarks?
> At 13th gen, Intel swaps E for P cores.
One nit: Intel started adding (not swapping) E-cores to desktop parts with 12th gen, but i3 parts and most i5 parts were spared. More desktop i5 parts got them starting with 13th gen.
What's wrong with E-cores? They're the best bang for the buck both for baseline low-power usage (and real-world systems are idle a lot of the time) and for heavy multicore workloads. An E-core cluster takes a tiny fraction of the area and power of a P-core, so it's not a one-to-one comparison.
Important caveat that the author neglects to mention since they are discussing laptop CPUs in the same breath:
The limiting factor on high-end laptops is their thermal envelope. Get the better CPU as long as it is more power efficient. Then get brands that design proper thermal solutions.
You simply cannot cram enough cooling and power into a laptop to have it equal a high-end desktop CPU of the same generation. There is physically not enough room. Just about the only way to even approach that would be to have liquid-cooling loop ports out the back that you plug into an under-desk cooling loop, and I don't think anyone is doing that, because at that point just get a frickin desktop computer plus all the other conveniences that come with it (discrete peripherals, multiple monitors, et cetera).
I honestly do not understand why so many devs insist on doing work on a laptop. My best guess is this is mostly the Apple crowd, because Apple "desktops" are, for the most part, just the same hardware in a larger box instead of actually being a different class of machine. A little better on the thermals, but not the drastic jump you see between laptops and desktops from AMD and Intel.
Employers, even the rich FANG types, are quite penny-wise and pound-foolish when it comes to developer hardware.
Limiting the number and size of monitors. Putting speedbumps (like assessments or doctor's notes) on ergo accessories. Requiring special approval for powerful hardware. Requiring special approval for travel, and setting hotel and airfare caps that haven't been adjusted for inflation.
To be fair, I know plenty of people that would order the highest spec MacBook just to do web development and open 500 chrome tabs. There is abuse. But that abuse is really capped out at a few thousand in laptops, monitors and workstations, even with high-end specs, which is just a small fraction of one year's salary for a developer.
Every well funded startup I’ve worked for went through a period where employees could get nearly anything they asked for: New computers, more monitors, special chairs, standing desks, SaaS software, DoorDash when working late. If engineers said they needed it, they got it.
Then, some time later, they start looking at the spending in detail and can't believe how much is being spent by the 25% or so who abuse the policy. Then the controls come.
> There is abuse. But that abuse is really capped out at a few thousand in laptops, monitors and workstations, even with high-end specs,
You would think, but in the age of $6,000 fully specced MacBook Pros, $2,000 monitors, $3,000 standing desks, $1,500 iPads with $100 Apple Pencils and $300 keyboard cases, $1,000 chairs, SaaS licenses that add up, and (if allowed) food delivery for "special circumstances" that turns into a regular occurrence, it was common to see individuals incurring expenses in the tens of thousands. It's hard to believe if you're a person who moderates their own expenditures.
Some people see a company policy as something meant to be exploited until a hidden limit is reached.
There also starts to be some soft fraud at scales higher than you’d imagine: When someone could get a new laptop without questions, old ones started “getting stolen” at a much higher rate. When we offered food delivery for staying late, a lot of people started staying just late enough for the food delivery to arrive while scrolling on their phones and then walking out the door with their meal.
I know a FAANG company whose IT department, for the last few years, has been "out of stock" on SSDs over 250GB. They claim it's a global market issue (it's not). There's constant complaining in the chats from folks who compile locally. The engineers make $300k+, so they just buy a second SSD from Amazon on their own credit cards and self-install it without mentioning it to the IT dept. I've never heard a rational explanation for the "shortage" other than chronic incompetence from the team supplying engineers with laptops/desktops. Meanwhile, spinning up a 100TB cloud VM there has no friction whatsoever. It's a cushy place to work, though, so folks just accept the comically dumb aspects everyone knows about.
I think you're maybe underestimating the aggregate cost of totally unconstrained hardware/travel spending across tens or hundreds of thousands of employees, and overestimating the benefits. There need to be some limits or speedbumps to spending, or a handful of careless employees will spend the moon.
I am 100x more expensive than the laptop. Anything the laptop can do instead of me is something the laptop should be doing instead of me.
> But that abuse is really capped out at a few thousand
That abuse easily goes into the tens of thousands of dollars, even several hundred thousand, even at a relatively small shop. I just took a quick look at Apple's store, and wow! The most expensive 14" MacBook Pro I could configure (minus extra software) tops out at a little over $7,000! The cheapest is at $1,600, and a more reasonably-specced, mid-range machine (that is probably perfectly sufficient for dev work), can be had for $2,600.
Let's even round that up to $3,000. That's $4,000 less than the high end. Even just one crazy-specced laptop purchase would max out your "capped out at a few thousand" figure.
And we're maybe not even talking about abuse all the time. An employee might fully earnestly believe that they will be significantly more productive with a spec list that costs $4,000, when in reality that $3,000 will be more or less identical for them.
Multiply these individual choices out to a 20 or 40 or 60 person team, and that's real money, especially for a small startup. And we haven't even started talking about monitors and fancy ergonomic chairs and stuff. 60 people spending on average $2,000 each more than they truly need to spend will cost $120k. (And I've worked at a place that didn't eliminate their "buy whatever you think you'll need" policies until they had more than 150 employees!)
Just to do web development? I regularly go into swap running everything I need on my laptop. Ideally I'd have VS Code, webpack, and Jest running continuously. I'd also occasionally need Playwright. That's all before I open a Chrome tab.
It always amuses me when I see someone use web development as an example like this. Web dev is easily in the realm of game dev as far as required machine specs go; otherwise you're probably not doing much actual web dev. If anything, it's the engineers doing nothing but running little Java or Python servers who don't need anything more than a Pi and a two-color external display to do their job.
FANG is not monolithic. Amazon is famously cheap. So is Apple, in my opinion, based on what I have heard: you get whatever refurbished hardware is available rather than some standardized thing, sometimes with 8GB RAM, sometimes something nicer. Apple is also famously cheap on compensation. Back in the day they proudly said things to the effect of "we deliberately don't pay you top of market because you have to love Apple", to which the only valid answer is "go fuck yourself."
I don't think Google and Facebook are cheap for developers. I can speak firsthand about my past Google experience. You have to note that the company has something like 200k employees, not all of them engineers, so there need to be some controls.
Hardware -> for the vast majority of stuff, you can build with blaze (think Bazel) on a build cluster with caching, so local CPU is not that important. Nevertheless, you can easily order other stuff should you need it. Sure, if you go beyond the standard issue, your cost center gets charged and your manager gets an email. I don't think any decent manager would block you; if they do, change teams. Some powerful hardware that normally needs approval is blanket-whitelisted for orgs with a recognized need.
Trips -> Google has this interesting model where you have a soft cap for trips, and if you don't hit the cap, you pocket half of the remaining trip credit in your account, which you can spend later when you are over the cap or want something slightly nicer next time. They also have clear and sane policies on mixing personal and corporate travel. I encourage everyone to learn about and deploy things like that in their companies. The caps are usually not unreasonable, but if you do hit them, it is again an email to your management chain, not some big deal. I've never seen it blocked. If your request is reasonable and your manager shrugs about this stuff, that should reflect on them being cheap, not on the company policy.
What would be a good incentive strategy to prevent overspending on hardware? I can think of giving a budget where the amount not spent is paid out to the employee (though when salaries are that high it might not make sense), or having an internal dashboard where everybody can see everybody's hardware spending, so people feel bad when they order too much.
It's straightforward to measure this; start a stopwatch every time your flow gets interrupted by waiting for compilation or your laptop is swapping to keep the IDE and browser running, and stop it once you reach flow state again.
We managed by just estimating the lost time, and management (in a small startup) was happy to give the most affected developers (about 1/3) 48GB or 64GB MacBooks instead of the default 16GB.
At $100/hr minimum (assuming lost work doesn't block anyone else) it doesn't take long for the upgrades to pay off. The most affected devs were waiting an hour a day sometimes.
This applies to CI/CD pipelines too; it's almost always worth increasing worker CPU/RAM while the reduction in time is scaling anywhere close to linearly, especially because most workers are charged by the minute anyway.
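The by-the-minute billing makes that easy to sanity-check with arithmetic (a toy sketch; the prices and the 1.8x scaling factor are assumptions for illustration):

    # Hypothetical CI runner comparison: 2x the vCPUs at 2x the price.
    base_minutes = 30.0   # pipeline duration on the small runner (assumed)
    base_rate = 0.008     # USD per minute, small runner (assumed)
    big_rate = 0.016      # USD per minute, 2x-vCPU runner (assumed)
    speedup = 1.8         # measured near-linear scaling (assumed)

    base_cost = base_minutes * base_rate
    big_cost = (base_minutes / speedup) * big_rate
    print(f"small: ${base_cost:.2f}/run, big: ${big_cost:.2f}/run")
    # -> small: $0.24, big: $0.27: about 11% more spend for about 44%
    #    less waiting, before counting developers blocked on CI.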
I think you wanted to say "especially". You're exchanging clearly measurable amounts of money for something extremely nebulous like "developer productivity". As long as the person responsible for spend has a clear line of view on what devs report, buying hardware is (relatively) easy to justify.
Once the hardware comes out of a completely different cost center - a 1% savings for that cost center is promotion-worthy, and you'll never be able to measure a 1% productivity drop in devs. It'll look like free money.
With compiler development work, a low-end machine will do just fine, as long as it has a LARGE monitor. (Mine is 3840x2160, and I bought a satellite monitor to extend it.)
P.S. You can often buy a satellite monitor for $10 from the thrift store. The one I bought was $10.
I don't buy used keyboards because they are dirty and impossible to clean.
> highest spec MacBook just to do web development and open 500 chrome tabs. There is abuse.
Why is that abuse? Having many open browser tabs is perfectly legitimate.
Arguably they should switch from Chrome to Safari / lobby Google to care about client-side resource use, but getting as much RAM as possible also seems fine.
Single-thread performance of the 16-core AMD Ryzen 9 9950X is only 1.8x that of my poor old laptop's 4-core i5. https://www.cpubenchmark.net/compare/6211vs3830vs3947/AMD-Ry...
I'm waiting for >1024-core ARM desktops with >1TB of unified GPU memory, to be able to run some large LLMs.
Ping me when someone builds this :)
Yes, I just went from an i7-3770 (12 years old!) to a 9900x, as I tend to wait for a doubling of single-core performance before upgrading (I got through a lot of PCs in the 386/486 era!). It's actually only 50% faster according to cpubenchmark [0], but it is twice as fast in local usage (multithread is reported as about 3 times faster).
Also got a Mac Mini M4 recently and that thing feels slow in comparison to both these systems - likely more of a UI/software thing (only use M4 for xcode) than being down to raw CPU performance.
[0] https://www.cpubenchmark.net/compare/Intel-i9-9900K-vs-Intel...
I jumped ahead about 5 generations of Intel when I got my new laptop, and while the performance wasn't much better, going from a 10-pound workstation beast that sounded like a vacuum cleaner to a svelte 13-inch laptop that runs off a tiny USB-C brick and barely spins its fans while being just as fast made it worthwhile for me.
Whenever I've built a new desktop I've always gone near the top performance, with some consideration given to cache and power consumption (remember when peeps cared about that? lol).
From dual Pentium Pros to my current desktop: a Xeon E3-1245 v3 @ 3.40GHz built with 32 GB of top-end RAM in late 2012, which has only recently started to feel a little pokey, I think largely due to CPU security mitigations added to Windows over the years.
So that extra few hundred up front gets me many years extra on the backend.
I wish that were true, but the current Ryzen 9950 is maybe 50% faster than the two-generations-older 5950 at compilation workloads.
Tangential: TIL you can compile the Linux kernel in < 1 minute (on top-spec hardware). Seems it’s been a while since I’ve done that, because I remember it being more like an hour or more.
This compares a new desktop CPU to older laptop ones. There are much more complete benchmarks on more specialized websites [0, 1].
> If you can justify an AI coding subscription, you can justify buying the best tool for the job.
I personally can justify neither, but I don't see how one translates into the other: is a faster CPU supposed to replace such a subscription? I thought those were more about large, closed models, and that GPUs would be more cost-effective as a replacement anyway. And if it's not a replacement, it is quite a stretch to assume that everyone who sufficiently benefits from a subscription would benefit at least as much from a faster CPU.
Besides, usually it is not simply "a faster CPU": sockets and chipsets keep changing, so that would also be a new motherboard, new CPU cooler, likely new memory, which is basically a new computer.
[0] https://www.cpubenchmark.net/
[1] https://www.tomshardware.com/pc-components/cpus
I wish developers, and I'm saying this as one myself, were forced to work on a much slower machine, to flush out those who can't write efficient code. Software bloat has already gotten worse by at least an order of magnitude in the past decade.
I spent a few grand building a new machine with a 24-core CPU. And, while my gcc Docker builds are MUCH faster, the core Angular app still builds a few seconds slower than on my years old MacBook Pro. Even with all of my libraries split into atoms, built with Turbo, and other optimizations.
6-10 seconds to see a CSS change make its way from the editor to the browser is excruciating after a few hours, days, weeks, months, and years.
I generally agree you should buy fast machines, but the difference between my 5950X (bought in mid-2021; I checked) and the latest 9950X is not particularly large on synthetic benchmarks, and the real-world difference for a software developer who is often IO-bound in their workflow is going to be negligible.
If you have a bad machine, get a good machine; but you're not going to get a significant uplift going from a good machine that's a few years old to the latest shiny.
"Public whipping for companies who don't parallelize their code base" would probably help more. ;)
Anyway, how many seconds does MS Teams need to boot on a top-of-the-line CPU?
Something I find weird is that this article compares a 9950x with two different laptop CPUs and concludes that performance has increased massively in the past few years. If you compare the 9950x with its two Desktop predecessors (released 2 and 4 years before), you see about a 6% increase from the 7950x and a 45% increase from the 5950x. So you should consider upgrading regularly, but potentially not every single generation. I think it makes sense to consider the performance and offer an upgrade when you see a 50% or so cumulative improvement. Everywhere I have worked has upgraded developers every 3-4 years, and it might make sense to upgrade if there is a massive change (like when Macbooks went to M-series).
As for Desktop vs Laptop, that is relevant too. Desktops are typically much faster than Laptops because they are allowed much larger power envelopes, which leads to more cores and higher clock speeds for sustained periods of time. However, there is always a question as to whether your use case will be able to use all 16/32 cores/threads in a 9950X CPU. If not, you may not notice much difference with a smaller processor.
Source for CPU benchmarks: https://www.cpubenchmark.net/compare/6211vs5031vs3862vs5717/...
Considering 'Geekbench 6' scores, at least. So if it's not a task that benefits massively from parallelization, buying used is still the best value for money.
Maybe that's an AMD (or even Intel) thing, but it doesn't hold for Apple silicon. I wonder if it holds for ARM in general?
* "people" generally don't spend their time compiling the Linux kernel, or anything of the sort.
* For most daily uses, current-gen CPUs are only marginally faster than two generations back. Not worth spending a large amount of money every 3 years or so.
* Other aspects of your computer, like memory (capacity mostly) and storage, can also be perf bottlenecks.
* If, as a developer, you're repeatedly compiling a large codebase - what you may really want is a build farm rather than the latest-gen CPU on each developer's individual PC/laptop.