* The fact that there are comments misunderstanding the article, talking about PCB Design rather than (Silicon) Chip Design, speaks to the problem facing the chip industry: a total lack of wider awareness and many misunderstandings.
* Chip design pays better than software in many cases and many places (US and UK included; but excluding comparisons to Finance/FinTech software, unless you happen to be in hardware for those two sectors)
* Software engineers make great digital logic verification engineers. They can also gradually be trained to do design too. There are significant and valuable skill and knowledge crossovers. (A minimal testbench sketch follows after this list.)
* Software engineers lack the knowledge to learn analogue design / verification, and there’s little to no knowledge-crossover.
* We have a shortage of engineers in the chip industry, particularly in chip design and verification, but also architecture, modelling/simulation, and low-level software. Unfortunately, the decline in hardware courses in academia is very long standing, and AI Software is just the latest fuel on the fire. AI Hardware has inspired some new people to join the industry but nothing like the tidal wave of new software engineers.
* The lack of open source hardware tools, workflows, high-quality examples, relative to the gross abundance of open source software, doesn’t help the situation, but I think it is more a symptom than it is a cause.
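To make the verification crossover concrete, here is a minimal sketch of a constrained-random testbench in cocotb, the Python framework commonly used for this work. The DUT and its port names (`a`, `b`, `sum`) are hypothetical:

```python
# Minimal cocotb testbench sketch: drive random stimulus into a DUT and
# check it against a pure-Python reference model.
# Assumes a hypothetical 8-bit adder with ports a, b, and sum.
import random

import cocotb
from cocotb.triggers import Timer

@cocotb.test()
async def adder_matches_model(dut):
    for _ in range(100):
        a, b = random.randrange(256), random.randrange(256)
        dut.a.value = a
        dut.b.value = b
        await Timer(1, units="ns")  # let combinational logic settle
        assert int(dut.sum.value) == a + b, f"{a}+{b} != {dut.sum.value}"
```

The loop, the reference model, and the assertion are ordinary software testing idioms, which is exactly why software engineers ramp quickly on verification.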
> * Chip design pays better than software in many cases and many places (US and UK included;
Where are these companies? All you ever hear from the hardware side of things is that the tools suck, everyone makes you sign NDAs for everything, and that the pay is around 30% less. You can come up with counterexamples like Nvidia I suppose, but that's a bit like saying just work for a startup that becomes a billion dollar unicorn.
If these well paying jobs truly exist (which I'm going to be honest I doubt quite a bit) the companies offering them seem to be doing a horrendous job advertising that fact.
The same seems to apply to software jobs in the embedded world as well, which seem to be consistently paid less than web developers despite arguably having a more difficult job.
> The fact that there are comments misunderstanding the article, talking about PCB Design rather than (Silicon) Chip Design, speaks to the problem facing the chip industry: a total lack of wider awareness and many misunderstandings.
No, there is no misunderstanding. Even the US companies mentioned _in the very article_ that have both software and "chip design" roles (whatever you call them) will pay more to their software engineers. I have almost never heard of anyone moving from software to the design side; rather, most people move from the design side to software, which seems like the more natural path.
I knew a guy who was a digital verification engineer for Intel. He was unceremoniously laid off and ended up taking work doing some sort of compliance at a very low-paying state agency I was a developer at.
Pretty sharp guy, we worked together a few times on problems far outside both our responsibilities/domains. I always wondered why he ended up taking that gig. Must have been horrible doing compliance work at what was likely half (or less) of his previous pay.
Software pays better, which is why so many hardware people switched, including myself. In my group, which is mixed between the two, my software job classification nets me a higher bonus and easier promotions
Edit: also never have to stay late to rework components on dozens of eval boards, and also never have to talk with manufacturers 10 timezones away
> * Chip design pays better than software in many cases
You are comparing the narrowest niche of hardware engineering to the broad software profession overall?
> (US and UK included; but excluding comparisons to Finance/FinTech software, unless you happen to be in hardware for those two sectors)
How many hardware jobs are in the finance/fintech sector? I've never met anyone working on hardware in finance, nor have I seen a job posting for one. And I doubt the highest paid hardware engineer is making remotely close to what the highest paid software engineer in finance is making.
> but I think it is more a symptom than it is a cause.
Or parents, industry professionals, college professors/advisors, etc advise students on future job prospects and students choose accordingly.
> * The lack of open source hardware tools, workflows, high-quality examples, relative to the gross abundance of open source software, doesn’t help the situation, but I think it is more a symptom than it is a cause.
To this, I would point to librelane/yosys/TinyTapeout/waferspace and say there are quite a few opportunities to learn, and there are OSS initiatives trying to _do stuff_ in this field. I wouldn't know how it applies to the wider industry, but the ecosystem definitely piqued my interest. I do write quite a bit of embedded systems code in my day to day though, so I've got a rough idea of what is in a chip. Would love to have the time to dive deeper.
I was going to see if I could quote some job postings from my employer to compare this, and then discovered that even the intranet jobs board does not have salary ranges posted. Sigh. Going to have to feed that back to someone.
> Software engineers make great digital logic verification engineers. They can also gradually be trained to do design too. There are significant and valuable skill and knowledge crossovers.
> Software engineers lack the knowledge to learn analogue design / verification, and there’s little to no knowledge-crossover.
Yes. These are much more specific skills than HN expects: you need an EE degree or equivalent to do analogue IC design, while you do not for software.
However I think the very specific-ness is a problem. If you train yourself in React you might not have the highest possible salary but you'll never be short of job postings. There are really not a lot of analogue designers, they have fairly low turnover, and you would need to work in specific locations. If the industry contracts you are in trouble.
I think there are two separate areas of concern here: hardware, and computation. I strongly believe that a Computer Science program that only includes variants of the Von Neumann model of computation is severely lacking. While it's interesting to think about Turing Machines and Church numerals, etc., the practical use of FPGAs and other non-CPU based logic should definitely be part of the modern CS education.
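As a sketch of what that could look like in a modern course: the design below is written in Amaranth, a Python-embedded HDL (assuming the `amaranth` package is installed; the blinky module itself is a made-up example), and it emits synthesizable Verilog for an FPGA flow:

```python
# A free-running counter whose top bit drives an LED: ordinary Python
# classes and operators, but describing registers and wires, not steps.
from amaranth import Elaboratable, Module, Signal
from amaranth.back import verilog

class Blinky(Elaboratable):
    def __init__(self, width=24):
        self.counter = Signal(width)  # a register, not a variable
        self.led = Signal()

    def elaborate(self, platform):
        m = Module()
        m.d.sync += self.counter.eq(self.counter + 1)  # advances every clock
        m.d.comb += self.led.eq(self.counter[-1])      # pure combinational wiring
        return m

top = Blinky()
print(verilog.convert(top, ports=[top.led]))  # synthesizable Verilog out
```

Nothing in it needs device physics; the conceptual jump is from sequential execution to always-active parallel logic, which is teachable to CS students.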
The vagaries of analog electronics, RF, noise, and the rest is another matter. While it's possible that a CS graduate might have a hint of how much they don't know, it's unreasonable to expect them to cover that territory as well.
Simple example: did you know that it's possible for 2 otherwise identical resistors to have more than 20 dB difference in their noise generation? [1] I've been messing with electronics and ham radio for 50+ years, and it was news to me. I'm not sure even an EE graduate would be aware of that.
[1] https://www.youtube.com/watch?v=omn_Lh0MLA4&t=445s
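For readers wondering how that spread is possible: a resistor's thermal (Johnson) noise is fixed by physics, but its excess (current) noise under DC bias depends heavily on construction, and is specified as a noise index, NI = 20·log10(µV of noise per V of DC drop, per frequency decade). A rough sketch with typical textbook NI ranges (illustrative numbers, not taken from the linked video):

```python
# Compare two 100 kohm resistors: identical resistance and thermal noise,
# wildly different excess (current) noise due to construction.
import math

k_B, T, R = 1.380649e-23, 300.0, 100e3
V_dc = 10.0  # DC voltage across the resistor

def excess_noise_uV(ni_db, v_dc, decades=1):
    # NI dB -> uV of noise per volt of DC drop, scaled by bandwidth decades
    return v_dc * 10 ** (ni_db / 20) * math.sqrt(decades)

thermal = math.sqrt(4 * k_B * T * R)  # V/sqrt(Hz), identical for both parts
print(f"thermal floor: {thermal * 1e9:.1f} nV/sqrt(Hz)")

for name, ni_db in [("carbon composition", +10), ("metal film", -30)]:
    print(f"{name:18s} NI {ni_db:+3d} dB -> {excess_noise_uV(ni_db, V_dc):7.2f} uV/decade")
# A 40 dB spread in noise index is a 100x voltage ratio between
# two resistors that measure exactly the same on an ohmmeter.
```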
> the practical use of FPGAs and other non-CPU based logic should definitely be part of the modern CS education
they even made us use them in practical labs, and connect them up to an ARM chip
I have trouble believing there's a talent shortage in the chip industry. Lots of ECE grads I know never really found jobs and moved on to other things (including SWE). Others took major detours to eventually get jobs at places like Intel.
RF design, radars, etc... are more an art than a science, in many aspects.
I would expect a Physics-trained student to be more adaptable to that type of EE work than a CS student...
Not all hardware is digital but we can solve most of the hard parts in the digital domain. There's no reason to do everything analog just because it starts or ends that way.
A lot of RF design has been reduced to arrays of dumb antennas that are wired together in software. Starlink is probably the best example of this right now.
You still need people who can build the analog systems and engineer the nasty parts of the signal chain, but you don't need a lot of them.
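A sketch of the "wired together in software" idea: steering a uniform linear array is just applying a per-element phase ramp and summing, a few lines of NumPy in the idealized case (no mutual coupling, no phase quantization; the 16-element, half-wavelength geometry is a made-up example):

```python
# Delay-and-sum beamforming: multiply each antenna element by a phase
# ramp to steer the main lobe, then sum the elements coherently.
import numpy as np

c, f = 3e8, 12e9          # speed of light; 12 GHz carrier (Ku-band-ish)
lam = c / f
n = np.arange(16)         # 16-element uniform linear array
d = lam / 2               # half-wavelength element spacing
k = 2 * np.pi / lam

def array_gain(steer_deg, arrive_deg):
    """Normalized gain when steered one way and a plane wave arrives another."""
    weights = np.exp(-1j * k * d * n * np.sin(np.radians(steer_deg)))
    wavefront = np.exp(1j * k * d * n * np.sin(np.radians(arrive_deg)))
    return abs(np.sum(weights * wavefront)) / len(n)

print(array_gain(30, 30))   # ~1.0: beam steered onto the source
print(array_gain(30, -10))  # small: source well outside the main lobe
```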
The subheading to this article seems a little extreme: "To fill the talent gap, CS majors could be taught to design hardware, and the EE curriculum could be adapted or even shortened."
The article is more in the area of chip design and verification than PCB hardware, so I kinda understand where it's coming from.
Weird article, came to it hoping to see if I could train into a new job. But instead it went on and on about AI for almost the entire piece. Never learned what classes I might need to take or what the job prospects are.
"Electrical and Computer Engineering" (ECE) departments already exist and already have such a major: "Computer Engineering".
As a computer engineer I usually copy reference schematics and board layouts from the datasheets the vendors offer. 95% of my hardware problems can be solved with them.
Learning KiCad took me a few evenings with YT videos (greetings to Phil!).
Soldering needs much more practice. Soldering QFN with a stencil, paste, and oven (or only a pre-heater) can only be learned by failing many times.
Having a huge stock of good components (sorted nicely with PartsDB!) lowers the barrier for starting projects dramatically.
But as always: the better your gear gets - the more fun it becomes.
Even as a professional EE working on high speed digital and mixed signal designs (smartphones and motherboards), I used reference designs all the time, for almost every major part in a design. We had to rip up the schematics to fit our needs and follow manufacturer routing guidelines rather than copying the layout wholesale, but unless simulations told us otherwise we followed them religiously. When I started I was surprised how much of the industry is just doing the tedious work of footprint verification and PCB routing after copying existing designs and using calculators like the Saturn toolkit.
The exception was cutting edge motherboards that had to be released alongside a new Intel chipset but that project had at least a dozen engineers working in shifts.
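For context on the "calculators" mentioned above: most of them wrap closed-form approximations. Here is a sketch of the classic IPC-2141 microstrip impedance estimate (the formula is real, but treat the numbers as first-pass estimates; production work uses field solvers and the manufacturer's stackup data):

```python
# IPC-2141 closed-form estimate of microstrip characteristic impedance.
# Validity is roughly 0.1 < w/h < 2.0 and er < 15; outside that range,
# reach for a field solver instead of this formula.
import math

def microstrip_z0(w, h, t, er):
    """w: trace width, h: dielectric height, t: copper thickness (same units)."""
    return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h / (0.8 * w + t))

# e.g. a 0.30 mm trace over 0.20 mm of FR-4 with 35 um (1 oz) copper
print(f"Z0 = {microstrip_z0(0.30, 0.20, 0.035, 4.5):.1f} ohms")  # ~52 ohms
```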
Why would they? Pay is just much lower, despite the fact that there's way more responsibility. I personally know more people who switched from hardware to software than vice versa.
I'd do anything short of murder to get out of software. If I could find a career that paid enough to live somewhere nice and didn't have the horrible working conditions that software does (stack rank, fake agile, unrealistic deadlines, stack rank, etc.) I'd do it in a heartbeat.
I have an MSc in CS. While I spent half of my career writing device drivers, the other half was doing computer architecture. You could say I had a foot on the low-level software side, and the other foot on the high-level hardware side. I found them to be two sides of the same coin. Understanding how hardware folks see the world took a few years, but it was very doable.
My biggest gripe with the semiconductor industry as a career, compared to software, is twofold.
First, it is very concentrated. If you want to make good money there are only a handful of potential employers, and thus only a handful of cities/neighborhoods where you will have to live; remote work is theoretically possible but not all employers make it effective. I found this the most frustrating. The upside is that people know this and thus tend to stay at the same employer for a long time, so you get to learn from people with a deep understanding of the product, and people are mindful to keep a pleasant work environment.
Second, the pay isn't as good at the top end. If you have FAANG-level skills, you will typically do much better financially there than in the semiconductor industry, with the notable exception of Nvidia for the past decade or so.
Obviously. Hardware designers absolutely love to think that hardware design is totally different from software design and only they have the skills, but in reality it's barely different. Stuff runs in parallel. You occasionally have to know about really hardware things like timing and metastability. But the Venn diagram of hardware/software design skills is pretty much two identical circles.
The reason for the "talent shortage" (aka "talent more expensive than we'd like") is really just because hardware design is a niche field that most people a) don't need to do, b) can't access because almost all the tools are proprietary and c) can't afford, outside of tiny FPGAs.
If Intel or AMD ever release a CPU range that comes with an eFPGA as standard that's fully documented with free tooling then you'll suddenly see a lot more talent appear as if by magic.
> The reason for the "talent shortage" (aka "talent more expensive than we'd like") is really just because hardware design is a niche field that most people a) don't need to do, b) can't access because almost all the tools are proprietary and c) can't afford, outside of tiny FPGAs.
Mostly B. Even if you work in a company that does both, you'll rarely get a chance to touch the hardware as a software developer, because all the EDA tools are seat-licensed, making it an expensive gamble to let someone who doesn't have domain experience take a crack at it. If you work at a Verilog shop you can sneak in Verilator, but the digital designers tend to push back in favor of vendor tools.
In fact I'll go further: in my experience people with a software background make much better hardware designers than people with an EE background, because they are aware of modern software best practices. Many hardware designers are happy to hack whatever together with duct tape and glue. As a result most of the hardware industry is decades behind the software industry in many ways, e.g. still relying on hacky Perl and TCL scripts to cobble things together.
The notable exceptions are:
* Formal verification, which is very widely used in hardware and barely used in software (not software's fault really - there are good reasons for it). A toy equivalence-check sketch follows after this list.
* What the software guys now call "deterministic system testing", which is just called "testing" in the hardware world because that's how it has always been done.
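To illustrate the formal verification bullet above: hardware equivalence checking asks a solver to prove that two descriptions agree for *all* inputs, rather than for sampled ones. A toy version using the Z3 SMT solver (the adder identity is a stand-in; real equivalence checkers work on RTL netlists, not Python expressions):

```python
# Toy equivalence check: prove a bit-twiddling implementation equal to
# its specification for every possible input, via an SMT solver.
from z3 import BitVec, prove

a = BitVec("a", 8)
b = BitVec("b", 8)

spec = a + b                     # specification: plain addition
impl = (a ^ b) + ((a & b) << 1)  # implementation: carry-save identity

prove(spec == impl)  # prints "proved": holds for all 65536 input pairs
```

The proof is exhaustive in effect without enumerating anything, which is why this style of checking scales to datapaths where simulation alone cannot.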
>> Stuff runs in parallel. You occasionally have to know about really hardware things like timing and metastability. But the Venn diagram of hardware/software design skills is pretty much two identical circles.
I don't know your background, but this feels like it's from someone who hasn't worked on both aspects of a non-trivial industry project. The thing is, software spans a huge range: web FE/BE, GUI, database, networking, OS, compiler, HPC, embedded, etc. Not all of them provide the background needed to be a good HW designer. Sure, you can design HW as if you are writing software, but it won't be production worthy - not when you are pushing the boundaries.
My work straddles both HW architecture and SW. I design processors and custom ISAs optimized for SW application algorithms, and ensure the micro-architecture implementation on the HW side is optimized to meet the PPA targets. I sit at the intersection of HW, SW, and verification. People like me are rare, not just in my company but in the industry. Things fall through the gap if you don't have someone to bridge it, and then you have a sub-optimal design.
I don't deny that SW people can learn HW design; there is nothing magical after all, just hard work and practice. But to say that the Venn diagram is two identical circles is plain wrong. The cognitive load of shuttling up and down the two HW/SW stacks is a lot more than either alone.
Which doesn't pay as well as jobs in software do, unfortunately.
UIUC CS grad from the late 80s. CS students had to take a track of electrical engineering courses. Physics E&M, intro EE, digital circuits, microprocessor/ALU design, microprocessor interfacing.... It paid off immensely in my embedded development career.
I'm guessing this isn't part of most curricula anymore?
At UC Berkeley in the early-mid 90s, I think I had two digital design courses. The first was low-level basics like understanding logic gates, flip-flops, Gray coding, PROM, ALUs, multiplexers, etc., with a physical project using 7400-series chips on a breadboard. The second was the whole 32-bit MIPS/SPIM pipelined CPU design and simulation project based on the Patterson and Hennessy textbook.
But, I seem to recall there were ways to bypass most hardware background knowledge for a CS degree. You had to do intro math and physics that did classical mechanics, but you could stop short of most of the electromagnetic stuff or multivariate calculus. You could get your breadth credits in other areas like statistics, philosophy, and biology. I think you could also bypass digital design with mix of other CS intro courses like algorithms, operating systems, compilers, graphics, database systems, and maybe AI?
> I'm guessing this isn't part of most curricula anymore
My sibling is a CS@UIUC grad, and they, as well as the CS+X majors, were still required to do that.
In other universities such as Cal it's a different story. Systems programming and computer architecture course requirements have either been significantly reduced or eliminated entirely in CS programs over the past decade.
I've documented this change before on HN [0][1][2]. The CS major has been increasingly deskilled in the US.
[0] - https://news.ycombinator.com/item?id=45413516
[1] - https://news.ycombinator.com/item?id=45404647
[2] - https://news.ycombinator.com/item?id=45397327
I had to take computer architecture. We made a 4 bit CPU... or maybe it was 8 bit. I can't remember. But it was all in a software breadboard simulator thing. LogicWorks.
Definitely no ALU design on the curriculum, no interfacing or busses, very little physics. They don't even put a multimeter in your hand.
Informatics is considered a branch of logic. If you want to know how to design a computer, you should have studied EE, is their thinking.
I wasn't taught directly (and still don't know what I'm doing), but I've had a lot of fun learning about retro hardware design as a software engineer. I've made a few of my own reverse-engineered designs, trying to synthesize how the real designers would have built the chip at the time, and ported others for the Analogue Pocket and MiSTer project.
Here's an example of my implementation of the original Tamagotchi: https://news.ycombinator.com/item?id=45737872 (https://github.com/agg23/fpga-tamagotchi)
I had written a whole big thing that could be summarized as "yes, of course" but then I read the article and realized that it is very specifically about designing silicon, not devices.
I understand that it makes sense for a blog called Semiconductor Engineering to be focused on semiconductor engineering, but I was caught off guard because I have been working on the reasonable assumption that "hardware designer" could be someone who... designs hardware, as in devices containing PCBs.
In the same way that not all software developers want to build libraries and/or compilers, surely not all hardware designers want to get hired at [big chip company] to design chips.
It is funny how "hardware design" is commonly used in the chip industry to describe what semiconductor design/verification engineers do. And then there's PCB designers using those same chips in their _hardware designs_.
Also there's computer architects being like "So, are we hardware design? Interface design? Software? Something else?"...
Meanwhile, all the mechanical engineers are looking from the outside saying "The slightest scratch and your 'hard'ware is dead. Not so 'hard' really, eh?" ;) ;)
Every sector has its nomenclature and sometimes sectors bump into each other. SemiEngineering is very much in the chip design space.
I have a degree in EE (2016) and am doing mostly ML engineering with a considerable amount of SWE tasks in my day-to-day.
Of my graduating class, very few are designing hardware. Most are writing code in one form or another. There were very few jobs available in EE that didn't underpay and lock you into an antiquated skillset, whether in renewables/MRI/nuclear/control etc.
We had enough exposure to emerging growth areas (computer vision, reinforcement learning, GPUs) to learn useful skills, and those all had free and open source systems to study after graduation, unlike chip design.
The company sponsoring this article is a contributor to that status quo. The complete lack of grassroots support for custom chips in North America, including a dearth of open source design tools or a community around them, has made it a complete non-starter for upskilling. Nobody graduates from an EE undergrad with real capability in the chip design field, so unless you did graduate studies, you probably just ended up learning more and more software skills.
But the relentless off-shoring of hardware manufacturing is likely the ultimate cause. These days, most interesting EE roles I see require fluency in Mandarin.
I'm a hardware designer. An EE. But over the last umpteen years I've gradually switched over to software because that's where I was needed. What I've found is that I became a very good software programmer but I still lack all the fundamentals of software engineering. There are things I won't or can't use because it would require too much study for me to get good at it or even understand it.
I would bet that a CS guy would have similar problems switching to hardware engineering.
Curious as to what that is?
I lived in both worlds (hardware/software) throughout my career. In school, I learned (in order): Analog electronics (including RF), Digital electronics, Microprocessors, Software, Systems. I've always thought that it's strange how few software people know hardware, and vice versa. In the software domain, when I began referencing hardware elements while explaining something, the software audience would usually just glaze over and act like they were incapable of understanding. Same goes for the hardware people when I would reference software elements.
I learned Ada sometime around 1991. Counting assembly for various platforms, I had already learned about a dozen other languages by then, and would later learn many more.
Sometime around 2000 I learned VHDL. In all of the material (two textbooks and numerous handouts) there was no mention of the obvious similarities to Ada. I wish somebody had just produced a textbook describing the additional features and nomenclatures that VHDL added to Ada -- That would have made learning it even easier. The obvious reason that nobody had done that is that I was among a very small minority of hardware people who already knew Ada, and it just wouldn't be useful to most people.
In all of my work, but especially in systems integration work, I've found that my knowledge of multiple domains has really helped me outperform my peers. Having an understanding of what the computer is doing at the machine level, as well as what the software is doing (or trying to do) can make the integration work easy.
More on-topic:
I think it would be a great improvement to add some basic hardware elements to CS software courses, and to add some basic CS elements to EE courses. It would benefit everyone.
> In all of the material (two textbooks and numerous handouts) there was no mention of the obvious similarities to Ada
Really? That's kind of the point of VHDL, isn't it? (vs. Verilog's unholy combination of C-like syntax with begin/end blocks, etc.)
VHDL also inherits Ada's module style, designed to have different implementations of the same thing (and verbosity, where it seems like you often have to say the same thing repeatedly, for better or for worse - more type checking at the expense of more typing at the keyboard.)
>> "Either we hire good CS people who have the basic understanding of EE, and we train them to become good engineers, or we hire good engineers who are good in CS, and we try to upskill them on the CS side."
The former (CS -> EE) is much less likely to happen at a large scale than the latter (EE -> CS). It is much easier to teach EEs to become (albeit often bad) software engineers than to teach CS students to be good engineers.
Also, the former (CS -> EE) will not happen in academia because of (1) turf wars, and (2) CS faculty not having any understanding of, or interest in, electronics/hardware/engineering.
I once proposed to teach an IoT class in the CS department of a major university in the US; the proposal basically fell on deaf ears.
Hardware is artificially underpaid work, good positions are sparse in the US, and generally most engineers end up in niche coding environments.
Most people that land a successful long career also refuse to solve some clown firm's ephemeral problems at a loss. The trend of externalizing costs onto prospective employees starts to fail in difficult fields requiring actual domain talent with $3.7m-per-seat equipment. Regulatory capture also fails in advanced areas, as large firms regress into state-sponsored thievery instead.
Advice to students that is funny and accurate =3
"Mike Monteiro: F*ck You, Pay Me"
https://www.youtube.com/watch?v=jVkLVRt6c1U
digital circuit design strikes me as a risky gambit for a career, given that almost everyone i've bumped into in that industry was invariably not actually doing any design, but rather was tasked with writing test cases and verifying the functionality of some specific logical block.
tests are of course very important, but the fact of the matter is, bright smart and arrogant young engineers-to-be are very eager to show everyone how much better their version of the 'thing' is, and desperately want to write their version of the thing: they don't want to verify someone else's version of the thing.
if we're being honest, how many people do you really need to do the design of some hardware feature? realistically the design can be done by one person.
so you might have one lead designer who delegates each block to 10 guys, and everything else is basically 'monkey work': writing up the state machine logic, testing it, and hooking it all up.
and now lets count the number of companies that can put up the capital for tape-out: amd, intel, arm, nvidia, meta, aws, google chips, apple, and lets say plus 50 for fintechs, startups, and other 'smaller' orgs.
so if you want to do design, you might be competing for... lets say 3 lead designers per org on avg, 3 * 50 = 150 silicon design spots for the entire globe. to add, a resource in such scarce supply will no doubt be heavily guarded by its occupants.
i did this calculation back when i was still in uni. i'll never know if it paid off, or if it was even rooted in logic, but i remember thinking to myself back then: "no way in hell am i gonna let these old guys pigeonhole me into doing monkey work with a promise of future design opportunities." arrogant, yes, but i can't say i regret my decision judging from the anecdotes i get from friends in the hardware world.
> and now lets count the number of companies that can put up the capital for tape-out: amd, intel, arm, nvidia, meta, aws, google chips, apple, and lets say plus 50 for fintechs, startups, and other 'smaller' orgs.
And basically anyone who has a job in tech can afford a tapeout [1]: someone who just pulled their salary out of the ATM has enough money to do a tapeout with the cash in their hand [2], and Chinese students can do it basically for free [3]. Of course, for _some_ scopes of tapeout. These are older nodes and you have limited area. But you might not need anything fancy for your design.
The rest of the post, I think, has a bunch of misunderstandings or wrong facts, but I don't work in the field (ish), so I might be as clueless as you, and I need to get back to my day job, so I won't try countering you just yet.
[1] https://wafer.space/
[2] https://app.tinytapeout.com/calculator?tiles=1&pcbs=1
[3] https://ysyx.oscc.cc/docs/en/
i think that, for digital design to be interesting, the cost of entry must be lowered by probably orders upon orders of magnitude.
the google skywater pdk thing, whatever it is (or was?), did produce a great deal of hobbyist designs and proved that there really isn't anything special about rtl - in fact, it's really quite monotonous and boring.
which is a good attitude to have, really. lots of hobbyist designs got cranked out quickly on what, as i understood, was a very obsolete pdk from two decades ago.
but it's fundamentally still too expensive and too limited. open source software 'blew up' because
1. the cost of entry was free...
2. ...for state of the art tools.
it's not enough to be free, or open source. it also has to be competitive. llvm/gcc won the compiler world because they blew the codegen of proprietary compilers out of the water; of course, being open source, it became a positive feedback loop: lots of expert eyeballs -> better compiler -> more experts look at it -> better compiler -> ...
for digital design to become interesting, you can't trick the kids: they want the same tech the 'big boys' are using. so, what scope is there to make it economical for someone like Intel to carve out some space for a no-strings-attached digital design lottery?
i get the impression that, unlike for most manufacturing processes, the costs of silicon digital electronics increases every year, and the amortisation schedule becomes bigger, not smaller.
so if anything, it seems that the more high tech silicon manufacturing becomes, the smaller the pool of players (who have the ever-increasing capital expenditure necessary) becomes, which should indicate that the opportunities for digital design work are actually going to be shrinking as time goes on.
Is the idea here that the code-generation apocalypse will leave us with a huge surplus of software folks? Enabling software people to go over to hardware seems to be putting the cart before the horse, otherwise.
Hardware people go to software because it is lower-stress and can pay better (well, at least you have a higher chance of getting rich, start-ups and all that).
After having reviewed multiple RISC-V core generators, I suspect it is easier to teach a computer science student to design hardware than it is to teach an electrical engineering student to design software.
(But I am not saying either task is easy.)
Hilarious to see Cadence and Synopsys in this article. They are arguably the cause. The complete lack of open source tooling and their aggressive tooling prices are the exact reason this ecosystem continues to be an absolute dumpster fire.
I used Vivado (from Xilinx) a bit during my undergrad in computer engineering and was constantly surprised at how much of a complete disaster the toolchain was. Crashes that would erase all your work. Strange errors.
I briefly worked at a few hardware companies and I was always taken aback by the poor state of the tooling, which was highly correlated with the license terms dictated by the EDA tools. Software dev seemed much more interesting and portable. Working in hardware meant you would almost always be searching between Intel, Arm, AMD, and maybe Nvidia if you were a rockstar.
Software by comparison offered plentiful opportunities and a skill set that could be used at an insurance firm or any of the Fortune 100s. I've always loved hardware, but the opaque datasheets and IP rules kill my interest every time.
Also, I would argue software devs make better hardware engineers. Look at Oxide Computer. They have fixed bugs in AMD's hardware datasheets because of their insane attention to detail. Software has eaten the world, and EEs should not be writing the software that brings up UEFI. We would have much more powerful hardware systems if we were able to shine a light on the inner workings of most hardware.
And CS folks should design hardware because they understand concurrency better?!
I know you said it in jest, but there is a strong justification for cross-feeding the two disciplines - on one side, we might get hardware that’s easier to program and, on the other end, we might get software that’s better tuned to the hardware it runs on.
Working in EE post-BSc in EE from '99-'06, it's pretty much CS + I know how to breadboard and solder if absolutely necessary.
A whole lot of my coursework could be described as UML diagramming but using glyphs for resistors and ground.
Robots handle much of the assembly work these days. Most of the human work is jotting down arbitrary notation to represent a loop or when to cache state (use a capacitor).
Software engineers have come up with a whole lot of euphemistic notations for "store this value and transform it when these signals/events occur". It's more of a psychosis that long ago quit serving humanity and became a fetish for screen addicts.
My degree is in computer science but I studied at the faculty of electrical engineering.
My courses didn't get into the details of semiconductor design (particularly manufacturing), but we had one on the physical principles behind this whole thing - bandgaps and all.
We also had to design analog circuits using the Ebers-Moll transistor model, so pretty basic, but still not exactly linear.
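A sketch of the "basic but not exactly linear" point: under the Ebers-Moll model the collector current is exponential in the base-emitter voltage, so every extra ~60 mV multiplies the current by about 10 (parameter values below are illustrative, not from any particular device):

```python
# Ebers-Moll, forward-active approximation: I_C = I_S * (exp(V_BE/V_T) - 1).
# I_S is an illustrative saturation current; real parts specify their own.
import math

I_S = 1e-14    # saturation current, A (illustrative)
V_T = 0.02585  # thermal voltage at ~300 K, V

def i_c(v_be):
    return I_S * (math.exp(v_be / V_T) - 1)

for v_be in (0.60, 0.66, 0.72):
    print(f"V_BE = {v_be:.2f} V -> I_C = {i_c(v_be) * 1e3:9.3f} mA")
# Each +60 mV step multiplies I_C by roughly 10x: there is no "linear
# region" until you linearize around a bias point yourself.
```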
Overall these are very different fields but at the end of the day they both have models and systems, so you could make a student of one of them learn the other and vice versa.
It just has to be worth the effort.