The author likes to point out, in a patronising tone, that online developer career advice (blogs, rants, etc.) isn't applicable to low-level programming roles, but they're right nonetheless. Then again, embedded people will say the same about systems folks, TCS researchers about all types of SWEs, and so on.
However, they seem to be falling into the same pitfall as the frontend rockstar programmer-influencers: omitting, and even undermining, the importance of a formal computer science education.
Especially if we're talking about getting into non-web-development/IT roles, it's quite a lofty goal to expect mastery of complex software without some structured curriculum that helps you obtain the fundamentals. And no, Coursera unfortunately won't make the cut most of the time.
Get a degree in Comp. Sci/Comp. Eng, or at least some STEM degree, and then a Master's. I understand the US-centric view of tertiary education as a risk due to the large capital investment, but in the rest of the world, where it's affordable or free, most would hesitate to hire self-taught developers, even for web development positions.
The degree may not be enough for systems programming competency, and this is where the article's advice about open source contribution comes into play, but I'd say the degree is kind of necessary unless you're some bunniefoo type of brain or idk.
I'm about 20 years into my career as a self-taught software engineer. I mostly work with companies that only hire out of MIT, Cornell, Stanford, Waterloo, etc.
By and large, my experience with these graduates is that their awareness of the field does not extend much beyond what they learned in school and whatever the current hype technology is (today it's ML). If you get enormously lucky, they worked or interned somewhere special and were able to expand their skill set.
Consistently, the absolute best developers I've worked with tend to be self taught and especially people with Computer Music degrees. It turns out that learning DSP is a skill that pays huge dividends elsewhere in this field. I've also worked with CompSci Ph.D.s that can't build their way out of a wet paper bag.
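For what it's worth, the DSP-to-software transfer is concrete. A one-pole low-pass filter, about the first thing any Computer Music curriculum teaches, is the same exponential moving average you end up reaching for when smoothing latency metrics or load estimates. A minimal sketch in Python (the function name and sample values are mine, just for illustration):

```python
def one_pole_lowpass(samples, alpha):
    """Classic DSP one-pole IIR filter: y[n] = alpha*x[n] + (1-alpha)*y[n-1].

    Outside audio, the exact same recurrence is the exponential moving
    average used to smooth noisy latency metrics and load estimates.
    Assumes a non-empty input sequence.
    """
    smoothed = []
    y = samples[0]  # seed with the first sample to avoid a startup ramp
    for x in samples:
        y = alpha * x + (1 - alpha) * y
        smoothed.append(y)
    return smoothed

# A step from 0 to 10 settles toward the new level instead of jumping:
print(one_pole_lowpass([0, 10, 10, 10], 0.5))  # [0.0, 5.0, 7.5, 8.75]
```

Whether you call it a filter cutoff or a smoothing factor, tuning `alpha` is the same trade-off between responsiveness and noise rejection in both worlds.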
As far as the degree programs go though, Waterloo is excellent. I would even say above the rest. Even MIT. I am consistently impressed with their graduates that I've had the privilege to work with.
And as for my own journey, I have my weak spots. I also tend to read more papers and technical books than most of my peers by a significant margin. I think that's my edge.
I really disagree with your point about getting a degree.
Don't get me wrong, I don't personally think a CS degree (or an engineering one) is useless. I hold a master's in chemistry, and if I could go back and make it a CS one, it would've been better all things considered.
But I think the following:
- Getting a degree says very little about how much a person learned or absorbed. So much depends on both the quality of teaching (often low even in Ivy League colleges, where you're taught by assistants, ugh) and the learner. I've met many engineers with degrees obtained cum laude who just didn't have the fundamentals. They would know everything on the day of the exam and have no clue about the topics just a few months later. They got the degree to find a job, not because they cared. On the other hand, some of the most impressive devs I know were self-taught. Even among core Linux contributors, plenty do not have any degree.
- The internet nowadays has an endless amount of resources; nothing can stop a motivated and disciplined person from getting good at non-natural sciences such as maths and CS.
> Coursera unfortunately won't make the cut most of the time.
This is a truly bizarre take... Coursera offers many of the same courses you'd get in a BCS, or sometimes even an MCS, degree.
Add to this that most colleges are downright awful, with hilariously incompetent professors/TAs... and virtually none really prepares students for working in the programming industry (nobody teaches version control, project management, infrastructure, editing tools, quality control). There's very little useful knowledge a programmer can get from going to college.
In my opinion, it's better to pick up a few books on the theoretical aspects a couple of years after you have enough practical experience programming in an industrial setting. Coursera would help here. College would be hugely problematic: it's a huge time sink (inconvenient, rigid times for lectures, workshops, etc.), hugely expensive, and, in most cases, outright lower quality.
A degree is good for showing something to HR, especially if you have nothing else to show... other than that? Not so much.
I think it's worth noting that some of the most celebrated programmers have been self-taught – think Aaron Swartz, John Carmack, George Hotz and so on. Among leading academics also, it's not unusual to find those without formal undergraduate degrees in CS. Many come from mathematics backgrounds, including several Turing Award recipients.
In the world of software development, degrees have often held less sway, and I think this trend will only become more pronounced. Too much focus on CS degrees, I feel, does not account for the inherent nature of the role, and I say this as a CS grad from a top school.
> in the rest of the world where it's affordable/free, most would hesitate to hire self-taught developers, even in web development positions.
I live in a country with free education, and nobody cares about a developer's education if said developer has some experience.
I myself dropped out of university in my last year (it was a mistake at the time, but whatever), and the only real issue it has caused in my life is that I can't easily get a work permit in Europe (and many other countries) if I decide to immigrate. And even then, it's possible to compensate for the lack of a degree with work experience; I just don't know exactly how that works, because most of my work so far has been as a kind of "self-employed freelancer", so I have no idea how I would prove my work experience to a visa office.
That said, getting a proper education is not a bad thing; it's just not essential, at least in my experience.
> US-centric view of tertiary education being a risk due to large capital investment
I can't help but think the European perception is a little exaggerated here. Yes, the live-on-campus-and-party-at-18/19 experience is expensive. But getting a state school degree on a budget, maybe with a year of community college? You're looking at $10-30k.
The bigger risk is opportunity cost since degrees start later and take longer.
If you're not 18 and starting from scratch, you can learn these things without getting a degree. The overall point that I agree with is that you shouldn't neglect fundamental knowledge in the field regardless of whether you have formal CS education.
> And no, Coursera unfortunately won't make the cut most of the time.
Some of the best universities in the world have their computer science materials online (on Coursera and YouTube); I don't get how that's inadequate.
> most would hesitate to hire self-taught developers, even in web development positions.
Not true for experienced candidates. If you have no experience, you rely on your degree to get the first job; and if you don't have a degree, it can be difficult to bootstrap that process but the further you get in time from your education, the less relevant it is.
> Get a degree in Comp. Sci/Comp. Eng or at least some STEM degree and then a Masters. I understand the US-centric view of tertiary education being a risk due to large capital investment but in the rest of the world where it's affordable/free, most would hesitate to hire self-taught developers, even in web development positions.
I can agree that having a BSc would be an asset. Most employers understand that a Master's contributes almost nothing unless your particular niche is something in systems (compilers, for example). However, systems advisors are few and far between these days; you're much more likely to find one of the millions of AI/ML MScs. Maybe for a research position it doesn't matter, as long as you have the paper. Even then, you could argue that for systems programming a BSc is STILL sufficient. My systems programming classes at the graduate level were basic bullshit, nothing I couldn't have gleaned from reading "Modern Operating Systems", the Dragon Book, "Engineering a Compiler", or any number of the MPI books.
Lastly, I have the requisite credentials, and I work side by side with self-taught engineers. Very few companies will refuse to hire a non-degreed person who can show good work. The idea of "work in open source" is dumb and lame. If that's your cup of tea, great! But 99.9% of people don't have enough passion for a single tool and a single code base to slog through it to get "noticed". Better to implement your own toy stuff and just talk about it online if you really care. The author's 10x cyber web developer ninja nonsense is a very 2000s way of thinking about "getting noticed".
My best advice to people is usually to get a BSc so you have a strong foundation in math and computer science, taught in a structured, standard way. I also advise people not to spend more than ~$20,000 getting it. At that point you're usually young enough that the opportunity/time cost is extremely low, and ~$20,000 can be paid off pretty quickly even if it blows up in your face. ABET accreditation is an amazing thing: prestige aside, you're learning the same stuff MIT is. Personally, I paid about $27,000 all-in. Worth every penny at that cost.
Everything else, like all things in life, can be boiled down to "just do it". I also work with a lot of MScs. There's a reason they report up to me and it's not because I have more "pedigree". Study theory all you want. At the end of the day an application/idea/paper/etc implemented is more important than waxing poetic about theory.
My only advice for people starting out their career is to make sure you are surrounded by really good people.
All the good things that have happened in my career have been because the people around me have been amazingly good.
Find those people, and stick with them - they will take you to places that you cannot reach by yourself.
This also means you need to be friends with people and maintain those friendships.
Also, if you ever find yourself feeling like the "smartest in the room" it's time to start worrying. You are not going to learn anything in that situation unless you are a very special kind of person.
> My only advice for people starting out their career is to make sure you are surrounded by really good people.
I keep hearing this, but I have a hard time actually finding such people. There unfortunately aren't events like hackathons around where I live, so although I'm sure there are cool (in a professional context) people around me, I'm never able to figure out how to find them. I have an amazing set of friends but they're entirely into the arts so coming by tech work I can do through them is rare.
> Also, if you ever find yourself feeling like the "smartest in the room" it's time to start worrying. You are not going to learn anything in that situation unless you are a very special kind of person.
I typically sit in an empty room, making me the smartest in the room by default. So I'm curious to know what you mean by "a very special kind of person"; maybe it'll help me learn better.
So this author's example of an ideal young systems programmer looking to get hired was one who provided (free!) significant contributions to their Go client, their database, and _an academic paper_, presumably over the course of many weeks (months?).
I don't disagree with the advice of contributing to open source, but I hope they realize that they are essentially telling readers that contributing unknown amounts of hard, free labor is the best path to getting a job in this subfield of tech.
(Meanwhile, the kid who went to Stanford will just apply and probably get an offer if their DS&A chops are fresh enough.)
(Also, I'm guessing your contributions don't count at some places if your PR/MRs don't get merged, which is the other gnarly thing about open source that complicates this advice. Big open-source projects tend to have big politics behind them, even if they are all entirely online, like working groups, the release process, general disagreements (which will happen; every contributor is smart, and smart people love friendly debate!), etc.)
> So this authors example of an ideal young systems programmer looking to get hired was one that provided (free!) significant contributions to their Golang client, their database and _an academic paper_, presumably over the course of many weeks (months?).
That's not how I see it! I love databases and open source; libsql seemed right for me. I had no idea (or expectations) I would get hired by doing so. By contributing to open source, I am getting to work on some complex codebases where some brilliant people are helping me with my code in pull requests. Isn’t that awesome?
Sure, it may only work for some, and this isn’t the only way to systems programming. It is totally okay if someone wants to take a different path.
Instead of focusing on what’s wrong, why not use this opportunity to help and guide others in a similar boat?
To rephrase: how would you advise someone like me to change domains from backend to systems programming? Someone who doesn't have experience with systems programming and only knows Python and Go professionally. At my (previous) job, I was working on scaling microservices and building a web backend; there was zero room for systems-related stuff.
This is like saying "if you have a lot of money to throw at the problem, you will likely solve it". The kid who went to Stanford could afford to go there in the first place. A lot of things in life are different for kids like this one, hiring notwithstanding. That kind of "kid" probably won't need to go through the same job application processes as someone who could only afford a community college.
Also... comparing the time and money investment you'd have to make to get into Stanford with the investment you'd have to make to contribute to an open-source project, the difference is orders of magnitude, and not in Stanford's favor.
Also, as someone who has actually had to hire for systems programming jobs: I wouldn't care about a candidate's degree, and I would be put off if the candidate tried to force the subject. In my experience, there's very little overlap between systems programming and a college curriculum. So I wouldn't be so sure the "Stanford kid" is getting the job. A college education helps you get into "programmer mill" companies, the big names with large internship/junior education programs. Smaller companies that don't have the capacity to provide on-the-job education, especially in the face of low retention, wouldn't be interested in hiring Ivy League graduates if graduating from an Ivy League university was the only thing going for them.
Feels like the trend lately for a lot of white collar jobs is overt gatekeeping. Don’t know if it’s a natural reaction to economic uncertainty or what, but the gatekeepers always seem to insist that everyone should be 100% committed to their job in their off hours.
Feels like the no-lifers tend to run the show eventually. And I speak as someone who did exactly that in my 20s, and sometimes regret it.
I am against contributing to open source just to show chops, especially if it means contributing to $corp's repo to try and impress $corp into giving you a job. Just contribute (if you want to) to what is fun or makes you curious. As for bang for buck: more applications and more interview question prep > FOSS contributions from a standing start. If you invented Clojure or something, then that's something else. But you probably didn't.
Incoming hot take: Don't become a professional systems developer. Do it as a hobby.
If the goal is to optimize for $$$ earned over one's career, there is more money in being a back-end/distributed apps developer. There are simply more roles (at the higher levels) and companies working on these problems.
The progression for SW engineers is junior -> senior -> staff -> principal -> distinguished. As you get closer to staff, there are very few systems roles left. Some people never make it beyond staff/L6/E6. (One could argue that it's OK to stay at staff for the rest of your career and still make good money. But there's always more to be made.) And the ones that do make it are the ones that spearhead new products/features (direct business impact).
My 2cents, after 15 years as a system dev in the valley.
> Incoming hot take: Don't become a professional systems developer. Do it as a hobby. If the goal is to optimize for $$$ earned over one's career
Deep systems programmers get paid a lot. I'm sure there are other jobs that pay more, but if you make enough and like what you do, you should do what you like.
Also, the higher up the abstraction chain you go, the more froth there is (I'm mainly thinking of frameworks, but this is true more generally), so you have to pay a lot of attention to what's going on. Meanwhile, the deeper you go down the stack, the slower the slew rate and the more time you have to understand the domain deeply. Which, to go back to your "$$$" point, acts as a moat.
I suppose this ultimately stems from the fact that if a program is ugly, everyone hates it, but if a program is insecure, nobody knows, and if they find out, they don't care.
The advice boils down to: Contribute to open source and hope to get noticed.
I was expecting something more actionable tbh, but it does make sense. It is a bit disappointing to read that systems programming jobs are basically out of reach unless you're willing to put in a bunch of free work.
That may be the advice, but what I actually saw was: "Be really, really good at what you do and someone will notice."
That isn't possible for most people. They simply don't have the innate skills and drive that that person does. That person was always going to get a job, because their passion shows through, and they do things.
It doesn't matter if it's open source or not. Having a code portfolio when you apply, even if it's a private portfolio, would always have shown their skills.
They aren’t. The advice in the article isn’t bad, but it’s not the only way. An alternate path I would suggest: get hired straight out of college by a large corporation that employs systems programmers, whether straight into a systems role or not.
> unless you're willing to put in a bunch of free work.
People pay ungodly sums to go to school. And even in countries with "free" education there is an opportunity cost.
Let's not even get started on those that want/need to contribute to academia either as a stepping stone or a career.
From the story, I got that V was looking to "switch jobs".
From personal experience: I once had to work with an open source library at work. I got very familiar with its inner workings, to the point that it wouldn't have been too hard for me to contribute, and I did have a few improvements in mind. I never did, though.
I thought I would give medium one more chance, thinking that maybe it has gotten better, it hasn’t.
The author starts off with a self-aggrandizing tweet from themselves:
> We kernel and database people think this discussion about whether frontend or backend is harder is kinda cute. you're all adorable!
As someone that has a full stack web development job that also enjoys systems programming on the side, both are difficult in different ways and the only reason you could come to the conclusion that one is “harder” (whatever that even means) than the other is from a stance of ignorance. You literally do not know what you do not know.
Then, the author's only career advice is to contribute to open source. That's great, and not exactly earth-shattering, but not really worth the long-winded article it takes to get to that point.
If the author is here: consider not starting your article with a self-quoted rage-bait tweet that adds nothing of value. Maybe your intention is to capture engagement, but it alienates any reader who is a web developer and may have been interested in what you had to say.
It’s very easy to dismiss an entire field of technology that you don’t work in without ever attempting it. It’s different to actually dive in and try to understand why there are problems and what those problems are.
The author's intent here is for young engineers to build signal and reputation. Open source contribution is one way to go about that. Another comment here referenced the importance of a CS degree. There are other ways to signal and build reputation as well.
As general advice for those early in their career, I find this inadequate. These tactics (no offense to the CS peeps) might "get you noticed", but that won't sustain you if you don't back it up. A GitHub account or a cert from your favorite uni might get you in the door somewhere, but you will have set expectations about what people can expect from you once you're in the fold.
Very few have ever kept a job because of their G/H reputation or GPA. The number of times I've seen a hire who was "noticed" that failed in their new role is almost meme-level. It's amazing how quickly credibility and reputation from that get-you-noticed effort can be burned down.
By all means, make logical efforts to get noticed, but that will only take you so far. You will significantly improve that signal/rep when you translate your previous activities to skills, learning, and value as you go forward.
OK, I can see how the tweet at the top of the article can have that effect. In my defence, it was a humorous tweet posted in a week when everybody on Twitter, frontend and backend, was dunking on each other. So I wrote it in a humorous way. Lots of people liked it and found it funny, so I decided to open the article with it.
However taking a step back, if you don't have the context around the tweet, it can certainly be seen this way.
Since it was not my intention to have that tone, I will remove the tweet from the article.
I did some consulting recently for a startup in the data observability space. Performance and scalability were terrible, not to mention other, subtler issues: terrible coupling between services and between classes, terrible abstractions, inconsistent naming in the domain model (e.g. an object that had one name in a set of microservices but was named completely differently in the UI and front-end code).
And in general, the culture of the company was about churning out features fast, at the cost of everything else.
Not surprisingly, as customers started to scale up their deployments, the problems started arising. And yet the CTO refused to even acknowledge my advice, my carefully done benchmarks, my numeric projections. It was completely alien to him. He didn't understand that system software requires a more careful approach than your usual B2C e-commerce MVP.
I was "lucky" to get into system programming on the cusp of Go introduction. I was a refugee from ActionScript world, and was looking for something else to do with my rudimentary programming skills, mostly looking for Python jobs, especially outside Web because going from ActionScript to JavaScript was way too depressing.
So, I applied to a job posting that was looking for "Python programmers who want to learn a different language" (I think, at first at least, Go advertised itself as "almost like Python"). I got a job in the automation department of the company, and from there it was a more predictable and sure way forward.
After a while, I also had to interview people for systems positions. Obviously, I'd have been overcome with joy if I ever had a candidate who knew what our product was and was contributing to its open-source parts... but nothing like that ever happened.
In most cases, the people who came to interviews for systems jobs weren't themselves systems programmers. So my task was to look for the best match among candidates who didn't generally match at all. It's ironic how completely unaware a typical programmer is of how their computer functions, so I was generally fishing for bits of general knowledge about computers, and if a candidate could muster a plausible explanation of how a particular computer component or process could work, that was already a huge plus.
Now, how can you possibly acquire this knowledge? Really, I cannot think of a better way than contributing to an open-source project, if you aren't already employed by a company as a systems programmer... College doesn't teach anything relevant. Even online boot camps rarely have anything relevant. There are some books, but they're mostly woefully outdated and don't generalize well. The knowledge mostly comes from first-hand experience and from following various mailing lists, bug trackers, and the occasional conference.
I don't know if my experience is the norm or wildly different, but when I started out as an electrical/automation engineer, hiring low-level programmers was way closer to hiring "traditional" electrical/electronics engineers, than the modern leetcode SWE ordeal.
Interesting to see this here... I unexpectedly transitioned from full-stack development to paid open source systems programming a bit more than a year ago. It would have been quite a bit more difficult without my contributions to the Rust compiler and ecosystem back when I was at university, and it still feels like I got very lucky.
The author has co-opted the term 'Systems programmer'.
In the original Enterprise computing environment (big iron), a 'System programmer' is one that maintains the operating system, transaction processor (CICS), network stack, etc.
[+] [-] antegamisou|2 years ago|reply
However they seem to be falling into the same pitfalls with the frontend rockstar programmer-influencers; omitting and even undermining the importance of formal computer science education.
Especially if we're talking about getting into non-web development/IT roles, it's quite a lofty goal to expect mastery in programming complex software without some structured curriculum that will help you obtain the fundamentals to do so. And no, Coursera unfortunately won't make the cut most of the time.
Get a degree in Comp. Sci/Comp. Eng or at least some STEM degree and then a Masters. I understand the US-centric view of tertiary education being a risk due to large capital investment but in the rest of the world where it's affordable/free, most would hesitate to hire self-taught developers, even in web development positions.
The degree may not be enough for systems programming competency and this is where advice in the article about open source contribution comes to play, but I'd say it's kind of necessary unless you're some bunniefoo type of brain or idk.
[+] [-] busterarm|2 years ago|reply
By and large my experience with these graduates is that their awareness of the field does not extend much beyond what they learned in school and what the current hype technology is (today it's ML). If you got enormously lucky, they worked or interned somewhere special and were able to expand their skillset.
Consistently, the absolute best developers I've worked with tend to be self taught and especially people with Computer Music degrees. It turns out that learning DSP is a skill that pays huge dividends elsewhere in this field. I've also worked with CompSci Ph.D.s that can't build their way out of a wet paper bag.
As far as the degree programs go though, Waterloo is excellent. I would even say above the rest. Even MIT. I am consistently impressed with their graduates that I've had the privilege to work with.
And as for my own journey, I have my weak spots. I also tend to read more papers and technical books than most of my peers by a significant margin. I think that's my edge.
[+] [-] epolanski|2 years ago|reply
Don't get me wrong, I don't personally think that a CS degree is useless (or an engineering one). I hold a masters in chemistry and if I could go back and make it a cs one it would've been better all things considered.
But I think the following:
- getting a degree says very little about how much a person learned or absorbed. So much depends both on the quality of teaching (often low even in ivy league colleges where you're taught by assistants, ugh) and the learner. I've met many engineers with a degree obtained cum laude and they just didn't have fundamentals. They would know everything the day of the exam and have no clue about the topics just few months later. They got the degree to find a job, not because they cared. On the other hand some of the most impressive devs I know where self taught. Even among core Linux contributors plenty do not have any degree.
- internet has nowadays an endless amount of resources, nothing can stop a motivated and disciplined person from getting good at non-natural sciences such as maths and cs.
[+] [-] crabbone|2 years ago|reply
This is a truly bizarre take... Coursera offers much of the same courses you'd get in a BCS or sometimes also a MCS degree.
Add to this that most colleges are downright awful, with hilariously incompetent professors / TAs... and virtually none really prepares students for working in programming industry, (nobody teaches version control, project management, infrastructure, editing tools, quality control). There's very little a programmer can get from going to college in terms of useful knowledge.
In my opinion, it's better to pick up a few books on some theoretical aspects couple years after you have enough of practical experience with programming in industrial setting. Coursera would help here. College would be hugely problematic due to being a huge time sink (inconvenient and rigid times for lectures, workshops etc.), hugely expensive, and, in most cases, outright lower quality.
Degree is good to show something to HR, especially if you have nothing else to show... other than that? -- not so much.
[+] [-] kubrickslair|2 years ago|reply
In the world of software development, degrees have often held less sway, and I think this trend will only become more pronounced. Too much focus on CS degrees I feel does not account for the inherent nature of the role - and I say this as a CS grad from a top school.
[+] [-] vbezhenar|2 years ago|reply
I live in a country with free education and nobody cares about developer education, if said developer has some experience.
I, myself, dropped university in the last year (it was my mistake at that time, but whatever) and the only real issue it caused in my life is that I can't easily get work permission in Europe (and many other countries) if I would decide to immigrate there. And even then it's possible to compensate lack of degree with work experience, I just don't exactly know how it works, because most of my work so far is kind of "self-employed freelancer", so I have no idea how I would prove my work experience to the visa office.
That said, getting proper education is not a bad thing, it's just not very essential, at least in my experience.
[+] [-] commonlisp94|2 years ago|reply
I can't help but think European perception is a little exaggerated here. Yes, the live on campus and party at age 18/19 experience is expensive. But, getting a state school degree on a budget, and maybe a year of community college? You're looking at 10-30k.
The bigger risk is opportunity cost since degrees start later and take longer.
[+] [-] jstx1|2 years ago|reply
> And no, Coursera unfortunately won't make the cut most of the time.
Some of the best universities in the world have computer science materials online (on Coursera and youtube), I don't get how that's inadequate.
> most would hesitate to hire self-taught developers, even in web development positions.
Not true for experienced candidates. If you have no experience, you rely on your degree to get the first job; and if you don't have a degree, it can be difficult to bootstrap that process but the further you get in time from your education, the less relevant it is.
[+] [-] glommer|2 years ago|reply
it's very important!
[+] [-] zer8k|2 years ago|reply
I can agree that having a BSc would be an asset. Most employers understand that a Master's contributes almost nothing unless your particular niche is something in systems (compilers, for example). However, systems advisors are few and far between these days; you're much more likely to find one of the millions of AI/ML MScs. Maybe for a research position it doesn't matter, as long as you have the paper. Even then, you could argue that for systems programming a BSc is STILL sufficient. My graduate-level systems programming classes were basic bullshit -- nothing I couldn't have gleaned from reading "Modern Operating Systems", the Dragon Book, "Engineering a Compiler", or any number of the MPI books.
Lastly, I have the requisite credentials, and I work side by side with self-taught engineers. Very few companies will refuse to hire a non-degreed person who can show good work. The idea of "work in open source" is dumb and lame. If that's your cup of tea -- great! But 99.9% of people don't have enough passion for a single tool and a single code base to slog through it just to get "noticed". Better to implement your own toy projects and talk about them online if you really care. The author's 10x cyber web developer ninja nonsense is a very 2000s way of thinking about "getting noticed".
My best advice to people is usually to get a BSc so you have a strong foundation in math and computer science, taught in a structured, standard way. I also advise people not to spend more than ~$20,000 getting it. At that point you're usually young enough that the opportunity/time cost is extremely low, and ~$20,000 can be paid off pretty quickly even if it blows up in your face. ABET accreditation is an amazing thing: aside from the "prestige", you're learning the same stuff MIT teaches. Personally, I paid about $27,000 all-in. Worth every penny at that cost.
Everything else, like all things in life, boils down to "just do it". I also work with a lot of MScs. There's a reason they report up to me, and it's not because I have more "pedigree". Study theory all you want; at the end of the day, an application/idea/paper implemented is worth more than waxing poetic about theory.
[+] [-] iamflimflam1|2 years ago|reply
All the good things that have happened in my career have been because the people around me have been amazingly good.
Find those people, and stick with them; they will take you to places that you cannot reach by yourself.
This also means you need to be friends with people and maintain those friendships.
Also, if you ever find yourself feeling like the "smartest in the room" it's time to start worrying. You are not going to learn anything in that situation unless you are a very special kind of person.
[+] [-] notRobot|2 years ago|reply
I keep hearing this, but I have a hard time actually finding such people. There unfortunately aren't events like hackathons around where I live, so although I'm sure there are cool (in a professional context) people around me, I'm never able to figure out how to find them. I have an amazing set of friends but they're entirely into the arts so coming by tech work I can do through them is rare.
[+] [-] iamflimflam1|2 years ago|reply
[+] [-] booboofixer|2 years ago|reply
I typically sit in an empty room, making me the smartest in the room by default. So I'm curious what you mean by "a very special kind of person"; maybe it'll help me learn better.
[+] [-] nunez|2 years ago|reply
I don't disagree with the advice of contributing to open source, but I hope they realize that they are essentially telling readers that contributing unknown amounts of hard, free labor is the best path to getting a job in this subfield of tech.
(Meanwhile, the kid who went to Stanford will just apply and probably get an offer if their DS&A chops are fresh enough.)
(Also, I'm guessing your contributions don't count at some places if your PR/MRs don't get merged, which is the other gnarly thing about open source that complicates this advice. Big open-source projects tend to have big politics behind them, even if they are all entirely online, like working groups, the release process, general disagreements (which will happen; every contributor is smart, and smart people love friendly debate!), etc.)
[+] [-] avinassh|2 years ago|reply
> So this authors example of an ideal young systems programmer looking to get hired was one that provided (free!) significant contributions to their Golang client, their database and _an academic paper_, presumably over the course of many weeks (months?).
That's not how I see it! I love databases and open source; libsql seemed right for me. I had no idea (or expectations) I would get hired by doing so. By contributing to open source, I am getting to work on some complex codebases where some brilliant people are helping me with my code in pull requests. Isn’t that awesome?
Sure, it may only work for some, and this isn’t the only way to systems programming. It is totally okay if someone wants to take a different path.
Instead of focusing on what’s wrong, why not use this opportunity to help and guide others in a similar boat?
To rephrase: how would you advise someone like me to change domains from backend to systems programming? Someone without systems programming experience, who professionally knows only Python and Go. In my previous job I was scaling microservices and building a web backend; there was zero room for systems-related work.
[0] - https://github.com/avinassh
[+] [-] crabbone|2 years ago|reply
Also... comparing the time and money investment needed to get into Stanford with the investment needed to contribute to an open-source project, the numbers are orders of magnitude not in favor of Stanford.
Also, as someone who has actually had to hire for systems programming jobs: I wouldn't care about a candidate's degree and would be put off if the candidate tried to force the subject. In my experience, there's very little overlap between systems programming and the college curriculum. So I wouldn't be so sure the "Stanford kid" is getting the job. A college education helps you get into "programmer mill" companies, the big names with large internship and junior-education programs. Smaller companies that don't have the capacity to provide on-the-job education, especially in the face of low retention, wouldn't be interested in hiring Ivy League graduates if graduating from an Ivy League university was the only thing they had going for them.
[+] [-] mattgreenrocks|2 years ago|reply
Feels like the no-lifers tend to run the show eventually. And I speak as someone who did exactly that in my 20s, and sometimes regret it.
[+] [-] quickthrower2|2 years ago|reply
[+] [-] MichaelZuo|2 years ago|reply
That's pretty much a tautology for a selective anything.
[+] [-] spinlock_t|2 years ago|reply
If the goal is to optimize for $$$ earned over one's career, there is more money in being a back-end/distributed-apps developer. There are simply more roles (at the higher levels) and more companies working on those problems. The progression for SW engineers is junior -> senior -> staff -> principal -> distinguished. As you get closer to staff, there are very few systems roles left. Some people never make it beyond staff/L6/E6. (One could argue that it's OK to stay at staff for the rest of your career and still make good money. But there's always more to be made.) And the ones that do make it are the ones that spearhead new products/features (direct business impact). My 2 cents, after 15 years as a systems dev in the valley.
[+] [-] gumby|2 years ago|reply
Deep systems programmers get paid a lot. I'm sure there are other jobs that pay more, but if you make enough and like what you do, you should do what you like.
Also, the higher up the abstraction chain you go, the more froth there is (I'm mainly thinking of frameworks, but this is true more generally), so you have to pay a lot of attention to what's going on. The deeper you go down the stack, the slower the slew rate and the more time you have to understand the domain deeply. Which, to return to your "$$$" point, acts as a moat.
[+] [-] Buttons840|2 years ago|reply
[+] [-] cinntaile|2 years ago|reply
I was expecting something more actionable tbh, but it does make sense. It is a bit disappointing to read that systems programming jobs are basically out of reach unless you're willing to put in a bunch of free work.
[+] [-] wccrawford|2 years ago|reply
That isn't possible for most people. They simply don't have the innate skills and drive that that person does. That person was always going to get a job, because their passion shows through and they do things.
It doesn't matter if it's open source or not. Having a code portfolio when you apply, even if it's a private portfolio, would always have shown their skills.
[+] [-] uxp100|2 years ago|reply
[+] [-] epcoa|2 years ago|reply
People pay ungodly sums to go to school. And even in countries with "free" education there is an opportunity cost. Let's not even get started on those that want/need to contribute to academia either as a stepping stone or a career.
TANSTAAFL.
[+] [-] _benj|2 years ago|reply
From the story, I got that V was looking to "switch jobs".
From personal experience: I once had to work with an open-source library at work. I got very familiar with its inner workings, to the point that it wouldn't have been too hard for me to contribute, and I did have a few improvements in mind. I never did, though.
[+] [-] _gabe_|2 years ago|reply
The author starts off with a self-aggrandizing tweet of their own:
> We kernel and database people think this discussion about whether frontend or backend is harder is kinda cute. you're all adorable!
As someone that has a full stack web development job that also enjoys systems programming on the side, both are difficult in different ways and the only reason you could come to the conclusion that one is “harder” (whatever that even means) than the other is from a stance of ignorance. You literally do not know what you do not know.
Then, the author's only career advice is to contribute to open source. That's great; not exactly earth-shattering, but not really worth the long-winded article it takes to get there.
If the author is here: consider not starting your article with a self-quoted rage-bait tweet that adds nothing of value. Maybe your intention is to capture engagement, but it alienates any reader who is a web developer and might otherwise have been interested in what you had to say.
It’s very easy to dismiss an entire field of technology that you don’t work in without ever attempting it. It’s different to actually dive in and try to understand why there are problems and what those problems are.
[+] [-] tacker2000|2 years ago|reply
What’s hard, what’s easy? Who knows? Who cares? Are you a better engineer for being a systems guy? Come on, this is just such a petty mindset.
[+] [-] jroseattle|2 years ago|reply
As general advice for those early in their career, I find this inadequate. These tactics (no offense to the CS peeps) might "get you noticed", but that won't sustain you if you don't back it up. A GitHub account or a cert from your favorite uni might get you in the door somewhere, but it also sets expectations about what people can expect from you once you're in the fold.
Very few people have ever kept a job because of their GitHub reputation or GPA. The number of times I've seen a hire who was "noticed" fail in their new role is almost meme-level. It's amazing how quickly the credibility and reputation from that get-you-noticed effort can be burned down.
By all means, make reasonable efforts to get noticed, but that will only take you so far. You significantly improve that signal/rep when you translate your previous activities into skills, learning, and value going forward.
[+] [-] _aaed|2 years ago|reply
[+] [-] glommer|2 years ago|reply
Back in our ScyllaDB days, we hired people from both Linux and LLVM. It's a great way to stand out.
[+] [-] skowalak|2 years ago|reply
[+] [-] bibabaloo|2 years ago|reply
[+] [-] glommer|2 years ago|reply
However, taking a step back: if you don't have the context around the tweet, it can certainly be read that way.
Since it was not my intention to have that tone, I will remove the tweet from the article.
[+] [-] glommer|2 years ago|reply
[+] [-] elzbardico|2 years ago|reply
And in general, the culture of the company was about churning out features fast, at the cost of anything else.
Not surprisingly, as customers started to scale up their deployments, problems arose. And yet the CTO refused to even acknowledge my advice, my carefully done benchmarks, my numeric projections. It was completely alien to him. He didn't understand that system software requires a more careful approach than your usual B2C e-commerce MVP.
[+] [-] crabbone|2 years ago|reply
So I applied to a job posting looking for "Python programmers who want to learn a different language" (I think, at first at least, Go advertised itself as "almost like Python"). I got a job in the company's automation department, and from there it was a more predictable and sure way forward.
After a while I also had to interview people for systems positions. Obviously, I'd have been overcome with joy if I ever had a candidate who knew what our product was and was contributing to its open-source parts... but nothing like that ever happened.
In most cases, the people who came to interviews for systems jobs weren't themselves systems programmers. So my task was to look for the best match among candidates who generally didn't match at all. It's ironic how unaware the typical programmer is of how their computer functions -- so I was generally fishing for bits of general knowledge about computers, and if a candidate could muster a plausible explanation of how a particular computer component or process might work, that was already a huge plus.
Now, how can you possibly acquire this knowledge? Really, I cannot think of a better way than contributing to an open-source project, if you aren't already employed as a systems programmer. College doesn't teach anything relevant. Even online boot camps rarely cover anything relevant. There are some books, but they're mostly woefully outdated and don't generalize well. The knowledge mostly comes from first-hand experience and from following various mailing lists, bug trackers, and the occasional conference.
[+] [-] TrackerFF|2 years ago|reply
[+] [-] artemonster|2 years ago|reply
[+] [-] tester756|2 years ago|reply
Low-level programming salaries are terrible compared to web-oriented ones.
I've literally witnessed people leave semiconductor companies (so kernel development, compilers, firmware, etc.) for web dev because of the $$.
[+] [-] Palomides|2 years ago|reply
[+] [-] Joel_Mckay|2 years ago|reply
This advice may sound flippant, but it is actually quite relevant... =)
[+] [-] wofo|2 years ago|reply
[+] [-] tester756|2 years ago|reply
I wish we could get rid of that dependency and move to some better technology.
There aren't many compiler-related jobs that don't rely on C++, sadly.
Rust is promising as hell here.
[+] [-] RickJWagner|2 years ago|reply
In the original Enterprise computing environment (big iron), a 'System programmer' is one that maintains the operating system, transaction processor (CICS), network stack, etc.
[+] [-] raydiatian|2 years ago|reply