When I was 7 years old, the Big Scary Thing In The Future was cursive writing. Once you hit 2nd grade, teachers were going to start tearing up your homework unless it was written in cursive.
The next year, it was "3rd grade" when teachers would start tearing up non-cursive papers, then 4th, 5th, 6th and "definitely in junior high". But it never happened in reality.
In my 20s, the Big Scary Thing In The Future was ageism in tech. Once you hit 30, you'd never find work again in this industry.
Then it was 35. Then 40.
I've given up listening. Every year, in addition to getting better at what I do, I find that more people want to pay me more money to come program computers for them.
Now it's certainly possible that the real number is 45, and you'll find me living in a cardboard box and begging for nickels at the off-ramp in a few years. But at this point I'm not overly worried about this particular myth.
The key to longevity in engineering is:
1) develop an expertise;
2) develop management skills.
I know lots of guys working as engineers in their 50s. People hire them because they have an expertise. Not a niche held just by virtue of being old (like COBOL), but a problem domain they know inside and out. Nobody is tossing out their 45-year-old engineer who works on carrier-grade network software in favor of a fresh graduate.
So far my experience has been exactly the same as what you describe. I just turned 40 and recently got the highest paying job I've ever had - writing code full time. I'm cautiously optimistic we can keep this up indefinitely.
I seem to have accelerated my own aging in the sense that my technical prowess and cynicism have grown rapidly in the past 7 years. At 23, I'd never written a professional program in my life. Now, almost 30, I'm probably a 1.8-level engineer. So that's 0.9-1.1 points in 7 years, covering 6 jobs, including a failed startup and explosive software failures (none my fault) seen up close and from afar.
My observation is that, as you get older, the jobs available to you get better but they also get rarer. That's partly because you filter out the bad jobs. When someone makes you sign a full-on non-compete to take a coding test-- not just an NDA covering the material in the test, but the works-- you just don't return the email. Anyway, what it means is that when you do get a job, you're more likely to find quality, but you can no longer count on a new job in 2 weeks if your existing one ends, especially because after 35+ you are going to be sized up for some kind of leadership potential, making fit demands very high.
Can you imagine Bill Gates, in the college-or-no-college debate, writing about quitting college, listing his milestones, and then fast-forwarding to his $60+ billion net worth?
Here is where this awful article's bait-and-switch happens:
> Brown and Linden’s analysis of Bureau of Labor Statistics and Census data for the semiconductor industry revealed that although salaries increased dramatically for engineers in their 30s, these increases slowed after the age of 40. After 50, the mean salary fell by 17% for those with bachelors degrees and by 14% for those with masters degrees and Ph.Ds. And salary increases for holders of postgraduate degrees were always lower than for those with bachelor’s degrees (in other words, even Ph.D degrees didn’t provide long-term job protection).
> It’s the same in the software industry. Prominent Silicon Valley investors often talk about youth being an advantage in entrepreneurship. If you look at their investment portfolios, all you see are engineers who are hardly old enough to shave. They rarely invest in people who are old.
The first paragraph, which contains the data that gives the veneer of respectability, is about the semiconductor industry. Even then, salaries don't actually decrease until people hit the beginning of retirement age (surprise!).
In the second paragraph, we switch to the software industry, where it's "the same" (no data to support that, natch). The supporting anecdote isn't even about employees, but investees... What proportion of people receiving money in the software industry do so via investment rather than a paycheck?
Of course the specter of ageism haunts everyone, so the linkbait is effective and we have 60+ comments here.
Now here's the key claim: "increased dramatically for engineers during their 30s but that these increases slowed after the age of 40. At greater ages still, salaries started dropping, dependent on the level of education. After 50, the mean salary of engineers was lower—by 17% for those with bachelors degrees, and by 14% for those with masters degrees and PhDs—than the salary of those younger than 50. "
Two points:
1. Would we expect a linear increase in salary with age? I would not. At some point the salary would likely 'top out' in any job.
2. The part about over 50s ignores a very important fact from the study: the over 50s work less (and therefore earn less). To quote the book: "Workers over the age of 50 are much more likely to work less than a full year [...]. One in six engineers aged 51 to 65 reported being paid for less than a full year of work in 2005 [...]". The book goes on to make a confusing claim about whether that's voluntary or not (jumping to the 2002 downturn and talking about interviews with engineers but with no data).
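To make that second point concrete, here is a minimal sketch (the salaries and proportions are made up by me, not taken from the study) of how a higher share of part-year workers pulls the mean down even when nobody's full-time rate is cut:

    # Hypothetical cohort: everyone has the same $120k full-time rate,
    # but one in six over-50 engineers is paid for only half the year.
    rate = 120_000
    under_50 = [rate] * 6
    over_50 = [rate] * 5 + [rate / 2]

    def mean(xs):
        return sum(xs) / len(xs)

    print(mean(under_50))  # 120000.0
    print(mean(over_50))   # 110000.0 -- about 8% lower, with no cut to anyone's actual rate

On these made-up numbers, that effect alone would explain a good chunk of the 14-17% gap the article leans on.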
I think the real anomaly is that many people expect salaries to always go up with age. Especially now that people work well into their sixties, that doesn't make much sense, apart from some very rare exceptions.
"The young understand new technologies better than the old do"
No they don't. The young, smart developers who already have strong backgrounds adopt new technologies; the old, smart developers do the same.
"The young can easily pull all-nighters."
Sure they can, and companies should be moving away from that as the code quality does drop off past hour 12 unless they're a super talented developer, in which case they're probably smart enough to know not to work an all-nighter.
If you're not willing to pay $150k for a great developer that's going to get shit done, you're fucked already.
> companies should be moving away from that as the code quality does drop off past hour 12 unless they're a super talented developer.
s/12/7/g
s/unless they're a super talented developer//g
Dirty secret about coding: you're typically at your best doing 3-5 hours per day of actual programming. If you're doing 6-8 hours, split that with a workout. The rest of your working time should be spent on design, exchange of ideas, and learning new stuff.
If you use a high-productivity language like Clojure or Haskell and work on a greenfield project, you'll find that you can't program for 12 hours straight, because there's no fat in the development process. Your brain starts to hurt after 8-9 hours. Personally, I can get useful work done for about 11 hours per day (I seem to average 65-70 hours per week, including writing, which has taken a lot of my time recently, no matter what my mode of employment), but there is no way I'd be able to write code for 77 hours per week. Maybe 55 if I really had to push.
> If you're not willing to pay $150k for a great developer that's going to get shit done, you're fucked already.
My first programming job paid just over half of that (well, with a sizeable bonus, but 3x higher than the "top of range") and I got shit done. Cost-of-living is also a factor. I would probably build in Austin or Boston (it annoys me that those cities' names rhyme because they happen to be the top 2 candidates for the 2018 tech hub, as I see it) where the brains-to-dollars ratio is more favorable than New York's. (New York has plenty of brains; the dollars are a problem. Fucking rent.)
Precisely. I am 55 and very current. Many of my younger friends ask me about new technology. I'm actually past having kids at home, so I tend to work pretty long hours... because I like what I do.
I have a lot of entrepreneurial experience and am a pretty good general business manager. Coupled with the ability to code, I find that I am in fairly great demand.
Where "$150k" depends entirely on location.... It's not necessary to spend nearly that much unless you're in one of very few tech hot spots that also has a very high CoL.
My wife and I were talking about this a couple weeks ago. We're both developers, and have both been working for 12 years at a variety of companies, large and small, yet neither of us has ever had a colleague who retired.
This article confirms what I suspected: older programmers don't retire, they just never get rehired after the latest round of layoffs.
This is another in a seemingly endless stream of "old people can't cut it in the coding world" articles. My intent isn't to trash the author or the piece, it's just that we've been over this ground and I am going to be brief.
The truth, as always, is nuanced. As the author says, it's up or out. If you're 50 and expect to be doing the same type of work you did when you were 25, you're mistaken. As programmers we have to constantly be adapting.
The problem here is getting into any kind of attitude that says that you can coast. There is no coasting. Not in this business. If you're not constantly reading and trying out new things, your salary is headed down.
I've experienced many distinctly average coasters who have gone upwards quickly. It ain't a meritocracy; being able to politick and play the game is just as important, sadly.
The article suggests you can teach everything a $150k programmer knows to a fresh-grad $60k programmer. Damn, I want to know about these training techniques. It usually takes 5-10 years, depending on the domain.
I have a silly remark to make about this article... Isn't it also because current programmers in their 50s simply have less programming experience than the ones in their 30s?
I had one of the first usable family computers when I was 6, and I'm already in my late 30s. If you have, say, 20 more years of experience than me, on what computer did you get them?
My bet is that this is a snapshot of the present, not a lasting trend.
"Why would any company pay a computer programmer with out-of-date skills a salary of say $150,000, when it can hire a fresh graduate — who has no skills — for around $60,000? Even if it spends a month training the younger worker, the company is still far ahead."
$60,000 (in the Valley?) + A whole month of training = I had to stop reading the article
Problems like this will never get fixed. Too many Americans are too concerned with watching American Idol to pay attention to the fact that most politicians (on either side of the fence) are trying their best to destroy the middle class by padding the pockets of business executives and high net worth investors. Citizens United, H-1B visas, and many other examples show this.
The consumer web is often a culture of 'pop hits' and lots of media; hence the Justin Bieber-esque love affair the Valley has with said entrepreneurs. So, yes, I do believe that people in their 20s-30s are better at identifying these fads than people in their 40s-50s. But they're just fads, and fads usually die fast, like the startups born from them. Certainly not all, but most.
That said, I also believe the consumer web is in for some pain (the notorious A crunch). I get the impression a lot of unqualified 'super angels' felt it was more important that entrepreneurs become professional money raisers and media darlings, rather than help them focus on their core business.
When it comes down to it, if you're a seasoned entrepreneur with repeat successes, you'll be unstoppable for the rest of your life. The Biebers of startups are edge cases that simply get a lot of attention because it complements the media's goal of driving pageviews.
But that's the thing. I know some old people (old, here, is half a century or better) who are still purely technical, but the ones I know who are fully employed are really good. I mean, not just "I've been doing this longer than you've been alive" good, but better than I would be if I had two lifetimes to practice. And they generally don't job hop like the youngsters, either, a sign of fear of joblessness. (I mean, that's all anecdotal, but eh, for most of us, that's what we use to sanity-check the statistical data.)
My anecdotal data lines up with the statistical data.
There is this perception that you can take a young person and train them fairly easily; that this is the thing to do. I think with someone older? it's not so much that hiring managers don't think they can be trained as that it's /weird/ on a cultural level, for a 25 year old kid to tutor some 50 year old. Really, I think that's a big part of the problem.
Now, I think the other side of that coin is that for most of us? we hit our 30s, those raises start slowing down, and we start looking for other giant gains.
I mean, through my teens and twenties, a year without a 20% raise was a disappointment. And if anything, the raises lagged increases in my actual effectiveness. In my late 20s, and early 30s? that slows down a lot. I'm looking around for that next productivity jump, and hey, turns out all those social skills I didn't have when I was younger? I am not saying I'm smooth or anything, but hey, I'm a hell of a lot better than I was. It looks to me like there is some low-hanging fruit (productivity wise) in management.
So that's the other side of the coin; most of us? a decade or two into our careers, well, we start looking at management. That explains some of the fall-off in Individual Contributor pay; many of those who can, make a run for management, and many who are left behind are seen as "not making the cut" (which is kind of silly, considering the different skillsets required).
Here is the ginormous confounding issue with the presentation in the article:
> Why would any company pay a computer programmer with out-of-date skills a salary of say $150,000, when it can hire a fresh graduate — who has no skills — for around $60,000?
(my emphasis)
So how do you untangle the ageism issue from the skills issue? This article doesn't, but for the broader question you kinda have to.
How do you control for skills when finding out about ageism?
I'm pretty sure there's ageism in tech. This makes me a little scared. It's this exogenous thing that I can't control.
I'm pretty sure skills-decay is at least as common as ageism. Skills have market prices. They change.
This reminds me of a (PG?) essay that said essentially, if you're gonna call yourself a developer, don't call yourself an [X] developer, because you are not the language [X]. I suspect it's all the [X] developers out there who are seeing the most "ageism".
But then, what do I know? I'm in my 20s.
Right off the top of my head I can come up with all sorts of factors that are totally neglected in this article, for example:
Survivor Bias: the best and brightest get promoted. This reduces the average 'quality' of the remaining people in the original pool, even though none of the people changed in any way.
Let's say you have a team of engineers, and 5 of them just turned 40. One of those over-the-hill engineers just got promoted to be a VP, and another one leaves to be an independent consultant. Don't you think the two that left the team were probably among the best of those 5, and therefore more highly paid than the other 3? The average salary of your 40-year-old engineers just went down, even though the average salary of the original 5 likely went up by more than 300%.
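A minimal sketch of that selection effect, with salaries I've made up purely for illustration:

    # Five hypothetical engineers who just turned 40, ordered by salary.
    team = [180_000, 160_000, 130_000, 120_000, 110_000]

    def mean(xs):
        return sum(xs) / len(xs)

    print(mean(team))       # 140000.0 -- the cohort average before anyone leaves

    # The two best-paid leave the "engineer" pool (VP promotion, consulting).
    remaining = team[2:]
    print(mean(remaining))  # 120000.0 -- the measured average for the remaining
                            # 40-year-old engineers, though nobody took a pay cut

In this toy example the measured mean drops about 14%, even though every engineer who stayed earns exactly what they earned before.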
After talking with a few high school teachers, and considering my own experiences with those at the college level, I believe that today's entry-level engineer is roughly equal to (if not a step below) the previous generation's entry-level engineer in terms of programming ability.
If this is true, and the trend continues, then any ageism based on aptitude could reach a point of diminishing returns very soon.
I would also challenge the idea from the article that "the young understand new technologies better than the old do," which I think is less true today than it was a few decades ago.
I believe this apparent ageism is a result of pattern matching and cost cutting more so than of a widely held belief that young engineers can outperform more experienced ones.
Articles like this terrify me. I'm in my late 30s and I'm only now realizing that what I want to do is code.
I'm a wet biologist by training and have been doing research and associated work for a decade. I have little opportunity to code at my current job. What I'm hoping to do is to formalize and hone my programming skills through some additional university courses.
Am I wasting my time? Am I forever going to be kicked to the bottom of the pile of applicants for junior positions because I've already had a career?
If you've been in a similar situation, I'd love to hear how it worked out for you.
Let me tell you a little about what happened to me. After 4 years in my first programming job, I got burned out on it, so I switched to being a consultant for a year and then ran my own non-tech business (a shop) for two. Then I came back to programming, after three years (my late twenties) without doing any programming at all... And it was a successful comeback: I came back with passion and learned a lot in a short amount of time, so I really caught up, in terms of career, with friends who started at the same time and never quit programming.
The great thing about coding is that you can show how good (or bad) you are relatively easily (compared with other fields). There is also a real shortage of coders right now. If you are good, and you like to code, you can catch up with, and stand out over, a lot of people who have just repeated the same year of experience ten times.
I'd say one thing you can do is show off your code (through open source, etc.), with extra bonus points if you build something that is actually useful, so whenever someone asks you can say: "Do you know X? I did that" ;-)
I'm not in a similar situation; I'm in my late 20s, a programmer who has done bioinformatics research in academia and in pharma R&D.
Programming in most startup and corporate settings is very similar to lab work. Young grad students are lured by PIs into doing repetitive work with the promise of publication, when the reality is that it's a lottery. After a few years, what onlookers see as glamorous and interesting (the high-end instruments, the high-impact research) turns into mundane, repetitive lab protocols that are intellectually stale and unstimulating; any results and their interpretation are esoteric and vague in a purely academic sense.
The actors are different but the characters are the same. PCRs and blots are replaced with repetitive coding exercises handed to you by project managers. PIs are replaced by MBA bosses. The academic grind for glory within a sub-field of 5 people is replaced by maximizing profit within the business logic of whatever sub-field your company occupies.
The pay is slightly better or equivalent; the job security, much worse. People threw around the 150K mark here as an average developer salary. That's like saying the average lawyer makes 220K. It's not true. After working for about 5 years in Boston, most of my peers are getting salaries in the 90-110K range. The only people I know who are getting over 150K at the programmer level live in SF, where that is not very much. Most software engineering job reqs I've looked at in Boston top out at 150K, and that includes senior positions asking for 10-15 years of experience at well-funded companies.
If your goal is to achieve intellectual autonomy and financial independence by becoming a programmer, it's very difficult. The sub-culture on this forum skews towards college students and recent grads who are more naive about the field. Others might give more defensive answers to your query, but I want to give you an honest opinion.
I chose coding over biology because it was attractive for a young person just out of school who didn't want to get on the grad school treadmill. But if I suddenly inherited a lot of money, I'd want to apply to biology grad school and do research for fun, without caring about my PI or departmental politics, and/or work on open-source games without caring about IT career jockeying or monetization. Just food for thought.
In my view, it's not ageism that's the problem. It's the questionable paradigm of the annual raise. Over the years, a developer's salary increases steadily, making her more and more expensive and less competitive against less experienced developers.
If you're willing to take a junior dev position for junior dev pay, I would think that in general you won't be disadvantaged because it's your second career.
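To put a rough number on the annual-raise point above (the figures are purely hypothetical, not from the article), even routine raises compound into the kind of salary the article treats as a liability:

    # A developer hired at $60k who gets a routine 5% raise every year.
    salary = 60_000
    for year in range(20):
        salary *= 1.05

    print(round(salary))  # ~159198 -- roughly the "$150k" developer after two decades

Nothing about skill decay is needed to produce the gap the article worries about; compounding raises get there on their own.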
Nah, just code at a biotech or for lab software. There is so much opportunity. Automate your job through code/robotics. Plenty of opportunities.
You will find companies that want a mix of wet lab and coder skills. You save them on the translation costs for making software the biologists actually need.
The biggest difference from when I was a starving 19-year-old is that I don't need to use recruiters to find work anymore. It's strange that this article comes from LinkedIn and it does not mention the power of having connections. My hope is that by lifting up the people around me wherever I go, I'll build up a larger, better, and more enthusiastic network of colleagues whose help I can draw on later.
The HN reaction to ageism is interesting--the comments are extremely skeptical, but of course, HN suffers from survivor bias. Those who wash out of software development in their 30s and 40s don't read or comment on HN. In fact, wasting time on social news sites is a young person's game even if ageism were not a factor.
I worked with a talented C++ programmer who had flown airplanes in the Pacific Theatre in WWII. I don't think he necessarily needed the money as much as he just enjoyed the job, but it goes to show that you can still write code when you're older. We always enjoyed taking him out to lunch on Veterans Day too!
I am 27 and work as a GUI-dev. My co GUI-devs are 26 and 32 years old.
But the back-end developers are 38, 43 and 56 years old.
The management in here consists of engineers, too. They are all >40 years old.
Most tech people I know got into either back-end development or management as they got older.
(Source: National Public Radio ... http://www.npr.org/blogs/alltechconsidered/2013/04/03/176134... )
> What H-1B Employers Say
> NPR repeatedly tried to interview the biggest H-1B users, but none agreed to talk.
No one is talking, but we all know what's going down.
Thanks for posting this, tosseraccount.
http://thecodist.com/article/yes_i_still_want_to_be_doing_th...