This kind of change has also impacted me. It shows up when I'm trying to give students advice about starting their careers, and I realize that the first jobs I had (system administrator, SOC worker) have been replaced by robots. Especially in the SOC, I was a "Tier 1" analyst who did monitoring (watching a bank of green lights waiting for one to turn red) and first-level triage and analysis. This has been replaced by ML-driven data processing systems.
So I think the apocalypse is double-bladed: while automation kicks a bunch of current workers out by making them immediately redundant, it also freezes out the next generation by removing entry level jobs and not really replacing them with anything equivalent. Meanwhile, universities and vocational ed programs won't get this memo for another ten years so they will continue to happily propel waves of students onto a set of closed and locked doors.
The pessimists' view is that automation will deprecate a heap of jobs in the tech industry that will never return. The optimists' view is that automation simply allows companies to do more stuff: things they couldn't afford to do before, and soon, things they have to do in order to stay competitive. For the optimists the number employed in the tech industry stays the same or increases, but the proportion of different roles changes (i.e. no more green-light watchers).
I was watching David Bull talk about historical Japanese wood carving. He described how the introduction of the printing press to Japan killed the entry-level, apprenticeship positions in printing.
I am worried about the latter aspect as well: in order to keep automation going, very advanced developers have to be involved, but if the whole "easy job" ecosystem disappears, there won't be any reasonable way for developers to keep progressing, with only the best of that competition filling the few spots as "cognitive automators".
Maybe we'll see a more apprenticeship type approach, where junior personnel are instead assigned to and trained by seniors. This would probably be a net good, but who knows how things will shake out.
Why does watching a bank of green lights waiting for one to turn red, then doing first-level triage and analysis, require ML? Isn't this just a bunch of rules?
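For what it's worth, the "bunch of rules" version really can be a few lines. Here's a minimal sketch of rule-driven alert triage; all rule names, field names, and thresholds are invented for illustration:

```python
# A "Tier 1 in a few lines" sketch: map raw alerts to a triage action
# with plain rules, checked in priority order. Everything here
# (severities, sources, thresholds) is hypothetical.

RULES = [
    # (predicate, action) pairs, most urgent first
    (lambda a: a["severity"] == "critical", "page_oncall"),
    (lambda a: a["source"] == "ids" and a["hits"] > 100, "escalate_tier2"),
    (lambda a: a["severity"] == "warning", "ticket"),
]

def triage(alert):
    """Return the first matching action, or 'ignore' if nothing fires."""
    for predicate, action in RULES:
        if predicate(alert):
            return action
    return "ignore"

print(triage({"severity": "critical", "source": "fw", "hits": 3}))  # page_oncall
print(triage({"severity": "info", "source": "ids", "hits": 500}))   # escalate_tier2
print(triage({"severity": "info", "source": "av", "hits": 1}))      # ignore
```

The ML argument is usually about reducing false positives and catching patterns no one wrote a rule for, but the baseline being replaced often looks like this.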
The one entry level IT job that is not going away is tech support. Sure, some parts of it can be outsourced, but beyond a certain point you need a person on-site to figure out why the Internet is broken.
The Cloud companies are also hiring armies of support people, and it's a great way to kickstart your career in any of these companies while getting company provided training in the tech.
"So I think the apocalypse is double-bladed: while automation kicks a bunch of current workers out by making them immediately redundant, it also freezes out the next generation by removing entry level jobs and not really replacing them with anything equivalent."
Agreed entirely. I would be surprised if we still have two-year technical degrees in a decade.
This has been happening since the 1970s. Or earlier. I don’t really think of it as an apocalypse. IT skills have never had a long shelf life. Any time you are a technology expert at your company, in IT, the technology landscape will shift under you. This is the Red Queen Hypothesis in action. People who ran mainframes in the 1980s became trusted experts and then most of the jobs evaporated. Same thing happened to people running critical VAX or Unix systems. Your skills are only valuable as long as the related technology is.
The same thing happens to programming positions.
But I think the good news is missing from this article—IT jobs are, overall, sticking around or increasing in number. (According to the Bureau of Labor Statistics, the jobs are growing “faster than average for all occupations”). You do have to keep updating your skill set, but it’s not like manufacturing, where efficiencies eliminate jobs altogether or move them to completely different sectors. And there is that ageism to worry about, and uncertainty.
I’m personally more worried about some of the other remaining white-collar office jobs, like the accountants, paralegals, HR, various banking positions, etc.
> The same thing happens to programming positions.
I can't stress how important this is. Folks going to things like boot camps or other educational outlets that focus on one language will utterly kill their career if they aren't aware of how fast things move. If you don't learn the underlying abstractions and paradigms that take various forms in different languages, you will get left in the dust in a matter of a few years.
The best programmers I've ever worked with got excited about programming patterns and paradigms, not frameworks and syntactic sugar. Those are also the ones I paid the most attention to.
Bottom line for both software and IT engineers: you learn to learn, not just to do.
There are hundreds of thousands of people, System Administrators, who have made lifelong careers out of managing networks, server farms, and Windows and Linux systems since the early 90s. It's not as fancy as software development and doesn't drive business value, but it's work that needs to be done.
The Cloud greatly diminishes and in some cases completely eliminates that work. The only thing left is actual software development.
This. Check out my username, this is what I do, and I've been doing it for far longer than a decade.
The mid-range jobs have always been vanishing. Many times, I'm the guy automating them out of existence. It's always replaced by something else.
It depends on the circumstances, of course, but there is often more work after the automation than before. It's just different work. It requires reskilling.
The article mentions some new product AWS is coming out with. No matter how "simple" it makes things, someone is going to end up being an expert at using it, and will probably be paid well to do so.
Really, the toughest and most crucial part of this career has been keeping up. The work stays steady, though.
The problem is most companies want someone with 3-5 years of experience in that new skill set. So, unless you jump on that particular bandwagon early, you're out of luck.
"I’ve spent my career in tech, almost a decade at this point, running about a step-and-a-half ahead of the automation reaper."
I mean, yes, that is the entire job description. If you are in IT, your responsibility is to learn the best technologies, and to continually re-evaluate what to keep of your organization's current stack and what to improve or replace.
That is why I wouldn't want to do anything else. I love learning new things and no profession offers more opportunities to learn new things than computer technology.
But this isn’t true, outside of webdev. If your skill was “DB2” or “Oracle” or “Cisco” or “C++” you could have had a 30-40 year career in that, easily. There are plenty of others. Java has been around commercially since about 1995; there will definitely be plenty of Java jobs in 2025.
Another worry is recession and the popping of the tech bubble. Sooner or later the bubble has to pop, and that's going to cause a lot of pain throughout the tech industry. Should be interesting to see how the industry rebounds and in what form. Will VR be the new "hot thing" like smartphones were after the 2008 recession?
I agree with the points raised in the writing, but it mixes automation, abstraction, and industry consolidation as if they weren't separate processes. As such, the transformation being described isn't an impending cliff, but an ever-present pressure of economic forces that affects all businesses all the time, and that one is wise to watch for.
Automation replaces repetitive work with tooling and work that's more complex. Abstraction allows one to delegate to another for details, which may include choosing from a palette of pre-made options. Consolidation will come about as fewer independent players can sustain themselves in the market. Some will be out-competed by economies of scale, some will be starved by restrictions on intellectual property and lack of access to expertise.
This process has already played out for "small business websites", yet there's still lots and lots of web developers and web designers employed or freelancing. The current wave of WYSIWYG website generators is actually very good, and they have add-ons and integrations that make sense for their target market. But plenty of clients don't want to mess around in it, so they'd rather hire someone. This could be the maker of the generator, or it could be an outside consultant. In either case, the person brings judgement, experience, and creativity, to tailor the deliverable to the needs of the client. These are skills resistant to automation, but not immune to abstraction and consolidation.
In the end, the antidote is the same as it always was: be adaptable, be personable, be resilient, and be resourceful. These are especially important if one is in a comfortable job shielded from most competitive pressure, because such people will be the most surprised and unprepared if their current employment is made redundant.
I keep seeing these kinds of articles where the author has drunk the Cloud kool-aid themselves, forgotten how to function without it, and insists that it's impossible to function without it. This guy even drags manufacturing into the mix, and obviously has no idea about manufacturing in the United States.
Manufacturers in the USA don't make cheap coffee cups. We don't make underwear. We don't make car fenders. We make warheads. We make gyroscopes. We make electro-mechanical assemblies that China or Malaysia or Mexico would screw up. We specialize in quality over quantity, and we specialize in cutting edge tolerances and specifications. We make export controlled things for enterprise contracts and the government. Things that require certifications to produce, and government regulatory compliance, and tight tolerances. Nobody here is making the 100,000,000 wrenches you can buy at Wal-Mart.
We don't make 100,000 of anything either. We make 100 gyroscopes for General Dynamics, or 5 jet engines for General Electric. We make US military grade munitions and weapons for the government. The author obviously doesn't realize that the company making the wafers for Raytheon ISN'T ALLOWED TO USE THE CLOUD. All that great automation that helps AirBNB function with no infrastructure is meaningless when you have to protect your IP from nation state actors. To probably >50% of American manufacturing the Cloud is useless. It's a consolidated attack vector that WILL be compromised in the future and lead to liability. Sure you can put a NIST 800-171 or DFARS compliant business in the Cloud, but it costs extra and it's not worth the risk. You hear about misconfigured buckets leaking data almost daily. Nobody doing government manufacturing work wants to deal with that headache. In fact, I've been in this industry for 10 years and I have NEVER seen a DFARS compliant supplier with outsourced IT infrastructure. I've visited hundreds of companies over the years. What you're describing doesn't interest American manufacturers one bit.
> The author obviously doesn't realize that the company making the wafers for Raytheon ISN'T ALLOWED TO USE THE CLOUD.
This is probably going to change. People like you said the same thing about health data, and student data. The savings were so tantalizing that the regulators and stakeholders figured out how to make it work. What do you think GovCloud is for? C2S and "Secret cloud"?
Our university had a 3-4 person dedicated Exchange team. When "Google Apps" came out, people wanted us to switch to that from our old mail server stuff. Go figure, why would you keep using pine and squirrelmail when you could use gmail? "It can't hold student data" the IT team said, "it isn't certified for FERPA or ITAR." Okay, true. Fast forward two years, now Google's "Apps for Education" can deal with both. The switch was sudden and brutal and the university no longer has a 3-4 person dedicated Exchange team or an Exchange deployment of any kind.
> Manufacturers in the USA don't make cheap coffee cups. We don't make underwear. We don't make car fenders. We make warheads. We make gyroscopes. We make electro-mechanical assemblies that China or Malaysia or Mexico would screw up. We specialize in quality over quantity, and we specialize in cutting edge tolerances and specifications. We make export controlled things for enterprise contracts and the government. Things that require certifications to produce, and government regulatory compliance, and tight tolerances. Nobody here is making the 100,000,000 wrenches you can buy at Wal-Mart.
This is a blanket statement and it is wrong. Most cheap manufacturing is done overseas, but the US still has a large manufacturing sector that makes all sorts of crap.
>> Nobody here is making the 100,000,000 wrenches you can buy at Wal-Mart.
There are still people making nails in the US. Fertilizer. Food gets exported. Then there is all the stuff that's too expensive to ship. Lumber, aluminum sheeting, cement ... lots of non-precision stuff is still made locally. Not every US factory makes munitions.
And some stuff is made locally not because of 'better' manufacturing ability but for speed. The fashion industry has to react quickly, quicker than overseas shipping can manage. I just ordered a small electronics assembly from a Canadian manufacturer not because they are the most skilled or precise but because they can chat with me on the phone and ship a small-run (5) faster than any Asian manufacturer. (It's a device for measuring laser energy at specific wavelengths but I have some specific needs re how the data is collected/displayed. It only took a 10-minute call to explain my issues and get a deal together.)
They probably are allowed to use the cloud, it just requires a lot of red tape and paperwork: filling out forms, waiting, and filling out more forms. Pretty sure AWS GovCloud exists for a reason, and that reason isn't because it has no customers.
May not be allowed to use cloud today but that will likely change in the near future. FWIW, I can imagine your post in my head as an argument in favor of horses over automobiles.
What you’ve seen in 10 years was reality during that time. That says little to nothing about the future.
Good points, but there are other American-dominant industries aside from defense manufacturing, and they are definitely taking a keen interest in the cloud. Also, defense is moving to the cloud too, albeit more slowly than AirBnB, say.
The article is not talking about IT at Raytheon but "[...] anonymous Windows administrators and point-and-click DBAs and “senior application developers” who munge JSON in C#".
> Repetition is a sure warning sign. If you’re building the same integrations, patching the same servers over and over again every day, congratulations – you’ve already become a robot. It’s only a matter of time before a small shell script makes it official.
Absolutely - if something is repetitive, it's a candidate for automation. This is true across all disciplines. Only the as-yet unautomatable human judgement, insight, and communication are safely valuable.
On the other hand, "go away or I will replace you with a very small shell script" has been a BOFH joke since the 90s.
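And the joke holds up because the script really can be very small. A hedged sketch of a green-light-watcher replacement (the log format here is invented):

```shell
#!/bin/sh
# The proverbial very small shell script: read status lines on stdin
# and pass through only the ones that need a human.
# Line format (hypothetical): "<host> <STATUS> <detail...>"
grep -E 'ERROR|CRITICAL|DOWN'
```

Pipe a monitoring feed through it and page on any output; the Tier 1 "watch for red" loop reduces to a filter.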
While I think this essay has some good points, it ignores the problem that I always see with this idea - people. I really wish the business people I deal with on a daily basis could have their business requirements met by such automation, cause it's the least fun part of my job. But they don't, because I don't care how awesome your cloud provider tools are, they always come to me with some weird requirement or platform and I'm back to "munging JSON in C#".
The problem isn't the technology, it's the complexity of the customer's business requirements, and their nearly complete inability to transfer those requirements into software without complex implementations that they could never hope to implement themselves. I would love to see more tooling to help with this. I have been waiting for 25 years. It gets better, but not nearly what can be described as an apocalypse.
A point that often gets missed with "low-code" tools, is that it's not so much that they enable "non-coders" to build applications, but that they enable experienced developers to go so much faster.
I've been using a fullstack low code development tool for several years now, and when it comes to developing CRUD apps or data reporting apps (with charts, interactive, drill down reports, etc. etc.), it's astonishing how quickly you can stand-up a secure, fully responsive web-app, complete with authentication, authorization schemes, report subscriptions, etc., without writing any code at all.
And, when you bump up against the limits of the declarative/low-code aspect of the framework, you can toggle over to JavaScript, your own CSS, SQL, etc., so it's not like you paint yourself into a corner.
So, I agree, if Amazon creates something like this, and it is as good as some of the existing low-code tools out there, it's going to have a big impact over the long term.
As a senior employee in an organization that has migrated much functionality to SaaS and the cloud... I'm doubtful. In my experience so far, what we do changes, but the need for IT employees hasn't gone down. Most of what we did was figure out how to solve business problems with IT, and that continues, cloud or no. SaaS offerings are sophisticated, but hard to use out of the box when you have significant regulatory (and other) requirements.
If there are IT jobs developing for smaller organizations, maybe those will go away, but... I think a lot of that disappeared already.
I'm close to retiring (from this job, anyhow), so it's not a personal issue for me. I just haven't seen it happening as described.
I think for most mid-level guys the threat isn't AI doing their job. I think it's the influx of people who will undoubtedly join the "coding" workforce once automation takes away many trivial jobs like driving trucks, cabs, or flipping hamburgers at McDonalds. All these people will be told to retrain themselves and go into the tech sector, and we'll get a huge influx of cheap tech people.
We've been through this before. It was called Windows NT. The democratized tools were Access and Lotus Notes. Amazon is doing this for the same reason Microsoft did -- it's revenue by a thousand cuts. People spent $5,000 in 1995 to give diesel mechanics PCs so they could update work orders -- they will do the same for little apps.
The reality is, you're going to have a million monkeys hitting a million keyboards, and very few will be producing Shakespeare. All of that crap will be consuming lots and lots of AWS/Azure/etc bill.
You'll need way more IT people to rationalize it. There are tens of thousands of people in the United States whose purpose for the last decade has been re-implementing the 90s version of this in formal IT systems. You will have churn as we purge the legacy staff, especially the Windows click-to-admin types.
Automation hits white collar professions as a massive productivity improvement for the top performers, displacing everyone else working in the field (think the top 10% of people in your position doing 100% of the work).
In web development this is most apparent (to me) in SaaS application development, where many/most of the underlying pieces of building a CRUD application that can scale to thousands of users and be really functional are now provided by other SaaS apps, which provide a _better_ service than the average developer can scrape together themselves.
Billing -> stripe.com over writing against the gateways directly
Database/Hosting -> Heroku PostGres/Redis and compute
Email -> Sendgrid, Mandrill, ActiveCampaign
Or even just SaaS frameworks like BulletTrain (Rails) or Laravel Spark, which dramatically cut down on the boilerplate and integration code you'd have to write.
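To make that concrete, here is roughly what the glue for such a stack can shrink to under a Heroku-style deployment. This is a sketch; the exact add-on names and plans vary, so treat the specifics as assumptions:

```
# Procfile: the entire process definition for the app
web: gunicorn app:app

# Provisioning the database/hosting and email pieces becomes a couple
# of CLI calls rather than infrastructure work (add-on/plan names are
# illustrative and may differ):
#   heroku addons:create heroku-postgresql
#   heroku addons:create heroku-redis
#   heroku addons:create sendgrid
```

Billing via Stripe is similarly a hosted API integration rather than code written against payment gateways directly.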
As someone who's been in IT over a decade, I am concerned, and many IT folks are going to be blindsided.
Sure, some of them will still have positions in the same or similar roles, but there will be a crunch. The large outsourcers will be hit overseas (Wipro, Infosys, etc.), but it will also impact administrators at medium-to-large sized businesses in typical American cities, as Forrest mentioned. The worst part out of all of this is that too many colleges, and especially technical colleges, are still teaching networking, Linux, or Windows administration as if their students will be able to have a lifelong career in it. That is no longer true.
I don't want to imagine what it'll be like for those students who graduate, get good jobs (now), a mortgage and start to raise their family only to find themselves unemployed in the middle of their lives. I don't expect much sympathy from the largely meritocratic tech industry or anyone else.
As for myself, I already work for one of the big three and am a part of many "cloud" migrations. I should be okay, but at the same time I am somewhat conflicted. Am I going to need to go back to school for Computer Science and become a fully-fledged actual software developer? I mean, it's fine, there's still enough time (I don't think we will really feel the burn for at least another 4-6 years), but is it reasonable or realistic that everyone needs to be a rockstar developer?
Programmers have been predicted to be going out of jobs since the days of COBOL. There is a reason that is not happening (yet), or hardware companies would all have been shipping pluggable chips; we'd just config-connect them and be done. The reason is in the word itself: `Software`.
The real problem with these ready-made plumb-and-plug modules is sooner or later these are either too slow, or expensive, or just a pain to refactor/redo. Eventually you just come back and realize you need a more granular control over things, and anything you are likely to come up with resembles a programming language.
I had this moment of realization myself while having to change a complicated graph in Pentaho Kettle a few months back. The graph looks bonkers hard and brittle; changing anything requires redoing all the dependent elements of the graph, and if you have a graph complicated enough you will be forced to rewrite it. The real trouble is that there is no functional/unit testing with these things. And then you realize you are just better off with a full-fledged ETL language/programming language. The second problem I faced was running into performance issues. Want to change the sort algorithm? Running into heap space issues? Want better logging? Want a better threading model? All the best. Nothing is possible.
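That testability gap is easy to illustrate: once a transform step is just a function, it can be unit tested, profiled, and refactored directly, which is exactly what a GUI graph withholds. A minimal sketch (the transform itself is a made-up example):

```python
# A made-up ETL transform step written as a plain function, so it can
# be tested and changed like any other code.

def clean_row(row):
    """Normalize one input record: trim/titlecase the name, parse the amount."""
    return {
        "name": row["name"].strip().title(),
        "amount": round(float(row["amount"]), 2),
    }

def transform(rows):
    """The whole 'graph': drop rows with empty names, then clean each row."""
    return [clean_row(r) for r in rows if r["name"].strip()]

# The functional test a GUI pipeline makes hard to write:
assert transform([
    {"name": "  ada lovelace ", "amount": "10.456"},
    {"name": "   ", "amount": "99"},
]) == [{"name": "Ada Lovelace", "amount": 10.46}]
```

Swapping the sort, the logging, or the threading model is then an ordinary code change rather than a fight with the tool.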
This is above and beyond the need for meta-programming facilities. At that point whatever GUI graph you draw is worse than any verbose code you will write.
Regarding programmable tools, we already have those. Vim, Emacs, and Microsoft Excel all give you a degree of meta control over the tool and what you want to do with it. But that's that, and it is often hard to bend these tools to your command.
These are just a few reasons why there won't be an apocalypse soon.
I empathize with this sentiment, and will add that it may take fewer developers/IT folks to get a product out the door (MVP), but a company that depends on the product will have more specific business requirements and will eventually onboard more people to deliver solutions specific to those needs.
The job market will close up a bit, but right now tech is looking like the California gold rush, where 4-5 years ago any bootcamp grad could jump right into a web dev job (at least in my job market in the Midwest). I think if you continuously learn and remain marketable as the times change, then as a worker you will be fine. I also like the comment that mentions that you may just end up working for the cloud provider rather than the business application company.
I manage a machine learning team and I also think that at least partially automated data curation and modeling will reduce the number of people required in my field. It might take 5 or 10 years, but I think it will happen.
I think you are spot on that IT and devops will take a hit. I look more at Heroku’s model than AWS and GCP as the future. That said, AWS and GCP will keep getting more ‘Heroku like’.
Heroku and its parent company, Salesforce. Salesforce databases are manipulated with code and no-code tools, should already carry most of the business and customer data, and offer several different off-the-shelf products for plugging other data sources in. For most companies, it could cover the majority of what their legacy IT department does; for many companies in the Bay Area it handles everything except their product and what’s in JIRA. For others, it is woven into the product.
It’s a complete blind spot for most engineering minded people because they never realized how flexible the platform was, and with Bret Taylor running the show now, it’s miles away from just being a clunky Sales CRM.
"I manage a machine learning team and I also think that at least partially automated data curation and modeling will reduce the number of people required in my field."
I mean, if it doesn't, what the hell are you even doing?
The whole point of technology and modern capitalism is to increase automation, increase the amount produced by the same number of workers, and increase the overall amount of wealth in the world and improve overall living conditions for everyone (setting aside very important questions of distribution). I just find it odd people in the computer technology industry find this shocking or especially worrying.
"I look more at Heroku’s model than AWS and GCP as the future."
Google's App Engine was much closer to the Heroku approach, and the AWS approach won. So I will be pretty surprised if the Heroku approach wins out.
20+ year technology consultant here. I have done work for dozens of clients across just about every major industry out there. The number of (relatively well paid) people I've seen at clients in this neverland between Business and IT whose jobs revolve around pulling data from one system, munging it offline, and then loading it into another system, or similar tasks that should have been automated with a script a decade or more ago, is absolutely staggering.
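The decade-overdue script usually looks something like this: pull an export from System A, munge it, and emit a load file for System B. A hedged sketch, with all file and field names hypothetical:

```python
# The decade-overdue munging script: System A's CSV export in,
# System B's load format out. Field names ("order_id", "total") and
# the target format ("id", "total_cents") are invented for illustration.

import csv
import io

def munge(export_csv):
    """Turn System A's export into System B's load file."""
    out = io.StringIO()
    reader = csv.DictReader(io.StringIO(export_csv))
    writer = csv.DictWriter(out, fieldnames=["id", "total_cents"],
                            lineterminator="\n")
    writer.writeheader()
    for row in reader:
        writer.writerow({
            "id": row["order_id"],
            # store money as integer cents to avoid float drift downstream
            "total_cents": int(round(float(row["total"]) * 100)),
        })
    return out.getvalue()

print(munge("order_id,total\nA-1,19.99\nA-2,5.00\n"))
```

Wire it to a scheduler and the "pull, munge offline, load" job stops being anyone's day job.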
I object to the word "apocalypse", it is just business as usual.
Better automation has been "reducing the number of people required to deliver technical solutions" for ages.
Local Area Networks replaced many mainframe computers in the 80's. Optimized C compilers took the jobs of countless Assembly programmers. WordPress, Joomla and better web frameworks (Django, Rails) took the jobs of many Perl/Web developers. Python enabled a lot of people to do what FORTRAN/Java/C++ programmers were able to do before.
I'm advertising for 2 roles at the moment: a Senior Backend engineer and a Junior Frontend (in London, UK). Almost impossible to find someone for the backend role, but I've had to turn off the advertising as I've had 71 people apply for the junior frontend role. The mix is fascinating, a lot of ex-bootcampers, some CS grads, some self-taught people, but all of them are desperate for a shot to get into our industry. I've found this quite worrying as a signal for what's happening in the wider economy.
I'm waiting for it to happen to web dev so that companies can find something else to fixate on so I can go do that.
Companies are always going to follow the latest trends and it's always going to take smart people to follow them. I'm not worried about my ability to make a living. I just can't wait to see what comes.
The author and I share nearly identical work histories. I've been in IT for about 10 years, starting with AWS and database administration, then turned more DevOps with a focus on CI/CD. Over the last few years, it's become very obvious that the DevOps role is requiring more and more development skills. Simple bash scripting is not going to cut it in the modern tech company.
A couple years ago I made the switch to full time development. I now do most of the DevOps stuff for my teams, but from a developer role, instead of a sysadmin/cloudops role.
I'm certain that's going to be the future. Look at Google's requirements for SREs. They are full-fledged software engineers.
[+] [-] munin|7 years ago|reply
So I think the apocalypse is double-bladed: while automation kicks a bunch of current workers out by making them immediately redundant, it also freezes out the next generation by removing entry level jobs and not really replacing them with anything equivalent. Meanwhile, universities and vocational ed programs won't get this memo for another ten years so they will continue to happily propel waves of students onto a set of closed and locked doors.
[+] [-] ukoki|7 years ago|reply
[+] [-] dleslie|7 years ago|reply
[+] [-] JKCalhoun|7 years ago|reply
[+] [-] bitL|7 years ago|reply
[+] [-] Wohlf|7 years ago|reply
[+] [-] kdf83|7 years ago|reply
[+] [-] jpatokal|7 years ago|reply
The Cloud companies are also hiring armies of support people, and it's a great way to kickstart your career in any of these companies while getting company provided training in the tech.
[+] [-] Ruxbin1986|7 years ago|reply
Agreed entirely. I would be surprised if we still have two-year technical degrees in a decade.
[+] [-] klodolph|7 years ago|reply
The same thing happens to programming positions.
But I think the good news is missing from this article—IT jobs are, overall, sticking around or increasing in number. (According to the Bureau of Labor Statistics, the jobs are growing “faster than average for all occupations”). You do have to keep updating your skill set, but it’s not like manufacturing, where efficiencies eliminate jobs altogether or move them to completely different sectors. And there is that ageism to worry about, and uncertainty.
I’m personally more worried about some of the other remaining white-collar office jobs, like the accountants, paralegals, HR, various banking positions, etc.
[+] [-] Rooster61|7 years ago|reply
I can't stress how important this is. Folks going to things like boot camps or other educational outlets that focus on one language will utterly kill their career if they aren't aware of how fast things move. If you don't learn the underlying abstractions and paradigms that take various forms in different languages, you will get left in the dust in a matter of a few years.
The best programmers I've ever worked with got excited about programming patterns and paradigms, not frameworks and syntactic sugar. Those are also the ones I paid the most attention to.
Bottom line for both software and IT engineers: you learn to learn, not just to do.
[+] [-] Ruxbin1986|7 years ago|reply
The Cloud greatly diminishes and in some cases completely eliminates that work. The only thing left is actual software development.
[+] [-] itgoon|7 years ago|reply
The mid-range jobs have always been vanishing. Many times, I'm the guy automating them out of existence. It's always replaced by something else.
It depends on the circumstances, of course, but there is often more work after the automation than before. It's just different work. It requires reskilling.
The article mentions some new product AWS is coming out with. No matter how "simple" it makes things, someone is going to end up being an expert at using it, and will probably be paid well to do so.
Really, the toughest and most crucial part of this career has been keeping up. The work stays steady, though.
[+] [-] sharemywin|7 years ago|reply
[+] [-] scirocco|7 years ago|reply
[+] [-] jimbokun|7 years ago|reply
"I’ve spent my career in tech, almost a decade at this point, running about a step-and-a-half ahead of the automation reaper."
I mean, yes, that is the entire job description. If you are in IT, your responsibility is to learn the best technologies, and be continually re-evaluating what to keep of your organization's current and what to improve or replace.
That is why I wouldn't want to do anything else. I love learning new things and no profession offers more opportunities to learn new things than computer technology.
[+] [-] gaius|7 years ago|reply
But this isn’t true, outside of webdev.
If your skill was “DB2” or “Oracle” or “Cisco” or “C++” you could have had a 30-40 year career in that, easily. There are plenty of others. Java has been around commercially since about 1995, there will definitely be plenty of Java jobs in 2025.
[+] [-] porpoisely|7 years ago|reply
[+] [-] sarcasmic|7 years ago|reply
Automation replaces repetitive work with tooling and with work that's more complex. Abstraction allows one to delegate the details to someone else, which may include choosing from a palette of pre-made options. Consolidation will come about as fewer independent players can sustain themselves in the market. Some will be out-competed by economies of scale; some will be starved by restrictions on intellectual property and lack of access to expertise.
This process has already played out for "small business websites", yet there are still lots and lots of web developers and web designers employed or freelancing. The current wave of WYSIWYG website generators is actually very good, and they have add-ons and integrations that make sense for their target market. But plenty of clients don't want to mess around in them, so they'd rather hire someone. This could be the maker of the generator, or it could be an outside consultant. In either case, the person brings judgement, experience, and creativity to tailor the deliverable to the needs of the client. These are skills resistant to automation, but not immune to abstraction and consolidation.
In the end, the antidote is the same as it always was: be adaptable, be personable, be resilient, and be resourceful. These are especially important if one is in a comfortable job shielded from most competitive pressure, because such people will be the most surprised and unprepared if their current employment is made redundant.
[+] [-] kpennell|7 years ago|reply
[+] [-] zelon88|7 years ago|reply
Manufacturers in the USA don't make cheap coffee cups. We don't make underwear. We don't make car fenders. We make warheads. We make gyroscopes. We make electro-mechanical assemblies that China or Malaysia or Mexico would screw up. We specialize in quality over quantity, and we specialize in cutting edge tolerances and specifications. We make export controlled things for enterprise contracts and the government. Things that require certifications to produce, and government regulatory compliance, and tight tolerances. Nobody here is making the 100,000,000 wrenches you can buy at Wal-Mart.
We don't make 100,000 of anything either. We make 100 gyroscopes for General Dynamics, or 5 jet engines for General Electric. We make US military grade munitions and weapons for the government. The author obviously doesn't realize that the company making the wafers for Raytheon ISN'T ALLOWED TO USE THE CLOUD. All that great automation that helps AirBNB function with no infrastructure is meaningless when you have to protect your IP from nation state actors. To probably >50% of American manufacturing the Cloud is useless. It's a consolidated attack vector that WILL be compromised in the future and lead to liability. Sure you can put a NIST 800-171 or DFARS compliant business in the Cloud, but it costs extra and it's not worth the risk. You hear about misconfigured buckets leaking data almost daily. Nobody doing government manufacturing work wants to deal with that headache. In fact, I've been in this industry for 10 years and I have NEVER seen a DFARS compliant supplier with outsourced IT infrastructure. I've visited hundreds of companies over the years. What you're describing doesn't interest American manufacturers one bit.
[+] [-] munin|7 years ago|reply
This is probably going to change. People like you said the same thing about health data, and student data. The savings were so tantalizing that the regulators and stakeholders figured out how to make it work. What do you think GovCloud is for? C2S and "Secret cloud"?
Our university had a 3-4 person dedicated Exchange team. When "Google Apps" came out, people wanted us to switch to that from our old mail server stuff. Go figure, why would you keep using pine and squirrelmail when you could use gmail? "It can't hold student data" the IT team said, "it isn't certified for FERPA or ITAR." Okay, true. Fast forward two years, now Google's "Apps for Education" can deal with both. The switch was sudden and brutal and the university no longer has a 3-4 person dedicated Exchange team or an Exchange deployment of any kind.
[+] [-] ProAm|7 years ago|reply
This is a blanket statement, and it's wrong. Most cheap manufacturing is done overseas, but the US still has a large manufacturing sector that makes all sorts of crap.
[+] [-] sandworm101|7 years ago|reply
There are still people making nails in the US. Fertilizer. Food gets exported. Then there is all the stuff too expensive to ship. Lumber, aluminum sheeting, cement ... lots of non-precision stuff is still made locally. Not every US factory makes munitions.
And some stuff is made locally not because of 'better' manufacturing ability but for speed. The fashion industry has to react quickly, quicker than overseas shipping can manage. I just ordered a small electronics assembly from a Canadian manufacturer not because they are the most skilled or precise but because they can chat with me on the phone and ship a small-run (5) faster than any Asian manufacturer. (It's a device for measuring laser energy at specific wavelengths but I have some specific needs re how the data is collected/displayed. It only took a 10-minute call to explain my issues and get a deal together.)
[+] [-] mr_overalls|7 years ago|reply
Do you find this a bit strange, given that the Pentagon is making a massive push toward (presumably private) cloud infrastructure?
https://www.reuters.com/article/us-usa-pentagon-cloud-idUSKB...
[+] [-] icedchai|7 years ago|reply
[+] [-] throwaway98121|7 years ago|reply
What you’ve seen in 10 years was reality during that time. That says little about the future.
[+] [-] throwanem|7 years ago|reply
[+] [-] nunez|7 years ago|reply
[+] [-] foozed|7 years ago|reply
[+] [-] tmaly|7 years ago|reply
[+] [-] pjc50|7 years ago|reply
Absolutely - if something is repetitive, it's a candidate for automation. This is true across all disciplines. Only as-yet-unautomatable human judgement, insight, and communication remain safely valuable.
On the other hand, "go away or I will replace you with a very small shell script" has been a BOFH joke since the 90s.
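The "very small shell script" quip maps directly onto the green-light-watching Tier 1 work described upthread. A toy sketch of what that replacement looks like (all names and statuses here are hypothetical, just to illustrate the pattern):

```python
# Toy sketch, not a real monitoring tool: the kind of "very small script"
# that replaces Tier 1 green-light watching -- scan a status board and
# surface only the entries that need a human.
def triage(statuses):
    """Return the (host, status) pairs needing human attention."""
    return [(host, status) for host, status in statuses if status != "green"]

if __name__ == "__main__":
    board = [("web-1", "green"), ("db-1", "red"), ("cache-1", "green")]
    for host, status in triage(board):
        print(f"ALERT: {host} is {status}")
```

The point being that once the judgement is reduced to "not green means escalate", the watching itself is trivially automatable; what survives is the analysis that happens after the alert.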
[+] [-] redleggedfrog|7 years ago|reply
The problem isn't the technology, it's the complexity of the customer's business requirements, and their nearly complete inability to transfer those requirements into software without complex implementations that they could never hope to implement themselves. I would love to see more tooling to help with this. I have been waiting for 25 years. It gets better, but not nearly what can be described as an apocalypse.
[+] [-] F_J_H|7 years ago|reply
I've been using a fullstack low code development tool for several years now, and when it comes to developing CRUD apps or data reporting apps (with charts, interactive, drill down reports, etc. etc.), it's astonishing how quickly you can stand-up a secure, fully responsive web-app, complete with authentication, authorization schemes, report subscriptions, etc., without writing any code at all.
And, when you bump up against the limits of the declarative/low-code aspect of the framework, you can toggle over to JavaScript, your own CSS, SQL, etc., so it's not like you paint yourself into a corner.
So, I agree, if Amazon creates something like this, and it is as good as some of the existing low-code tools out there, it's going to have a big impact over the long term.
edit: typo
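The "declarative by default, code when you need it" pattern this comment describes can be sketched in a few lines. Everything here is hypothetical and illustrative, not any particular low-code product's API:

```python
# Hypothetical sketch of the low-code escape-hatch pattern: the report is
# declared as data (columns + simple filters), and an optional callable
# handles whatever the declarative layer can't express.
def run_report(rows, spec, custom_transform=None):
    # Declarative part: pick columns, filter by simple equality rules.
    selected = [
        {col: row[col] for col in spec["columns"]}
        for row in rows
        if all(row.get(k) == v for k, v in spec.get("where", {}).items())
    ]
    # Escape hatch: drop to real code when the spec runs out of road.
    return custom_transform(selected) if custom_transform else selected

rows = [
    {"name": "Ada", "dept": "eng", "hours": 40},
    {"name": "Joe", "dept": "ops", "hours": 35},
]
spec = {"columns": ["name", "hours"], "where": {"dept": "eng"}}
```

The design point is that the spec covers the common 90% while `custom_transform` keeps you out of the painted-into-a-corner failure mode the comment mentions.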
[+] [-] adamc|7 years ago|reply
If there are IT jobs developing for smaller organizations, maybe those will go away, but... I think a lot of that disappeared already.
I'm close to retiring (from this job, anyhow), so it's not a personal issue for me. I just haven't seen it happening as described.
[+] [-] strikelaserclaw|7 years ago|reply
[+] [-] Spooky23|7 years ago|reply
The reality is, you're going to have a million monkeys hitting a million keyboards, and very few will be producing Shakespeare. All of that crap will be running up lots and lots of AWS/Azure/etc. billing.
You'll need way more IT people to rationalize it. There are tens of thousands of people in the United States whose purpose for the last decade has been re-implementing the 90s version of this in formal IT systems. You will have churn as the legacy staff are purged, especially the Windows click-to-admin types.
[+] [-] michaelbuckbee|7 years ago|reply
In web development this is most apparent (to me) in SAAS application development, where many/most of the underlying pieces of building a CRUD application that can scale to thousands of users and be really functional are now provided by other SAAS apps, which provide a _better_ service than the average developer can scrape together themselves.
Billing -> stripe.com over writing against the gateways directly
Database/Hosting -> Heroku Postgres/Redis and compute
Email -> Sendgrid, Mandrill, ActiveCampaign
Or even just SAAS frameworks like BulletTrain (Rails) or Laravel Spark which dramatically cut down on the boilerplate and integration code you'd have to write.
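The structural shift described above is that the application becomes a thin layer of glue over delegated services. A minimal sketch of that shape, with a stub provider so it runs standalone (the names and signatures here are illustrative, not any real SDK's API):

```python
# Hedged sketch of the "glue over building blocks" approach: the app codes
# against a thin billing interface and delegates to a hosted provider
# instead of talking to card gateways directly. StubBillingProvider stands
# in for a real vendor client and just records the request.
class StubBillingProvider:
    def charge(self, customer_id, amount_cents, currency="usd"):
        # A real provider would make an HTTPS call here.
        return {"customer": customer_id, "amount": amount_cents,
                "currency": currency, "status": "succeeded"}

def bill_subscription(provider, customer_id, plan_cents):
    """App-side glue: delegate the charge, keep only the error handling."""
    receipt = provider.charge(customer_id, plan_cents)
    if receipt["status"] != "succeeded":
        raise RuntimeError(f"billing failed for {customer_id}")
    return receipt
```

Swapping the stub for a real vendor client is the whole integration, which is the comment's point: the developer's remaining job is the interface and the judgement, not the plumbing.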
[+] [-] Ruxbin1986|7 years ago|reply
Sure, some of them will still have the same or similar roles, but there will be a crunch. The large outsourcers will be hit overseas (Wipro, Infosys, etc.), but it will also impact administrators at medium-to-large-sized businesses in typical American cities, as Forrest mentioned. The worst part out of all of this is that too many colleges, and especially technical colleges, are still teaching networking and Linux or Windows administration as if students will be able to have a lifelong career in them. That is no longer true.
I don't want to imagine what it'll be like for those students who graduate, get good jobs (for now), take on a mortgage, and start raising a family, only to find themselves unemployed in the middle of their lives. I don't expect much sympathy from the largely meritocratic tech industry or anyone else.
As for myself, I already work for one of the big three and am part of many "cloud" migrations. I should be okay, but at the same time I am somewhat conflicted. Am I going to need to go back to school for Computer Science and become a fully-fledged software developer? I mean, it's fine, there's still enough time (I don't think we will really feel the burn for at least another 4-6 years), but is it reasonable or realistic that everyone needs to be a rockstar developer?
[+] [-] forrestbrazeal|7 years ago|reply
[+] [-] kamaal|7 years ago|reply
The real problem with these ready-made plumb-and-plug modules is that sooner or later they are either too slow, too expensive, or just a pain to refactor/redo. Eventually you come back and realize you need more granular control over things, and anything you are likely to come up with resembles a programming language.
I had this moment of realization myself while having to change a complicated graph in Pentaho Kettle a few months back. The graph looks bonkers hard and brittle; changing anything requires redoing all the dependent elements of the graph, and if you have a graph complicated enough, you will be forced to rewrite it. The real trouble is that there is no functional/unit testing with these things. And then you realize you are just better off with a full-fledged ETL language/programming language. The second problem I faced was running into performance issues. Want to change the sort algorithm? Running into heap space issues? Want better logging? Want a better threading model? All the best. Nothing is possible.
This is above and beyond the need for meta-programming facilities. At that point whatever GUI graph you draw is worse than any verbose code you will write.
Regarding programmable tools, we already have those. Vim, Emacs, and Microsoft Excel all give you a degree of meta control over the tool and what you want to do with it. But that's that, and it is often hard to bend these tools to your command.
These are just a few reasons why there won't be an apocalypse soon.
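The testability argument above is concrete: ETL steps written as plain functions compose into a pipeline and can be unit-tested in isolation, which a GUI transformation graph makes hard. A minimal sketch (step names and the normalization rule are illustrative):

```python
# Sketch of code-as-ETL: each stage is a plain, independently testable
# function, and the pipeline is just function composition.
def extract(source_rows):
    # Stand-in for reading from a file, database, or queue.
    return list(source_rows)

def transform(rows):
    # Example step: drop empty records and normalize names.
    return [{**r, "name": r["name"].strip().title()}
            for r in rows if r.get("name")]

def load(rows, sink):
    # Stand-in for writing to a warehouse table; returns rows written.
    sink.extend(rows)
    return len(rows)

def run_pipeline(source_rows, sink):
    return load(transform(extract(source_rows)), sink)
```

Because `transform` is just a function, a failing edge case becomes a one-line unit test instead of a rewiring of dependent graph nodes.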
[+] [-] droobles|7 years ago|reply
The job market will close up a bit, but right now tech is looking like the California gold rush, where 4-5 years ago any bootcamp grad could jump right into a web dev job (at least in my job market in the Midwest). I think if you continuously learn and remain marketable as the times change, then as a worker you will be fine. I also like the comment that mentions that you may just end up working for the cloud provider rather than the business application company.
[+] [-] mark_l_watson|7 years ago|reply
I manage a machine learning team and I also think that at least partially automated data curation and modeling will reduce the number of people required in my field. It might take 5 or 10 years, but I think it will happen.
I think you are spot on that IT and devops will take a hit. I look more at Heroku’s model than AWS and GCP as the future. That said, AWS and GCP will keep getting more ‘Heroku-like’.
[+] [-] thrav|7 years ago|reply
It’s a complete blind spot for most engineering minded people because they never realized how flexible the platform was, and with Bret Taylor running the show now, it’s miles away from just being a clunky Sales CRM.
Couple of examples of recent developments:
https://developer.salesforce.com/blogs/2018/12/introducing-l...
https://lightningdesignsystem.com
https://developer.salesforce.com/platform/dx
[+] [-] jimbokun|7 years ago|reply
I mean, if it doesn't, what the hell are you even doing?
The whole point of technology and modern capitalism is to increase automation, increase the amount produced by the same number of workers, and increase the overall amount of wealth in the world and improve overall living conditions for everyone (setting aside very important questions of distribution). I just find it odd people in the computer technology industry find this shocking or especially worrying.
"I look more at Heroku’s model than AWS and GCP as the future."
Google's App Engine was much closer to the Heroku approach, and the AWS approach won. So I will be pretty surprised if the Heroku approach wins out.
[+] [-] DebtDeflation|7 years ago|reply
[+] [-] diego_moita|7 years ago|reply
Better automation has been "reducing the number of people required to deliver technical solutions" for ages.
Local Area Networks replaced many mainframe computers in the 80's. Optimized C compilers took the jobs of countless Assembly programmers. WordPress, Joomla and better web frameworks (Django, Rails) took the jobs of many Perl/Web developers. Python enabled a lot of people to do what FORTRAN/Java/C++ programmers were able to do before.
"Apocalypse" is just the normal state of affairs.
[+] [-] te_chris|7 years ago|reply
[+] [-] vinceguidry|7 years ago|reply
Companies are always going to follow the latest trends and it's always going to take smart people to follow them. I'm not worried about my ability to make a living. I just can't wait to see what comes.
[+] [-] sudofail|7 years ago|reply
A couple years ago I made the switch to full time development. I now do most of the DevOps stuff for my teams, but from a developer role, instead of a sysadmin/cloudops role.
I'm certain that's going to be the future. Look at Google's requirements for SREs. They are full-fledged software engineers.