I dunno. On the one hand I hate “web dev” more than anyone. I think it has led to such an astronomical decline in software quality that if you described it to someone from the days when computers were 1000x slower, they straight up wouldn’t believe you.
That said… the article doesn’t really ring true to me. What he is saying about the complexity of each part of the stack (HTTP, HTML/DOM, CSS) is technically true, but that’s not really how it washes out in practice. This whole “CSS is a complex graphics engine!”, “HTTP is a protocol you could write a whole dissertation about!” framing sounds like an argument made by someone trying to score a rhetorical point about the web. In practice, for most of web dev you don’t need to understand the deep nuances of CSS or HTTP or whatever. Yes, there is a large breadth of material you have to learn, but the depth you actually need in any one area is much less than the author is trying to imply.
And yes, web is trash, but for different reasons. In fact some of those reasons are the opposite of what the author is saying. He says that each part of the stack is so complex it should be a separate specialty. But the real problem is the very fact that things are so complex. Rather than accept that complexity and subdivide the field into different disciplines, we should get rid of all this unneeded complexity to begin with.
He does also point out that CSS Grid and HTML tables haven’t changed. The web still mostly works the same.
You are yet another perfect example of raw antagonism against the web, a body of hate. You are legion. But, if we look at the arguments here, look at where complexity dwells, the things that are hard and changing aren't the fundamentals, aren't the essentials. They are not so complex.
What is hard/changing is state management. What is hard/changing is handling state in client-server or other connected architectures. What is hard/changing is being smart about offloading work to threads. And it's not like anyone else has conquered this complexity. None of the other ecosystems are particularly far in advance. The complexity of these cases seems to be inherent, not accidental.
The reason for so much complexity is because we change & improve & progress. This makes some people very upset. People drastically over-ascribe the woes of the software development world to the web, when really it's just that the web is now the default place for making software & most companies would bungle up these concerns no matter what platform they were building atop.
> On the one hand I hate “web dev” more than anyone. I think it has led to such an astronomical decline in software quality that if you described it to someone from the days when computers were 1000x slower, they straight up wouldn’t believe you.
Nearly all of this, IMO, can be explained by a lack of passion.
I grew up on computers, starting in the 90s. I didn't have internet access until near the end of the decade, and it was slow dial-up. If you broke the family computer, you had to figure out not only how you had broken it, but how to fix it. When I found Linux (Gentoo, obviously, because it's way more fun to spend days tweaking CFLAGS than to use the software), I was also thrust into forum culture, which was rife with RTFM. You quickly learn to search and read docs and demonstrate a modicum of capability and effort, or you lose interest and do something else.
This is not the case now. Even before the advent of LLMs, it wasn't that hard to find the answer to most of your questions on SO. The rise of cloud computing means that you don't have to know how to run a computer, you just have to know how to talk to an API – and even then, only at a surface level. You can pretend that TCP doesn't exist, have no idea how disks magically appear on-demand (let alone what a filesystem is), etc. Databases, which used to be magical black boxes that you accessed via queries carefully written by wizened graybeards, are now magical black boxes that you abuse with horrifying schemas and terrible queries. Worse, you don't even have to know their lingua franca, and can commit your crimes via an ORM, which probably sucks.
And for all of this abstraction, you are paid handsomely. The difficulty in landing your first job is tremendously high, sure, but the payoff is enormous. Once you're in, you'll find that the demands of most businesses are not to upskill, but to push features out faster. Grow the user base, beat others to market, and dazzle VCs. No one has time for doing things right, because that slows down velocity.
This is aided and abetted by Agile, specifically Scrum. Aside from maintaining the cottage industry of Agile consultancies, it's designed to turn software production into a factory, where no one person really needs to know how to do anything tremendously complex. Instead of insisting that people learn how to do difficult things, we spend hours per week breaking down tasks into bite-sized increments with absurd abstractions of time.
"Thought leaders" deserve a callout here as well for their contributions to this mess. Microservices are a great example. A potentially useful architecture in some circumstances has been turned into gospel that people buy into without question, and apply everywhere regardless of its utility or fit. If you're lucky, someone eventually notices that rendering a page seems to take a lot longer than it should, but more often than not this is met with a shrug, or at best blaming the DB and paying AWS for a larger instance. Multiple network calls that each have to traverse N layers of routing and queues are slower than calls within a single process? Color me surprised.
When you combine the allure of a high-paying job that has little barrier to entry with no business incentives to do things differently, you get what we have today.
Yes, it sounds nice in theory, but there are so many things to learn on both the front-end and back-end sides that I’m yet to meet a true “full stack web dev”, even if they all claim to be one.
Yes, a guy who’s done backend all his career can write some basic HTML and CSS and some JS. And some frontend guy can write some simple server-side code that writes to and reads from a datastore. But they’re not “full stack” in my eyes; there has to be some balance in terms of knowledge in both areas. 60-40 would be OK, but 90-10 is not.
When I joined the company I work for over a decade ago there were backend guys doing frontend; yes, they delivered something, but frontend quality was poor. Now it’s the opposite, frontend guys doing backend (and of course they don’t want to deal with SQL, so NoSQL it is); same thing.
Nothing beats a frontend and a backend dev (or multiple) working in tandem, IMO.
It's not remotely a "trend" though. Full-stack is how it has always been for many web developers since 1994. I have almost never been anything other than a full-stack developer, one way or another.
I did work at a dot com (well, a dot co dot uk) back in the 90s where I had varying jobs, and I was the front-end guy for arguably the two most successful projects (though they were successful because of who they were for, not the development; the back-end was a nightmare in one of them). We had to invent things.
Apart from that, I've always just done everything. And I'm good at it all. Slightly conservative or risk-averse after almost three decades, maybe. But still good, and still up to date and learning.
And I'm burned out and want to quit, or get away from the Web, or at least teach (which I've also done).
I don't think newer web developers necessarily understand the luxury of specialising in one part or another. A lot of us didn't get offered the choice.
(But then again, I'm shocked by how many newer developers lack basic competence that I think only comes from deeper understanding of the full stack. There are non-idempotent GET requests on this very website where I am typing.)
ETA: I think in a lot of small shops, developers still end up getting dragged across this divide through circumstance. The web does not really have a front-end/back-end divide, no matter how much recruitment managers, engineering team leads and tech bloggers would like it to have.
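To make the non-idempotent-GET gripe above concrete, here's a toy sketch (hypothetical handlers, not code from any real site): safe methods like GET must not mutate server state, so repeating the request should leave the server unchanged.

```javascript
// Toy illustration of the non-idempotent GET complaint. Hypothetical
// handlers only: the point is that safe methods (GET) must not mutate
// server state, so repeating the request changes nothing.
function makeStore() {
  return { items: [] };
}

// Anti-pattern: a GET that mutates. Two identical "reads" leave the
// server in two different states.
function badHandle(store, method, path) {
  if (method === "GET" && path === "/items/add") {
    store.items.push("new-item"); // mutation on GET: not idempotent
    return { status: 200 };
  }
  return { status: 404 };
}

// Correct split: GET only reads; POST does the writing.
function goodHandle(store, method, path) {
  if (method === "GET" && path === "/items") {
    return { status: 200, body: [...store.items] }; // no mutation
  }
  if (method === "POST" && path === "/items") {
    store.items.push("new-item");
    return { status: 201 };
  }
  return { status: 404 };
}
```

It matters beyond tidiness: crawlers, prefetchers, and retrying proxies all assume GET is safe to repeat.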
Like another commenter here, I've been full stack since my first working day, and remain so to this day, 1.5 decades into the industry. I've always touched on infrastructure, on the database, on the server side and on the client side. Vertical implementations of features.
I can't begin to imagine life as just a ... database guy, or a backend guy, or just a frontend guy. Perhaps everyone needs to accept that we cannot be as good at databases as someone who spends most of his time doing databases. But there are pros and cons to our kind of knowledge.
I can argue the pros and cons of SQL vs. NoSQL, to the limits of my ability, or argue for this frontend framework or that, or consider various languages or architectures for the backend, or discuss how we'll deploy the production version of whatever it is that we're building, or how we'll do development-side CI/CD, and so on and so forth. What am I? I'm open to the idea that I'm a fraud, but I like to consider myself full stack.
I have been saying this for years. My opinion has become even firmer since I moved into DB specialization – the horror show of schemata and queries that even dedicated backend teams dream up is unreal. I don’t even really blame them; relational databases are hard. Yes, anyone can install Postgres (or spin up a managed service) and get decent results for quite some time, but at scale you absolutely have to know what you’re doing. There is a reason that DBAs were a thing, and the SaaS industry is slowly realizing that it shouldn’t have abandoned that role.
Fullstack is not much of a trend. The word has been around for a while, but when you go look at the job market you can see that the industry doesn't buy these terms very much, and that it prices that way too. A fullstack salary might only eventually catch up to a frontend or backend one.
Also, backend is a far more ambiguous term than frontend. With frontend I'd almost want to ask if the next thing you're going to say is React. On the backend? No idea. Do you deal with distributed synchronization as your only specialty? Do you do billion-per-second event logging, warehousing, and querying? Do you write Kafka glue all day?
So when people say fullstack they really mean app making, and however much frontend, backend, or even janitorial work is required to make an app exist.
The frontend is just an eventually consistent node in a distributed system. Fullstack means you have a basic understanding of how the entire system works which is invaluable.
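To unpack "eventually consistent node" a little, here's a minimal sketch (all names hypothetical): the client applies writes optimistically for instant UI feedback, then converges to whatever the server eventually confirms.

```javascript
// Minimal sketch of the frontend as an eventually consistent replica.
// Hypothetical names; real apps layer retries, versioning, etc. on top.
function applyOptimistic(state, pendingItem) {
  // Render as if the pending write had already succeeded.
  return { ...state, items: [...state.items, pendingItem] };
}

function reconcile(state, serverState) {
  // The server is authoritative: converge the local replica to it.
  return { ...state, items: [...serverState.items] };
}
```

Most of the "hard" frontend state-management work mentioned upthread is variations on exactly this reconcile step.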
At the risk of sticking my head above the parapet, I've made a career doing both to a pretty high standard. Not just that: also server admin, DBA, desktop app development, and a pile more skills I've picked up from a quarter century doing this stuff. When you're not working for big companies, you have to fill the gaps.
Sorry you feel the people you've worked with are so inadequate, but these are muscles you need to exercise. If they have no opportunities to learn at work, of course they're not punching at the same weight. But there are plenty of us that do.
I don't often announce myself as a full stack dev though. Maybe that's the difference between me and your experience.
I disagree completely, the best are teams who are good at both. Separate teams of front and back are significantly slower and always seem to be at odds with each other. Teams should be empowered to build out a working feature without needing to coordinate with a second team's roadmap.
Please tell us why you think it's harmful. I don't see the harm in "too much to learn." That just sounds like software work to me.
When people specialize, the different specialties need to show their value, so they find greater complexity in their niche. The different specialties eventually independently "discover" many of the same concepts but explain them differently and their languages even diverge. I think losing the ability to share is a kind of harm.
I don't fully disagree, but this might be subjective.
For me as a full stack developer working with small teams/startup, I actually don't consider myself full stack. I just want to be able to do whatever it takes to make a product and ship features. Does it need a websocket server? I'll learn how to do that. Does it need advanced client side caching? I can do that, etc.
To some extent "product development" is both art and engineering. On the engineering side, you can think of HTML and CSS and HTTP and testing as different things requiring a multidisciplinary team... but if you think just in terms of "building the thing", I like to feel that I can get it done with whatever technology is needed. That's why I got into programming in the first place. Not to write code and be an "engineer", but to make the computer do cool things.
AI does expand the capabilities of someone who wants to get things done. I have written in languages that I don't have experience with, and recently was exploring a neo4j DB with Cypher queries written by ChatGPT (I have only MongoDB experience), something that would've otherwise taken me hours to learn.
Still, having experts in specific areas in a team can be very helpful (both to get technically difficult things done but also to have others learn from them).
I just don't want to be 1 part of a team specializing in my limited domain, where I start to care more about the technical part than the actual value delivered or user experience..
I think what might cause me to burn out is too much specializing, too much bureaucracy, people telling me I can't do this, that we need to hire a staff-level person to do this thing, etc.
> Framework skills are perishable, but are easily just as complicated as the foundation layers of the web platform and it takes just as much – if not more – effort to keep them up to date.
That's so true. For my own projects I try to not use any framework at all (or sometimes still use Backbone, which is entirely deprecated but simple, and that I know well enough).
But of course, as an employee many times you don't have a choice. I was recently part of an Angular team (Angular 2). That was one of my most unpleasant experiences. Angular seems to revel in complexity for complexity's sake. And it's often not needed at all. In that case it was used for displaying information that lives server side and is constantly updated there (live inventory). Why would they need a big client for that?
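For what it's worth, the "live inventory" case can be a handful of lines with no framework at all. A hedged sketch (hypothetical field names and endpoint):

```javascript
// Framework-free sketch of a live-inventory display. Field names and the
// endpoint are hypothetical; the point is that a template string plus a
// poll covers it -- no SPA framework required.
function renderInventory(items) {
  const rows = items
    .map((i) => `<tr><td>${i.sku}</td><td>${i.qty}</td></tr>`)
    .join("");
  return `<table><tbody>${rows}</tbody></table>`;
}

// In a browser you would wire it up roughly like this:
// setInterval(async () => {
//   const items = await (await fetch("/inventory")).json();
//   document.querySelector("#inv").innerHTML = renderInventory(items);
// }, 5000);
```

(Real code would escape the values before interpolating them into HTML; this is only a shape sketch.)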
I feel like the deskilling of web dev is that the web dev in this article doesn't feel competent enough to learn HTML, CSS _and_ JavaScript at the same time.
I'm curious about what the author of this article expects? That all APIs remain frozen in time forever? That new APIs are only something that happens in the frontend?
> The framework knowledge itself is also perishable. Not because your memory or physical coordination deteriorates (though that happens too), but because frameworks change more and faster than the underlying platform.
> But the React skills I have are all out of date and obsolete. I would effectively have to start from scratch even if I wanted to get back into React work. Everything React has changed in ways that are fundamentally incompatible.
I'm lost as to what the author is talking about. I've been using React for... 9-10 years now. There have been two API changes to React in that time – functional components and hooks. Neither of these is rocket science, and they can also be completely ignored; you can stick to your existing knowledge if you so choose. I don't feel like two API changes in 10 years is that radical.
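For anyone who does find hooks intimidating: the core idea fits in a dozen lines. This is a toy sketch of the concept, emphatically NOT React's actual implementation:

```javascript
// Toy sketch of the idea behind useState -- not React's real code.
// State lives in a slot array indexed by call order, which is exactly
// why React requires hooks to be called in a stable order every render.
const hookState = [];
let hookIndex = 0;

function useState(initial) {
  const i = hookIndex++; // this hook's slot for this render
  if (!(i in hookState)) hookState[i] = initial;
  const setState = (value) => { hookState[i] = value; };
  return [hookState[i], setState];
}

function renderComponent(component) {
  hookIndex = 0; // reset the slot cursor before each render
  return component();
}
```

Once you see that it's just call-order bookkeeping, the "rules of hooks" stop looking like magic.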
Find a few good tools, understand why they're good, use them extensively and pick up others when needed (i.e. when you switch jobs or face a new problem).
As a backend guy I uhh.. have long been of the opinion that web dev was always on the slippery slope of deskilling.
I remember working on a project where I was the lone backend guy doing data storage/retrieval/aggregation/caching/entitlement/etc., all behind a discoverable API that fed the UI everything it needed in a handful of calls.
Meanwhile the web dev guy took 6 weeks of iteration to create a date selector that wasn't awful.
Felt like they were too busy getting stuck gluing together frameworks they didn't understand and thus couldn't integrate well, and not simply writing a little code.
When he finally got it working, it was still awful. You had to select start date, start time, end date, end time.. and if you didn't proceed in the prescribed order the other selector boxes would reset and go wonky, lol.
Sounds like you worked at Squarespace... The date picker in their event-creation feature works just like that, among other problems with their date-selection support, like not handling recurring dates...
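The usual fix for that class of date-picker bug is to treat (start, end) as one value and validate the pair as a whole, instead of wiring each input to reset the others. A hedged sketch with hypothetical field names:

```javascript
// Validate the whole range regardless of the order the user filled the
// boxes in. Field names are hypothetical; inputs are Date objects.
function validateRange({ start, end }) {
  if (!start || !end) return { ok: false, reason: "incomplete" };
  if (end < start) return { ok: false, reason: "end-before-start" };
  return { ok: true };
}
```

The UI then only reports the validation result; it never has to clobber the user's other selections.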
My take on this is that we have multiplied the number of potential ways to build a web application continuously for decades. And now there are a combined 10,000 (pick a number) viable (but not trendy) ways to make a web application.
The interesting thing to me is that ultimately businesses don't care how it works. They just want it to work.
Which means that you can pick a small framework or two across the frontend and backend and configure or train an AI system on only a tiny fraction of the sum total of web development knowledge and have an effective automated web development system.
Look at the trajectory from GPT-4 to GPT-4o and Llama 2 to Llama 3, the prevalence of multimodality, improved reasoning ability as models get bigger, strong investment in hardware research, and so on.
I've been doing web development in some capacity since the late 90s and focused on leveraging generative AI for the last two years. I don't see how any reasonable person can follow this stuff closely and not anticipate AI systems that can literally do the entire job of a small web development team, within just a few years. It was actually possible to build a version of that two years ago, and some of the latest attempts are very polished, if lacking in some level of functionality. But that is coming.
Every single job that we have today will be automated. I assume this means that people will be left basically herding swarms of AI agents. For a few years. But it won't be very long before you really need an AI to control your agent swarms or even understand what they are doing.
An honest question: could someone explain why CSS is getting more and more complex (other than that Google and Co. want to protect their browsers' market share and keep control of it)?
It's not getting more complex, because the easy stuff you have always been able to do is still there, exactly the same, as easy as it's always been. There is more of CSS now, so you have more power to express styles for more mediums in better ways, but you only need to use what you want, so you don't need to make things complicated at all.
Every client should have the delivered site go through pagespeed.web.dev, and when there are 4 green circles around the four 100s, the web designer gets paid – provided the client likes the site. This is not an OR gate.
Hard to sympathise. It generalizes like mad. Stop being full stack. Have something you do well and you should have few problems in this industry. There's so much work and opportunity.
Many of them suck, though. I get hired as a troubleshooter usually: that pays the most and it’s nice – you get to see many companies, etc. I work with people who are hired for frontend, backend, etc.; a month or so in, they will be asked things like ‘so, devops, is that something you can do?’ And that is if it’s not in the actual job offer: ‘frontend expert wanted with 4000 years experience in everything’. I usually leave after 3-6 months as I ask a lot of money, but I have been asked to advise and help on Windows upgrades, printer emergencies and more.
It starts off articulate, and then I felt the arguments became less clear.
It seems perhaps ungrounded. It's nice that they have energy, in a way, but it also seems vaguely conspiratorial rather than emergent. I've nothing against people seeking a better lot in life and collaborating, or pointing out typical flaws and rent-seeking. I wish the perspective was easier to follow; I'm curious.
[+] [-] bdlowery|1 year ago|reply
Aka “I like living in 1999”
Lol
[+] [-] dj_mc_merlin|1 year ago|reply
[+] [-] mouzogu|1 year ago|reply
web dev does feel more like cheap labor work tbh. aside from a few exceptions. mostly tedious grunt work.
[+] [-] madeofpalk|1 year ago|reply
> The framework knowledge itself is also perishable. Not because your memory or physical coordination deteriorates (though that happens too), but because frameworks change more and faster than the underlying platform.
> But the React skills I have are all out of date and obsolete. I would effectively have to start from scratch even if I wanted to get back into React work. Everything React has changed in ways that are fundamentally incompatible.
I'm lost as to what the author is talking about. I've been using React for... 9-10 years now. There have been two API changes to React in that time: functional components and hooks. Neither of these is rocket science, and they can also be completely ignored; you can stick to your existing knowledge if you so choose. I don't feel like 2 API changes in 10 years is that radical.
[+] [-] calderwoodra|1 year ago|reply
It's really not that hard...
[+] [-] Havoc|1 year ago|reply
>HTML
>JavaScript
> in a sensible industry, would each be a dedicated field.
I get that webdev is a maze of frameworks, but that's just ridiculous.
[+] [-] threatofrain|1 year ago|reply
[+] [-] steveBK123|1 year ago|reply
I remember working on a project where I was the lone backend guy doing data storage/retrieval/aggregation/caching/entitlement/etc., all behind a discoverable API that fed the UI everything it needed in a handful of calls.
Meanwhile the web dev guy took 6 weeks of iteration to create a date selector that wasn't awful.
Felt like he was stuck gluing together frameworks he didn't understand, and thus couldn't integrate well, instead of simply writing a little code.
When he finally got it working, it was still awful. You had to select start date, start time, end date, end time.. and if you didn't proceed in the prescribed order the other selector boxes would reset and go wonky, lol.
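The wonkiness described above is the classic symptom of each input resetting its siblings on change. A minimal sketch of an order-independent range model (names hypothetical, not the actual component from the story) that normalizes instead of wiping state:

```javascript
// Range model: { start: Date|null, end: Date|null }.
// Accepts updates to either field in any order; if the range ends up
// inverted, swap the endpoints rather than resetting the other selector.
function updateRange(range, field, value) {
  const next = { ...range, [field]: value };
  if (next.start && next.end && next.start > next.end) {
    return { start: next.end, end: next.start };
  }
  return next;
}
```

With this shape the user can pick end-before-start, or revise either endpoint later, and the other selection survives; the UI only ever renders a valid range.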
[+] [-] BehindBlueEyes|1 year ago|reply
[+] [-] ilaksh|1 year ago|reply
The interesting thing to me is that ultimately businesses don't care how it works. They just want it to work.
Which means that you can pick a small framework or two across the frontend and backend and configure or train an AI system on only a tiny fraction of the sum total of web development knowledge and have an effective automated web development system.
Look at the trajectory from GPT-4 to GPT-4o and from Llama 2 to Llama 3: the prevalence of multimodality, improved reasoning ability as models get bigger, strong investment in hardware research, and so on.
I've been doing web development in some capacity since the late 90s and focused on leveraging generative AI for the last two years. I don't see how any reasonable person can follow this stuff closely and not anticipate AI systems that can literally do the entire job of a small web development team, within just a few years. It was actually possible to build a version of that two years ago, and some of the latest attempts are very polished, if lacking in some level of functionality. But that is coming.
Every single job that we have today will be automated. I assume this means that people will be left basically herding swarms of AI agents. For a few years. But it won't be very long before you really need an AI to control your agent swarms or even understand what they are doing.
[+] [-] thih9|1 year ago|reply
Speak for yourself, me and my team use a CI and always add or fix tests right before merging any PR.
(this is a joke, test suite is most useful when it’s part of the dev process and not an afterthought as I’m implying above)
[+] [-] brigadier132|1 year ago|reply
> These are all distinct specialities and web dev teams should be composed of cross-functional specialists.
I completely disagree and also thankfully this will never happen because it's completely impractical.
[+] [-] unknown|1 year ago|reply
[deleted]
[+] [-] zx8080|1 year ago|reply
[+] [-] brigadier132|1 year ago|reply
[+] [-] err4nt|1 year ago|reply
[+] [-] konfusinomicon|1 year ago|reply
[+] [-] onion2k|1 year ago|reply
[+] [-] mediumsmart|1 year ago|reply
Every client should have the delivered site go through pagespeed.web.dev; when there are 4 green circles around the four 100s, the web designer gets paid, provided the client likes the site. This is not an OR gate.
[+] [-] isaacremuant|1 year ago|reply
[+] [-] anonzzzies|1 year ago|reply
Many of them suck though. I usually get hired as a troubleshooter: that pays the most and it's nice, you get to see many companies, etc. I work with people who are hired for frontend, backend, etc.; a month or so in, they will be asked things like 'so, devops, is that something you can do?' And that is if it's not in the actual job offer: 'frontend expert wanted with 4000 years experience in everything'. I usually leave after 3-6 months as I ask for a lot of money, but I have been asked to advise and help on Windows upgrades, printer emergencies and more.
[+] [-] luke-stanley|1 year ago|reply
[+] [-] unknown|1 year ago|reply
[deleted]