I'd love to know your thoughts on what's changed for developers in the last 20 years.
1. We moved from client apps to web apps to mobile apps.
2. Client-server apps were for work; now there are a lot more "consumers" using apps.
3. Hence, design and simplicity have become important.
4. We went from waterfall to agile programming.
5. There's a huge proliferation of "stacks" and "fragmentation" of languages for specialized tasks.
What other trends do you see?
codeonfire|10 years ago
It's not about the work or profession any more. It's about people trying to optimize the shortest path to $150k/yr, and the industry has a certain game-show feel to it. No one has any idea what they would be doing at the job. Many places don't even put you on a team immediately, and there is a sense that nothing else matters in the workplace/contest other than the skills used to 'crack' the interview. Degrees don't matter, and there's no industry or work to be done outside of web or mobile app development either.
Imagine today if mechanical engineering was solely focused on not designing, but making specific types of flanges. People would go to a flange boot camp then read 'cracking the flange making interview.' They would interview where they would sit at a lathe and make an honest-to-god flange. If it were a good flange you'd get a call back and a $100k salary with 50k in stock over three years to make flanges. You'd start making flanges on Monday. One day you ask yourself "why are we making flanges again? Aren't machinists supposed to be making the parts? Why are we not developing new CFD techniques or new packages for structural stress analysis?" Oh yeah, venture capital is only funding flange making.
adventured|10 years ago
Databases for web services used to really suck back in the 1995/2000 time frame. They were either difficult to use, or very expensive. MySQL gradually killed the bottom 2/3 of the commercial database market between ~1998 and ~2005 or so.
Simple interactivity on sites is incredibly easy now. Applets and Shockwave were horrible, due to performance issues (systems and browsers), general bugginess, and lack of consistent support.
RAM and storage used to be really, really expensive. When Excite wanted to test out how well the first version of their search engine could scale, they paid $10,000 for a 10 GB hard drive (purchased by Vinod Khosla).
Using, controlling and embedding media on sites is almost an afterthought now, it's so easy. Real Media was satan.
Search was really bad until Google put AltaVista & Co out of their misery. Search spam ruled the day. The notion of just punching in a problem and finding a solution on Stack Overflow, ha.
Ten years ago, 20-50 KB still mattered when loading a site. Developers no longer need to obsess about the size of their site assets (though asset count and latency still matter), unless they're getting really crazy about it.
Language options are so much better today than in 1995/2000. You can now choose from numerous good options, pick whichever one works best for you, and you can be confident that so long as you use it optimally, you're unlikely to run into big problems unless you're dealing with hyperscale services.
Collaboration capabilities are... well, drastically better now. Github, Slack, really easy and cheap video conferencing, large file storage & sharing, sites like Stack Exchange, social networks, and so on.
Bandwidth is dirt cheap; that doesn't really need an elaboration.
The countless cloud services have made prototyping / testing super cheap.
The rise of the internet service API, following fairly standardized approaches, has made interfacing with vast amounts of data very easy, very precise and very cheap.
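To sketch what that standardization looks like in practice (the payload and field names here are hypothetical, standing in for what a typical JSON-over-HTTP service returns), consuming a modern API response takes only a few lines:

```python
import json

# Hypothetical JSON body, as a typical REST-style API might return it.
# Real services differ, but the overall shape (a JSON envelope with a
# status field and nested records) is fairly standard across providers.
payload = '{"user": {"id": 42, "name": "Ada"}, "status": "ok"}'

data = json.loads(payload)       # parse the JSON body
assert data["status"] == "ok"    # check the response envelope
user = data["user"]              # drill into the nested record
print(user["id"], user["name"])  # -> 42 Ada
```

Compare that to 1995-era screen-scraping or bespoke binary protocols: the hard part today is usually just reading the provider's docs.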
richardbrevig|10 years ago
> Real Media was satan.
I remember Real Player wanting $2,000+ per server license to be able to stream content. It was a perk in different hosting accounts; some featured the ability to stream.
bayonetz|10 years ago
Also yes, my Camel book (Perl) got passed around quite a bit.
MalcolmDiggs|10 years ago
The largest change I've noticed is the level of accessibility that programming has now. When I got started there were "Web Designers" who worked mostly in Photoshop and static HTML, and there were "programmers" who did something else, on special machines, with special training, but it was completely beyond the grasp of us mere mortals.
Nowadays programming is not something you need a degree or certification for, it's not something you need rigorous training for, or special equipment for, it's just something you can pick up if you feel like it (and get a taste for with little commitment or money or time spent). That's a huge change to me, and I think it's for the best. In this decade, we're seeing the definition of "literacy" expand, to include things like reading and writing programming languages...which is awesome!
hcarvalhoalves|10 years ago
I believe this has been true since the '80s, with cheap PCs running BASIC, Pascal, etc.
ridiculous_fish|10 years ago
Two years later I was rocking with Project Builder, which came with OS X Public Beta - at no extra cost! I credit the release of OS X, and the huge usability advances in Linux, for making the GNU toolset accessible to hobbyists and students. This barrier lowering is what enabled the Cambrian Explosion of programming languages we enjoy today.
1: http://www.macworld.com/article/1014313/19reviewscodewarrior...
matt_s|10 years ago
From my perspective it is the collaboration and depth of answers you can find online now. When I was programming 20 years ago I was in college and "the web" really didn't exist - there was email and networks, etc., but web browsing hadn't really gone mainstream yet. When I needed to learn something I had to find a book on it or actually attend a class - like programming EJBs in J2EE.
An example about frameworks/libraries: building web applications was very different since you had Netscape and IE (v4) supporting different and overlapping HTML, CSS and JS features. MVC and all the frameworks out now were really in their infancy or didn't exist 20 years ago. I was doing CGI scripts with Perl for some apps and there was a lot of heavy lifting compared to today.
Today you can probably have a scaffold/basic application (thinking in Rails) that is OS and device independent with something like Bootstrap, running on a VM, on the internet in about an hour. So it's much quicker to get to the point where you start adding value rather than spending enormous amounts of time on basic infrastructure.
RogerL|10 years ago
This may not matter to a lot of you, but computation has made a huge difference. I can run nonlinear solvers in seconds instead of hours. I have ready access to things like NumPy and SciPy to very quickly explore data in a way that used to require a supercomputer.
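As a minimal sketch of that kind of workload (pure Python here for self-containment; in practice you'd reach for scipy.optimize, which wraps far more robust solvers), a Newton iteration finds a root of a nonlinear function in microseconds on today's hardware:

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Find a root of f via Newton's method: x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)  # Newton step
        x -= step
        if abs(step) < tol:  # converged when the step is tiny
            return x
    raise RuntimeError("did not converge")

# Solve x**2 - 2 = 0, i.e. compute sqrt(2), starting from x0 = 1.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
print(root)  # ~1.41421356...
```

The point isn't this toy solver; it's that the whole explore-solve-plot loop now runs interactively on a laptop instead of as a batch job on shared iron.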
Hardware: this is partly covered by 'computation', but the kind of work I do would have been impossible back then. We generate 10+ terabytes of data a day, and that is laughably small to some of you. Some more hardware-related changes:
Bits. Anyone remember trying to fit things in 32K? You know, because integers were 16 bits long, and so were pointers. It was a major chore to just sort or otherwise manipulate data, and I made some significant wins by being clever about this sort of thing rather than just saying 'eff it, use a database' (which is what my competitors did).
Five-minute compiles for rather small applications.
Sneakernet.
Minimal version control. A lot of people didn't know what it was.
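To make the 16-bit constraint above concrete (modern Python used purely for illustration; stdlib ctypes emulates a C-style short), a signed 16-bit integer tops out at 32767, so counters, sizes, and offsets wrapped around the moment you got careless:

```python
import ctypes

# A C-style signed 16-bit integer, like the ints on the machines above.
n = ctypes.c_int16(32767)  # maximum value of a signed 16-bit int
print(n.value)             # 32767
n.value += 1               # one more increments past the top...
print(n.value)             # ...and wraps around to -32768
```

With pointers the same width, your entire addressable world was 64K at best, which is why fitting a working set into 32K was a real engineering problem.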
Off the hardware front, you mostly had the ability to understand your 'stack' (that word wasn't used that way in those days). You could more or less know the Windows API, inspect the assembly coming off of your C compiler, read the standard library, and pretty much have a picture of what was going on. These days it is much harder.
To put some of this in context: my first job, in 1988, consisted of computing cancer statistics. I'd carefully write a batch job. It'd go off to a supercomputer center. There, operators would (eventually) pull tapes off the racks and insert them into the tape reader. The job would run (eventually). The job would print to printers. A van would make runs several times during the day, delivering printouts. I'd go check the bins, and eventually my printout would show up. And so on. Part of my work was converting that to the PC, but we still had to do a lot of batch runs to massage the "huge" data inputs that the PC couldn't realistically handle. By '95 the situation regarding data handling was better, but still quite limited in both storage and computational capacity.
These days we can trivially handle pretty big problems on a PC. And if we can't handle a problem, well, there is AWS. It is just a different world in this regard.
imakesnowflakes|10 years ago