I had a background in Unix and Windows NT. I did both ASP.NET and PHP, switching back and forth between contracts for about 6 years depending on who offered the most cash. I now sit at a Microsoft consultancy, on a Mac, doing all the high-level technical design work and Unix bits the company needs (and occasionally digging the company out of the crap when they screw stuff up, the normal staff don't know what to do, and Googling it is beyond them).
Did I do the right thing? No. I should have skipped the Microsoft stack entirely. I say this after 12 years of .NET and a previous 6 years of COM/C++/Win32. It has been by far the most stressful platform to work on. The technology is terribly complicated, obtuse, non-orthogonal, and full of dead ends and surprises. It's also virtually impossible to cleanly automate anything, which means you need thousands of hands to push the buttons in the right order[1]. However, this pales in comparison to the culture, which almost universally (90%+ of outfits I've worked at) consists solely of grim factory-farm programmers who don't actually care what they are doing or understand what happens outside the ecosystem, and outsourcers who, whilst technically superior to the on-site staff, march around with loaded guns ready to shoot their own toes off (and anyone else dancing with them).
The only good bit was the cash.
I'd rather have picked up a 2003 osCommerce site than do it again.
Now I'm learning Objective-C and Cocoa to solve a few real-world problems in another non-technology market that IT rarely penetrates. I'm doing that on my own.
[1] This has finally improved, with PowerShell and PowerShell DSC for example, but a little too late.
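For readers who haven't seen it, the PS DSC the footnote mentions looks roughly like this: a declarative sketch of desired state that gets compiled to a .mof and enforced by the engine. The configuration below is a minimal, hypothetical example (the feature names are just common IIS ones), not anything from this thread.

```powershell
# A minimal DSC sketch: declare the state you want, compile it,
# and let the Local Configuration Manager converge the machine to it.
Configuration WebServer {
    Node "localhost" {
        WindowsFeature IIS {
            Ensure = "Present"
            Name   = "Web-Server"
        }
        WindowsFeature AspNet45 {
            Ensure = "Present"
            Name   = "Web-Asp-Net45"
        }
    }
}

WebServer                                      # emits .\WebServer\localhost.mof
# Start-DscConfiguration -Path .\WebServer -Wait -Verbose
```

That declarative, repeatable style is exactly the kind of automation the platform lacked for most of the years the parent comment is complaining about.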
Can't automate anything? That's just false. We have our entire build/deploy/test process automated. Hint: even our server build is automated, and gets rebuilt with every code change.
I don't understand the premise of this article. The OP made one technology specialization choice 13 years ago, and all he wonders is whether it was the right one?
I know little about the author so I can't really judge, but shouldn't the question have been "what if I had learned something new and very different every year, instead of every 13"?
I'm going to go with the stereotypical response that he did learn something new and different every couple of years as Microsoft kept changing directions with their development stack.
To be fair, Microsoft's web stack has not had the dramatic changes in direction that their native stack has had over this time period. Still, the response from his peers when he tried to bring in ideas from Rails is telling: the ecosystem he chose isn't just a nice IDE and a language.
OP here - I've made quite a few choices, but this (as mentioned) was a pivotal one - and yes, I wonder if I did the right thing. The MS ecosystem is very insular, so once you go down that road you're "all in" - the choice wasn't a casual one at all.
Honestly, if your thirteen year old technology choices are still having an impact today, then you're doing the wrong thing regardless.
I started out with PHP, got whatever job I could which involved Access and VBA, then VB.NET, then C#... now I use JavaScript, Python and occasionally Ruby.
That isn't to say that you should know six different languages. But you should know programming well enough that you can apply your skills wherever you need to.
Every choice you make impacts you today. What you learned in high school, college, and at your first few jobs all shapes who you are and what you know. I know programming and have pretty much the same skill set as you (minus Python). But I would have loved to be increasing my Ruby/JS skills during the "lost years" I spent doing ASP.NET.
>Honestly, if your thirteen year old technology choices are still having an impact today, then you're doing the wrong thing regardless.
Really? If you don't randomly flit from fad to fad you are "doing the wrong thing"? What if I made a good choice 15 years ago? Should I have stopped using PostgreSQL simply because I chose it so long ago?
Watching the things the ASP.NET team has accomplished inside such a politically poisoned organization as Microsoft over the last few years has been inspiring. I don't know how they managed to get things like "out of band" updates, open sourcing MVC, and so on accomplished in that environment. Makes me think there is hope for getting some similar changes through in my current "enterprise" environment.
Developing on the ASP.NET stack is nothing like it was in 2001, or even 2011. I used to loathe ASP.NET, but what MS has done has really emboldened the dev community to contribute some really great OSS solutions.
At my last job, we migrated from classic ASP to ASP.NET MVC, but we pretty much just left out all the horrible parts of ASP.NET (of which there are, of course, many).
Basically, if you nix WebForms and .aspx files and do your own templating (so, essentially, just use ASP.NET for routing and controllers), it's not awful. Visual Studio is nice and C# is surprisingly decent as a webdev language. But yeah. I'd much rather use Python or Ruby.
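A minimal sketch of that "routing and controllers only" style (the controller, action, and route here are made-up illustrations, not code from anyone's actual project):

```csharp
using System.Web.Mvc;

// Matched by the default "{controller}/{action}/{id}" route,
// e.g. GET /orders/details/5 -- no .aspx page, no ViewState.
public class OrdersController : Controller
{
    public ActionResult Details(int id)
    {
        var order = new { Id = id };   // stand-in for a real data-layer call
        return View(order);            // rendered by whatever view engine you've plugged in
    }
}
```

Nothing in that request path touches the WebForms page lifecycle, which is the whole appeal of the approach.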
I have been working on the MS stack for more than 14 years and still remember the decision I made to invest in learning the .NET platform. I certainly don’t regret my decision and maybe in hindsight I would have made some different choices along the way but being a .NET developer pays my bills and gives me plenty of satisfaction. For the last two years I've been doing more with the big three (JS, CSS3, HTML5) and I’m keeping my eyes on SignalR, Web and Mobile Development, AngularJS, and other "cool" technologies, but .NET is still paying my bills and I have no problem with that.
Why not do both? I did C/C++, then C#, and now C# plus PHP, JavaScript, or whatever is needed for the job. Sure it takes some ramp up, but not that much.
I wasn't a fan of the MS Web stack for a long time. ASP.NET MVC fixed that. It's still overly complicated but it gets better with each release.
From a language standpoint, it's night and day. C# is a real programming language. You can use it to write web apps, desktop apps, mobile apps, daemons/services--even embedded apps. PHP is a templating tool that has tried to evolve into a language and has been hampered in the process by horrible legacy. It's actually pretty decent now but it is still only useful for one thing: web apps.
I had to make this exact same decision in 2001 and I chose PHP!
I subsequently fell in love with Python and am still active with it today. I think a choice as far back as 2001 doesn't matter too much, programming is programming. However, I feel like I became a better programmer by having to deal with a language as shitty as PHP. There were so many pitfalls that I quickly became an expert at working around them -- this took a lot of painstaking investigation into the inner workings of the language.
>the Microsoft web world was (and to a large degree still is) all about "Visual Component Development". What that means is you basically do a lot of drag and drop
If you were doing it wrong, then maybe. Allowing "VB6" programmers to drag-and-drop stuff to make forms is still a fine way for them to make applications that run on a high-speed connection and just happen to be using HTML by accident. Being able to use a "datasource" was apparently valuable for some people, although I could never get it to work well.
Was it a bad idea to only focus on that and force the control model on everyone? Sure. It wasn't necessarily a terrible option for a lot of people. I did demos of ASP.NET to PHP developers, and some of them were pretty blown away by the idea of having events fire and being able to do something like "foo.BackgroundColor = bla" and have the whole page "just work".
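What those demos looked like, roughly, in a WebForms code-behind (a hedged sketch; `SaveButton` and `StatusLabel` are invented control names, assumed to be declared as `<asp:...>` controls in the page markup):

```csharp
using System;
using System.Drawing;

// Code-behind for a hypothetical Demo.aspx page.
public partial class Demo : System.Web.UI.Page
{
    // Wired up via OnClick="SaveButton_Click" in the markup.
    protected void SaveButton_Click(object sender, EventArgs e)
    {
        // Feels like desktop programming: set a property, and the
        // postback/ViewState machinery re-renders the page for you.
        StatusLabel.BackColor = Color.LightGreen;
        StatusLabel.Text = "Saved";
    }
}
```

To a PHP developer hand-assembling HTML and form handling in 2003, having the whole page "just work" like that was genuinely impressive.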
But being forced into the control model, with ViewState and all that, certainly didn't make ASP.NET competitive with other technologies for having total control over the experience.
Had MS gone "MVC" from the start and focused only on web technologies, and had someone else made the easy "forms for the web" toolkit and gained market share, we'd be saying how foolish MS was for not providing a similar toolkit.
I had the same reaction to ASP.NET 1.0. "Do they have to make this thing so damn hard?" I had a PO system to write, and didn't have the time to fart around with bloated datagrid components and reams of cumbersome XML.
I had the luxury of being a 1-man IT department, so I gave up and learned Zope, which led me to Python, and then Django. Had ASP.NET MVC existed at the time, no doubt I would still be writing code in C#. In retrospect, I'm so glad it didn't.
I was about 16 or 17, and I sat in front of my freshly reformatted computer with a copy of Windows NT4 and the latest version of Slackware at the time. I was going to do a full-partition install of one or the other...
Even at the time I realized it was a crossroads. My entire career and future pretty much grew organically from that point outwards. I don't think I would have had even remotely the opportunities had I not committed myself to open source.
Rob did some interesting and influential things in the .NET community which stems from exactly what he wrote about: his work with Ruby and Rails.
I've come from the Microsoft (Windows 32-bit API) to .NET Fx to Linux/Unix world in much the same way as Rob; ASP.NET to Rails.
There is a certain reality I have learned over the years: when I consult part time and don't mind sometimes boring work I earn twice as much writing C# than Ruby. The money is in the enterprise.
Of course at my day job I write Scala and figure out ways to get the most amount of Ruby [or sometimes Python] I possibly can into our day to day work. I love optimizing for productivity and happiness.
I read the whole blog post disagreeing with almost everything right up until the end when I read:
"This is the one shining negative here: .NET stunted my knowledge of HTML/CSS/Javascript."
This is the whole point of the blog post. And it is in fact true. But what isn't spelled out is that ASP.NET WebForms is what "stunted" your knowledge. WebForms made developers oblivious to how the web actually works. .NET, C#, and VB.NET aren't the problem.
Also, living in a Microsoft world can be insulating, but only if you aren't looking for a better way. "Alt.NET" was really a belief in outside-the-box thinking.
In 2001 - 2008 there was no ASP.NET "WEBFORMS". We had VS, CodeBehind, and Server Components - that was the world I lived in.
Yes, Alt.NET came along in 2009 or so and managed to do some good things - not soon enough, really - and all the silly flame wars actually did more harm than good.
JavaScript frameworks are going a long way toward eliminating some of the worst parts of .NET development. The ability to design C# as a service and use API calls as hooks to the back end has helped act as a catalyst for change.
"...early on in my MVP/Community days (2007ish). Microsoft hired some non-traditional devs to head up something in the ASP.NET group. Phil Haack, Scott Hanselman, Rob Conery all came on board and launched ASP.NET MVC which BLEW. ASP.NET. UP."
First, asking yourself if you did the right thing is silly; you made the best decision with the information you had at the time. To be honest, technology ebbs and flows. What's hot right now may not be hot in the future. I think the key skill is learning how to learn, so you can pick up a new technology quickly. Also, don't just focus on technology; understanding how a business operates is just as important!
No - go Java. ORMs, document generation, reporting, integration, architecture, deployment, tooling, devops stuff. Literally everything is ready to roll out of the box for $0. And it all works.
C# - half of this is available for a fee and the rest is available for a larger fee. And none of it works particularly well. And if anything breaks it's $200 a go to get it fixed.
Actually, I would think Java is a better technology to bet on if enterprise and neckties are your cup of tea. While things like AD, SQL Server and Windows are ubiquitous, there are lots and lots of companies that prefer Java over C#.
Yes?
If you're going open source, you should have gone Python or Node.