top | item 32539426

734129837261 | 3 years ago

Back in 2001, I started work as a web developer right out of high school, at the age of 17. My high school diploma wasn't good enough to get into a software engineering university in my country (the Netherlands), so I had to wait until I was 21 to take an admission test.

So I worked for 4 years before I got to a university and attended a one-day introduction. They told prospective students what they would learn over the next 4 years, and what jobs they would find when they were done. The day ended with a Q&A with some professors.

It was at that moment that I realised: 1. I know more than these professors do; 2. I'm currently a very skilled autodidact software developer; 3. I already know all of what they would teach me in four years; 4. they were working with outdated materials; they taught generics, not specifics.

These professors were academics. Google didn't exist yet. They, mostly, hadn't worked in any professional environment. They weren't pragmatic. They were slow perfectionists but also several years behind on the rest of the world.

And that was saying something: the bleeding-edge books that I was reading took at least 1 year from the start of writing to publication, so even I was behind on reality.

Even today I sometimes wonder what software engineering students learn in 4 or more years. It shouldn't take nearly that long. If you spend 20 hours a week studying software engineering you should be ready to find work in less than a year. And from that point onward, that's where you actually learn how to do it right.

yolovoe|3 years ago

In college, I was able to sample a lot of computer science: building a pipelined CPU in Verilog, algorithms, writing a multi-threaded OS, implementing an animation engine in OpenGL, quantum computing, machine learning (lots of theory and lots of practice), and group theory, to name a few.

I thought my degree was a bargain at the state school I went to. I also majored in math. Both CS and math had so many interesting classes that I found myself wishing school was 6 years instead of 4. Work is hardly that cutting-edge compared to what we learned in school, which would cover the latest stuff in the literature in some classes.

Most of all, I learned in college that getting stuck on problems is normal. You have to be patient, spend a lot of time, and slowly make progress. That helps me immensely in my current job, especially when debugging complicated problems.

drdec|3 years ago

> Even today I sometimes wonder what software engineering students learn in 4 or more years. It shouldn't take nearly that long. If you spend 20 hours a week studying software engineering you should be ready to find work in less than a year. And from that point onward, that's where you actually learn how to do it right.

This is the difference between college/university and a coding boot camp. At college, they are trying to teach you a breadth of subject matter and experiences to turn you into a well-rounded, educated person. At a coding boot camp, they are giving you vocational training and nothing more.

Each approach has its benefits and drawbacks and neither is appropriate for every situation.

I'm glad you realized that for you, college did not have a benefit, and you saved yourself a great deal of time and money.

Calavar|3 years ago

Oh man, I remember this attitude from a lot of my classmates back in undergrad. "Why are we using Java like dinosaurs? All the jobs are in Ruby/Rails!" (Today it would be Node/TypeScript instead of RoR.)

It really amazed me how many students didn't see the forest for the trees. Sure, the college could teach us RoR, but five years later it would be something else. And sure enough, five years later it was all about Node. And five years from now it will be something else again.

TypeScript, Node, RoR, and so on are all just icing over the same underlying core concepts that have stood the test of time. Learn the concepts, and you will be an expert regardless of whichever icing is on top.

When I took our databases course, our professor gave us problem sets with long lists of ridiculously complicated things that we had to write queries for in relational calculus. The problems always seemed so contrived. And why the hell were we writing them in some stupid mathematical notation instead of code?
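For readers unfamiliar with the notation: a tuple relational calculus query declares which rows you want, not how to fetch them. A minimal sketch, with an invented `Employee` relation for illustration:

```
{ t | t ∈ Employee ∧ t.salary > 50000 }
-- the declarative ancestor of:
-- SELECT * FROM Employee WHERE salary > 50000
```

The payoff of practicing in that notation is that joins, subqueries, and set operations stop looking like syntax and start looking like logic.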

But when I started my first job, I found that I had a much better understanding of how and when to use joins, derived queries, and subqueries than some of my colleagues, who used "where in" clauses everywhere. And if they got boxed into a corner, they queried a huge chunk of data, brought it all in over the wire, then used a soup of procedural loops and ifs to filter out what they wanted. Unsurprisingly, their code wasn't very performant and was filled with bugs.
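The contrast can be made concrete with a small sketch using Python's stdlib `sqlite3` and two hypothetical tables (the table and column names are invented for illustration):

```python
import sqlite3

# In-memory demo database with two hypothetical tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, country TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'NL'), (2, 'US'), (3, 'NL');
    INSERT INTO orders VALUES (10, 1, 50.0), (11, 2, 80.0), (12, 3, 20.0);
""")

# The anti-pattern: pull whole tables over the wire, filter in app code.
customers = conn.execute("SELECT id, country FROM customers").fetchall()
orders = conn.execute("SELECT customer_id, total FROM orders").fetchall()
nl_ids = {cid for cid, country in customers if country == 'NL'}
slow_total = sum(total for cust_id, total in orders if cust_id in nl_ids)

# The same question as one join: filtering and aggregation stay in the
# database, and only a single number crosses the wire.
fast_total = conn.execute("""
    SELECT SUM(o.total)
    FROM orders o JOIN customers c ON c.id = o.customer_id
    WHERE c.country = 'NL'
""").fetchone()[0]

assert slow_total == fast_total == 70.0
```

With three rows either version works; with millions, the app-side version ships the whole table to the client before discarding most of it.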

I ran into a similar thing when I got into an argument with a guy about JS on the server. He said JS was revolutionary because it allowed for async IO. And I said, what's new about that? You could do that in Ruby too. The guy refused to believe me. He legitimately thought that because Ruby didn't have an "async" keyword, it couldn't do async IO. He knew the syntax sugar du jour on top of async concepts, but he didn't understand the concepts underneath. If fads move on from JS to a new language with a different async programming model, what is he going to do?
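The underlying concept is readiness-based multiplexing, which predates the `async` keyword in every language that has one. A minimal single-threaded sketch using Python's stdlib `selectors` module (not the commenter's Ruby, but the same idea): two sockets are serviced by one event loop with no `async` anywhere.

```python
import selectors
import socket

# One event loop, one thread, no `async` keyword: the loop simply asks
# the OS which sockets are ready and handles whichever comes first.
sel = selectors.DefaultSelector()
results = {}

pairs = []
for name in ("a", "b"):
    r, w = socket.socketpair()
    r.setblocking(False)                       # readers never block the loop
    sel.register(r, selectors.EVENT_READ, name)
    pairs.append((w, name))

# Writers send in reverse order; the loop doesn't care about order,
# only about readiness.
for w, name in reversed(pairs):
    w.sendall(name.encode())
    w.close()

while len(results) < 2:
    for key, _ in sel.select():                # wait for any ready socket
        data = key.fileobj.recv(16)
        results[key.data] = data.decode()
        sel.unregister(key.fileobj)
        key.fileobj.close()

assert results == {"a": "a", "b": "b"}
```

`async`/`await` is ergonomic sugar over exactly this kind of loop; Ruby's `IO.select`, C's `poll`, and Node's libuv all sit on the same mechanism.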

You can learn SQL or Node from online tutorials or a coding bootcamp. And it will feel more useful than a college course because they give you concrete examples right away. But they will only teach you the surface dressing. They won't push you to understand the tough underlying concepts because that isn't easily done in a single article or a three week crash course.

tomjen3|3 years ago

I went through the normal course, and the programming part wasn't that hard; those of us who could already program got to skip it. Those who hadn't programmed before learned what we could do in about a year.

The classes that destroyed people were Algorithms and Data Structures, distributed/parallel computing, programming language design, OS design, low-level hardware design ("here are infinite transistors and infinite resistors, now go build a computer"), and whatever the two classes that covered Sipser's Introduction to the Theory of Computation were called.

These were all classes that covered material you wouldn't ever hit upon while programming, but which is necessary to know as a computer scientist.

Then there were all the classes that were even tangentially related to human-computer interaction, which were entirely a waste of everybody's time, including the instructors'.

legacynl|3 years ago

Although I get where you're coming from, I think you're taking a big risk by assuming you already know everything there is to know. The fact is that you can't know what you don't know. You could be Dunning-Kruger-ing yourself on a daily basis and there would be no way for you to tell.

> These professors were academics. Google didn't exist yet. They, mostly, hadn't worked in any professional environment. They weren't pragmatic. They were slow perfectionists but also several years behind on the rest of the world.

Maybe you're blinded a bit by your arrogance, because there's an actual field of science dedicated to effective learning, teaching, and practice. Although it's great that you found something that worked for you, that doesn't mean you had the best or most optimal learning experience. Every teenager thinks they're smarter than their stupid, dumb teachers, but they often aren't.

There's a reason why things are taught in a certain manner, and why that doesn't change much: these methods have been tried and tested, and there's no need to chase every new framework, method, or technology, because it's all built upon the old stuff anyway.

These courses are meant to give you a broad understanding of everything there is to know about computer science. Specifics change, but generics don't. If you know the generic things it doesn't matter what the specifics are.

lupire|3 years ago

This is going too far.

College lectures exist for two reasons: 1) because books didn't exist 2000 years ago, and 2) it's the only thing 1 teacher can do when stuck with more than 20 students at the same time.

Modern research shows how bad lectures are, but many colleges are still lecture-focused because of tradition.

lolinder|3 years ago

Like a sibling comment, I find my degree to have been well worth it. I did know everything I needed to find work before I even started—I got a job in my second semester that I held all four years. But college gave me perspective on just how much there is that I don't know, in computing and in every other field. Most of those things are things I will never learn and use, and that's okay. It's valuable to me to know what's out there, and there have been many times where I've come across a problem and known what to research to solve it because of a college class I've taken.

So, yes, college isn't about career training. But life isn't about career. I know it's not everyone's experience, but for me, college made me better at life.

Beldin|3 years ago

> My high school diploma wasn't good enough to get into a software engineering university in my country (the Netherlands), so I had to wait ...

I also hadn't heard of the (Dutch) Open Universiteit in 2001, but it allows anyone over 18 to start an academic study.

If you're a successful autodidact, you probably have at least a reasonable "academic attitude" - the main difference between a university and vocational training (MBO / HBO).