item 2898790

Apprenticeships - Employers Must Get Past Degree Snobbery

101 points | csjohnst | 14 years ago | codemanship.co.uk

85 comments

[+] brudgers|14 years ago|reply
Posing Apprenticeships as a significant alternative to higher education is delusional. First because a true apprenticeship requires a contractual relationship between the apprentice and the master. That sort of contract is extremely problematic to modern business entities because it does not allow flexibility in staffing levels, requires a significant investment in training (several years), and because indentured service is generally frowned upon and body warrants are hard to obtain these days -- there is little viable recourse should the apprentice terminate the apprenticeship early.

In addition, apprenticeship is difficult because it easily runs afoul of equal opportunity expectations and requirements (in the US). The difficulty of differentiating between individuals and determining each person's unique skill set before they are on the job is the reason all soldiers go through the same basic training and are then assigned to their specialties (of course one could argue that military skills are often determined beforehand - but that is an argument for prior training (college), not against it).

Indeed the significant latitude of the military is an indication of what is needed to create any semblance of a workable apprenticeship program which provides equal opportunity on a large scale - an organization where meritocracy is highly advantageous both to the organization and the individuals who lead it, extreme prejudice in the enforcement of contracts (e.g. execution for desertion), and very one sided contracts (e.g. imprisonment for AWOL).

Modern higher education has grown because it offers such a powerful solution to many of the problems created by apprenticeship particularly lack of equal opportunity, exploitation of apprentices, diversion of resources to training and away from profit making activities, and long term commitments to particular individuals who may not be suited for the profession.

[+] gfdgfdgfd|14 years ago|reply
"Posing Apprenticeships as a significant alternative to higher education is delusional."

This is absolutely absurd and just shows pure ignorance of economic realities beyond your grasp. Germany thrives on apprenticeships. You can't find a job in Germany without having an apprenticeship under your belt. Students spend between 50% and 70% of their time at a company, while the rest is spent on traditional education. Apprenticeships are a vital part of Germany's economy. Dismissing them based on your pet theories is bordering on asinine.

http://en.wikipedia.org/wiki/Apprenticeship#Germany

[+] michaelfeathers|14 years ago|reply
This isn't just a hypothetical approach. In the Software Craftsmanship community, consultancies have run apprenticeship programs for years with very good results. Scaling to large businesses might not be easy, but they certainly aren't the only game in town.

I'm proud of my CS background and I wouldn't trade it, but it's interesting to work next to terrifyingly bright guys in their 20s with deeper grasp of design and modern technology than most CS grads.

[+] imperialWicket|14 years ago|reply
Good points regarding the shortfalls of a proper apprenticeship in this instance (particularly in the US).

However, I strongly disagree that modern higher education does much to combat exploitation of new grads, diversion of resources to training, or long-term commitments. Higher education often fails quite admirably on these points as well.

[+] jackpirate|14 years ago|reply
The difficulty in differentiating between individuals and in determining each person's unique skill set before they are on the job is the reason all soldiers go through the same basic training and then are assigned to their specialties

At least in the US, that's definitely not how it works. Enlisted folk take the ASVAB before even joining. Based on those scores, they select what job they want to do before even signing on the dotted line. Recruiters are infamous for lying to enlistees about what jobs are actually like, or about whether enlistees will be able to change rates after/during bootcamp. In my experience (as an officer), most enlistees feel cheated by the system, but must "suck it up" because it's the military and they've signed away their life.

Officers join either via ROTC/Service Academy or OCS. With OCS, all officers are assigned their specialties before signing on the dotted line. (Imagine if a doctor had to join without being guaranteed a spot as a doctor!) In ROTC or a Service Academy, officers have 4 years of training before they select which community to join. About 1/2 the officer corps joins this way, so about 5% of the overall military.

[+] barrkel|14 years ago|reply
Higher education in countries like the US, where it usually means taking on loans, already turns into a form of indentured service - servicing the loan, that is, a loan that can't be discharged even through bankruptcy.

Could an apprenticeship be structured like an education loan, but one that the company reduces the balance on when you stay with them?
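That structure can be sketched as a simple balance schedule. This is purely illustrative: the £25k principal, 5% interest rate, and £5k-per-year forgiveness are assumed numbers, not figures from the thread.

```python
# Hypothetical apprenticeship loan: interest accrues annually, and the
# employer writes down part of the principal for each full year of service.
def remaining_balance(principal, rate, forgiveness_per_year, years):
    balance = principal
    for _ in range(years):
        balance *= 1 + rate              # interest accrues on the balance
        balance -= forgiveness_per_year  # employer forgives a fixed chunk
        if balance <= 0:
            return 0.0
    return balance

# With these assumed terms, the balance reaches zero during year six,
# so an apprentice who stays six years owes nothing.
print(remaining_balance(25_000, 0.05, 5_000, 6))   # → 0.0
print(remaining_balance(25_000, 0.05, 5_000, 1))   # → 21250.0
```

The forgiveness term is what distinguishes this from an ordinary education loan: leaving early leaves the apprentice with the remaining balance, which gives the retention incentive barrkel describes without anything resembling indenture.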

[+] johnford|14 years ago|reply
To your point about military specialties, the range of military occupational specialties that an enlistment candidate is qualified for is determined by the enlistee's ASVAB score at the time of enlistment, not their performance during training. It is true that during training, enlistees undergo further testing (DLAB, etc.) to identify candidates for specialized and rare positions like linguists. And some lucky, high-scoring enlistees are given their choice of jobs prior to enlisting as an incentive for signing on the dotted line. But the bulk of enlistees are already predestined for an assignment, based on the needs of the service (body count, not test scores), the day they step off the bus for basic training.
[+] iqster|14 years ago|reply
Your third paragraph reminded me of grad school!
[+] DanielBMarkham|14 years ago|reply
Here's my rough criteria when selecting applicants for technology roles:

1) What have you done lately that is like what I want you to do?

2) What is your attitude like? (past references very important here)

3) Have you taken some test or certification (or can we give you one) that demonstrates skills in areas we might be concerned about?

From there, perhaps you can start learning, that is, it might be worthwhile to talk about a position. But all of that factual and procedural knowledge will be put to the test when you are inserted into our actual environment where your social skills are going to have as much to do with your value as your technical skills.

None of that involves a college degree (unless the job duties and environment mimic the college experience), and it all fits nicely into some kind of apprenticeship program. Yes, there can be a lot wrong with apprentice programs, but "apprentice program" is a very, very broad term. The trick is going to be in the setup and execution of the program.

I freely admit that we apprenticeship supporters wave our hands around a lot while yelling "apprenticeships! apprenticeships!" without providing necessary detail. But I really feel that under this rubric is where the eventual solution will lie. We need to bring education down to be as close to the actual work environment as possible. We need more rapid feedback loops in education and more specific, tailored in situ instruction. Apprenticeships do this.

Note that there is another topic -- the importance of a classic liberal education -- which I am a huge supporter of. But I think we have mixed up two concepts: things that directly translate into money for me and my family and things that make me a better overall person. Both may or may not be important to a particular person, but by mixing up the terms and lumping them all under "college education" it has confused the education argument to a terrible and unnecessary degree. This confusion is what is at the heart of the seemingly-unsolvable education discussion.

[+] brudgers|14 years ago|reply
What a college degree does is provide a more equal opportunity for people to obtain a minimum qualification. Let's face it, Ben Franklin's apprenticeship as a printer for his brother more or less typifies the way in which people have obtained apprenticeships - through close personal connections between the master and the apprentice. This is unsurprising when one considers the degree of upfront investment by the master entailed in taking on an apprentice. Those who can and are willing to take a street waif under their wing are few.

I will add that calling mentorships and internships "apprenticeships" does not make them such. And an apprenticeship requires a formal written commitment not only to teach the apprentice how to do their job, but to teach them how to be a master.

[+] argv_empty|14 years ago|reply
I freely admit that we apprenticeship supporters wave our hands around a lot while yelling "apprenticeships! apprenticeships!" without providing necessary detail.

A lot of those details could be worked out along the way. I'd be more excited to see the supporters provide actual apprentice programs than details on how everyone should operate such a program.

[+] ajkessler|14 years ago|reply
Many employers use a university degree as a proxy to judge whether or not you're employable, not whether or not you are qualified. As the article rightly points out, it's easier than ever to get a college degree, both because there are more universities offering them than ever before, and because the requirements to get those degrees are lower than ever before. Obtaining a degree merely shows whether or not you have the minimal foresight and work ethic required to be admitted to a university, and the even more minimal determination and resilience to actually get the degree. Thus, many employers think "If you can't muster a degree, in this lax environment, you're probably not going to be a good employee."

That said, if you can prove you have that determination, work ethic, and competence to do the job, many employers wouldn't give two shits if you never got an otherwise meaningless degree. The problem is, there aren't a lot of other great proxies to demonstrate those qualities, especially proxies that would save your resume from getting tossed immediately.

As to the "degree snobbery" thought, it makes a lot of sense for some employers to use universities as their recruiting system. Think about law firms that only hire Harvard law grads. Snobbery at its worst, right?

Well, say Harvard Law gets 10000 applicants each year. This group is already self-selected to a certain extent, because most people who don't have a shot at getting in don't even apply. Next, Harvard only selects the most elite candidates (those with perfect scores only have about a 50% chance of getting in). So, anybody who makes it out of Harvard, even if they're at the bottom of the class, has already been screened extensively. If an employer just picked totally at random from this Harvard pool, he's got an excellent shot at picking a great employee, because the barrel he's choosing from has already been screened for him. This can hugely reduce the amount of effort an employer needs to put into making a hire, and so, even if lots of qualified candidates are overlooked, such a system might still make great sense.

[+] dman|14 years ago|reply
I think the idea of setting time aside in your youthful life where you invest in yourself and expand your horizons is a powerful one. It makes the world a more interesting place and pushes human achievement. To what extent a university helps with this might be debatable and will depend on the person and the university, but I wonder if an average person would earmark 4 years of their life to learning and improvement if left completely to their own devices.
[+] shrikant|14 years ago|reply
Patio11 said it best (re: "do I have to go to college?") http://news.ycombinator.com/item?id=1182552

edit: just re-read your comment, and I realise you're not necessarily taking sides and merely speculating if the average person can get by without the investment into college. I agree so strongly I would upvote you twice.

[+] kayoone|14 years ago|reply
Well, computer science is not only about programming. Programming is a big part of it, and people without a degree might be good programmers, but have they really gone through all the math and theory by themselves, or just skipped it altogether? For example, most people without a degree might not know what Big O notation is, how it works, or why certain data structures are better than others.
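The data structures point shows up in a few lines of timing code. This is a sketch: the collection size and repetition count are arbitrary choices, but the asymptotic gap is what matters.

```python
import timeit

# Membership test: an O(n) linear scan of a list vs. an O(1) average-case
# hash lookup in a set, for the same 100,000 elements.
n = 100_000
data_list = list(range(n))
data_set = set(data_list)

# Search for an element near the end - the worst case for the list scan.
needle = n - 1
list_time = timeit.timeit(lambda: needle in data_list, number=100)
set_time = timeit.timeit(lambda: needle in data_set, number=100)

print(f"list: {list_time:.4f}s, set: {set_time:.6f}s")
```

On any ordinary machine the set lookup is faster by several orders of magnitude, and the gap widens as `n` grows - exactly the kind of reasoning Big O notation formalizes.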

So it depends a lot on the work someone is going to do. Relatively simple programming tasks? No degree required. Working on something that scales to millions of users or has to run with exceptional performance? A CS degree would at least tell me the candidate has learned the theory required for this.

Generally, it's easier for a good CS grad to get really good at programming than to make a programmer comfortable with all the theory.

Try to get into any of the top software companies in the world without a degree and I wish you good luck (not impossible, though).

I say this as someone who quit halfway into his BSc to start a company, btw ;)

[+] nicpottier|14 years ago|reply
My experience interviewing says that a great many CS graduates still have no idea about Big O, or don't even really understand hashtables. That's from years of interviewing quite qualified candidates coming into AMZN, BTW.

They might have passed the class and test where that was covered, surely it WAS covered, but they didn't actually retain it in a meaningful way.
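For what it's worth, the hashtable concept those interviews probe fits in a few lines. This is a toy sketch, not anything from the thread: the class name and bucket count are made up, and a real table would also resize as it fills.

```python
class ToyHashTable:
    """A minimal separate-chaining hash table, for illustration only."""

    def __init__(self, n_buckets=8):
        self.buckets = [[] for _ in range(n_buckets)]

    def _bucket(self, key):
        # hash() maps the key to an integer; modulo picks one bucket.
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                  # key already present: overwrite
                bucket[i] = (key, value)
                return
        bucket.append((key, value))       # average O(1) insert

    def get(self, key):
        # Scan only the one short chain for this key, not every item.
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)

t = ToyHashTable()
t.put("big-o", "O(1) average lookup")
print(t.get("big-o"))  # → O(1) average lookup
```

The whole trick is that `hash()` narrows the search to one bucket, so lookups stay near constant time as long as the chains stay short - which is the kind of understanding the interviews above are checking for.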

I'm a dropout and worked at AMZN for a few years BTW. I don't think I would have had any problem working at Google or Amazon if I had wanted to either.

Granted, I probably couldn't have worked at those companies as my FIRST job, but that isn't the argument. The argument is who is better off after four years, someone who attended university to get a CS undergrad, or someone who worked in the trenches.

I'd say if there were more good apprenticeship positions available the latter would almost always be better.

[+] Permit|14 years ago|reply
I agree. The co-op program I'm in allows me to alternate between school and work every four months. While I've found I learn more practical knowledge on the job, I've also noticed school forces useful concepts on me that I wouldn't have learned otherwise. This includes assembly-level programming and the theory and math that you've mentioned.

I think the decision to enroll in university or to abstain is entirely dependent on the degree.

[+] astrec|14 years ago|reply
I think your point is generally true of computer science graduates, but there are a great many wishy-washy vocational IT degrees for which employer and employee would both be better served with an apprenticeship.
[+] rdouble|14 years ago|reply
Which courses teach how to make something that scales to millions of users, or run with exceptional performance?
[+] csjohnst|14 years ago|reply
I'm of the opinion that a university degree is less about learning the subject matter, and more about teaching you how to learn. So a programmer who has not completed a degree may be a fantastic programmer, but one who has done a degree may have a few extra skills on top of simple programming skills.

I.e. Communications skills, analytical thinking, knowing where to look for solutions, how to ask the questions that improve your knowledge and the proven ability to see through a project etc...

[+] arethuza|14 years ago|reply
When I completed a CS degree in '88 I remember thinking that what it was really doing was lining you up to possibly go on to do postgraduate research - which I did eventually do. If you aren't going to be doing something that is vaguely like research then I'm struggling to see the relevance of CS degrees for most development jobs.

Universities are really rather splendid places for research and absolutely awful at vocational training!

[+] guard-of-terra|14 years ago|reply
You don't learn the same way you did fifteen years ago. You don't go to a library. Instead, you google. And try. And fail, and try more, and google more.

Maybe you still run into one fundamental problem or two where you need to read something hard. But otherwise, the "how to learn" thing is more obsolete than the other things claimed.

[+] St-Clock|14 years ago|reply
So the solution to the increased cost of education is apprenticeship? Is this part of the current trend of bashing higher education to make sure that we become obedient but efficient drones?

There are a lot of things broken in higher education, but saying that hiring a CS/SE graduate for a developer position is like hiring a theoretical physicist to repair a car is disingenuous at best.

Guess what: graduate students who write a compiler as part of their course, who are working on a VM for MATLAB, who are improving IDE auto-complete based on all sorts of algorithms, who are devising a new distributed merge algorithm and evaluating its performance through hardcore network simulation - well, they know how to program! As a bonus, they know how to apply the scientific method and be rigorous when they report a result or an improvement. They have been exposed to all sorts of things whose existence an undergrad doesn't even suspect.

Sure, some grad students aren't good. Sure, people who don't go to college/grad studies can end up being way better and knowing more than grad students. But don't discredit a degree because you believe it's too theoretical. Just ask about the homework, the projects, and the thesis the grad student worked on.

[+] absconditus|14 years ago|reply
"Is this part of the current trend at bashing higher education to make sure that we become obedient but efficient drones?"

No, the goal is to remove vocational training from higher education and return it to its supposed goal.

The graduate student that you describe is rare. A masters degree in CS typically means that the person took more random computer courses and knows no more about software development. What you fail to realize is that nearly anyone who wants a masters in CS can shop around and find a school that is willing to accept them.

[+] falcolas|14 years ago|reply
A graduate student is a bit different from a student just getting a bachelor's degree, which is where this article is focusing.
[+] ig1|14 years ago|reply
Imagine you offered a job position of "software apprentice" and got a thousand responses. How would you identify the ones worth gambling on? There would be little to nothing to differentiate candidates.

By and large you'll have candidates who didn't do well at school (because those who do well at school tend to go to university), have never done anything to show that they're capable of long term commitment, and have never done anything significantly intellectually challenging.

CS & SE courses at top universities which get to have the pick of the best students still have a first year drop-out rate of 10-20%.

Any company offering software apprenticeships can expect to suffer huge drop-out rates with minimal upside.

[+] ohyes|14 years ago|reply
How is that different from any other job application? The job will clearly go to the boss's nephew, who is totally great at computer stuff because he plays so much WoW. (Though it is a well known fact that the best middle management comes from Eve Online).

Seriously, you will generally want to look for people who are intelligent and good at abstract thinking. In my biased opinion, you should be shooting for the best humanities majors (in disciplines like English, music, philosophy and math) as the targets of such an apprenticeship, not high-school graduates.

Until it catches on, your competition for employing them will be Starbucks and grad school which means they will be a steal compared to programmers trained at Stanford or MIT (TM). A liberal arts major would be pleased to be making 30-40k a year out of school. Also, you can always fire them if they don't work out.

If it then catches on, you might be able to expand to people who are interested in making programming a career, then you might be able to catch some of the

Currently, CS courses at top universities have very little to do with the business of actually programming; they are more based on mathematical theory. SE courses tend to look more like "software engineering management."

In each discipline, you learn important skills... but they tend to have very little to do with the craft of sitting down at a computer and making your thoughts into working code. (I have a Master of Science in computer science; I learned many things that made me a better coder, but the degree didn't teach me to code - work experience did.)

[+] hamidpalo|14 years ago|reply
Degrees are usually a signal of how well the person is expected to perform. A degree from a top school with a decent GPA means that not only was this person selected from among many to attend, but they also managed to get through it okay. That is why employers ask for a degree in anything for certain jobs. They may not care about art history, but a degree is a good signal that "hey, this person is smart."

For CS it's a little more practical but a degree is still fundamentally a signal. As is an active github account, blog, etc...

[+] sharmajai|14 years ago|reply
We have to stop pretending degrees are worthless. The way I see it, going to school expedites your learning process by exposing you to professors and your peers; it helps you learn best practices, which, although generic, save you a lot of time making the same mistakes others (your professors and your peers) have made. It also gives you the focus and urgency to finish your learning on time.

As an analogy, consider the knowledge accumulated while attending school to be like open source software: even if your knowledge, or the OSS, is too generic to be fully usable for the task at hand, it almost always gives you a big head start, because years of maturity have steered it around the trivial and non-trivial pitfalls.

[+] imperialWicket|14 years ago|reply
It is true that degrees are not inherently worthless. They can indicate a lot of prowess in an individual. The problem is that the process of attaining a degree has become too standardized and universal. Now, it is easier for someone to game the system, thereby acquiring a degree without any of the intended benefits of the process.

The meritocracy and strict requirements of more individualized approaches in open source is a good alternative (see my other replies).

Bearing these concepts in mind, I agree that it is important to maintain that a degree is not a black mark on a CV. It remains just as important to remember that a degree is also not the shining star on a CV that it once was.

[+] mattdeboard|14 years ago|reply
You're conflating the degree and the relevant knowledge gained in pursuit of the degree. They are not one and the same, and I believe that is the point most people are making. At least, that's what I believe.

The skepticism directed at college education is the result of a cost/benefit analysis. I would gladly take a 2-year associate's program that yielded a degree in computer science, much like nurses have a 2-year nursing degree. The four-year degree I'm pursuing now (in my 30s)? I'll likely drop out after getting the discrete math, algorithms, and other math-intensive courses I likely would not study on my own.

[+] eftpotrm|14 years ago|reply
Agree absolutely.

I'd long ago decided that even with the £3,000 per year fees (which Labour said they wouldn't introduce in their election manifesto, won an overall majority, then introduced anyway - remember that, Aaron Porter and others slamming the Lib Dems at the moment), the investment in a degree before a career in software just didn't seem to add up. Realistically, it's a £25k debt against a 3-year delay in starting a career - you may well earn less for the first few years, but by the time you've paid off that £25k, is the degree really going to be a differentiating factor?

With now £27k just on tuition, plus living expenses for three years, what's the point? Honestly, I learnt more that I use professionally at A level than in my degree, let alone what I've learnt professionally. Sad to say this but I would actively recommend against an 18 year old with an interest in working in IT studying at university, the way things are at the moment.

[+] nickthedart|14 years ago|reply
Agree with this. With £27k debt for tuition alone, plus other debts for living costs, in a field that changes so fast as Computer Science, you'd be paying off this debt long after much of what you'd have learned would be obsolete. Far better to go straight into work, and study part-time (if employers will take you without a degree that is, which they might if they realise that 18-yr olds can be smart and cheap). It'll be interesting to see what 18-yr olds do in response to these fees. Sadly I fear many may not be clued up at that age, and will study Computer Science then regret later.
[+] akmiller|14 years ago|reply
I think it says something about the kinds of employers that look at people who have spent upwards of 100k to obtain a piece of paper that implies some arbitrary level of understanding in a given subject. The same level of understanding that I could get on my own for probably a few hundred bucks.

Don't get me wrong, I'm not against higher education in theory. We've gotten to a point, though, where it's not about the education anymore; it's about the diploma. We enforce this idea in our high schools that you can't be successful without one and as such send our kids in droves to colleges. We've artificially made the demand so high that these institutions can charge whatever they want, and the kids are still going to attend and put themselves (or their parents) further and further into debt.

[+] andylei|14 years ago|reply
> says something about the kinds of employers that look at people who have spent upwards of 100k to obtain a piece of paper

Great, you're an employer. For fun, let's say you're Google, except you just ignore the "education" line on resumes.

> The same level of understanding that I could get on my own for probably a few hundred bucks

Oops: now you have 10k resumes from people claiming they have knowledge of computer science, but you only need to hire 10 people. What do you do?

[+] timtadh|14 years ago|reply
> Let's face it, the best experts aren't the ones who knows all the answers, but the ones who know where to look for the answers.

In many ways I feel like this is exactly what a degree in Computer Science or a related field gives you. Becoming an educated member of society isn't about learning the "correct" answers it is about learning to ask the right questions. It is about learning some questions outlive their answers. Universities enable students to glimpse the horizon of human understanding. A glimpse of the infinite unknown.

I understand people feel burned by: their experiences at university, hiring practices of corporations, poorly performing well credentialed hires, and the cost of education in general. However, let's not toss the baby out with the bath water here.

@brudgers said it best:

> Modern higher education has grown because it offers such a powerful solution to many of the problems created by apprenticeship particularly lack of equal opportunity, exploitation of apprentices, diversion of resources to training and away from profit making activities, and long term commitments to particular individuals who may not be suited for the profession.[1]

[1] http://news.ycombinator.com/item?id=2899059

[+] imperialWicket|14 years ago|reply
I could not agree more, and although this applies to a broad range of industries it is extremely pervasive in software development (Related, I have heard a few computer science folks highlight that computer science as a major is not targeted at creating software developers - http://news.ycombinator.com/item?id=1884255). True enough.

I stand by my sentiment on that post, that open source projects are a great 'apprenticeship' opportunity for those interested in computer science-like fields (software/web development and the like). That said, I have participated and watched at http://opensource.com and http://teachingopensource.com and come to realize just how difficult it is to get opensource into the educational system.

Knowing that difficulty, I might categorize experiences in the following best to worst order for new hires:

1. Active open source contributor

2. Active open source contributor w/ non-CS degree

3. Active open source contributor w/ CS degree

Two additional notes:

1. These are not meant to be absolutes; there are certainly individuals who fall into the above-mentioned category 3 that far surpass a category 1 candidate in a particular skill. I am merely suggesting that, at a high level, the likely skill-set available to a category 1 candidate is often more desirable than the likely skill-set of a category 2 or 3 candidate. A lot more could be said here, but it is not the point of this post.

2. While this most obviously applies to software development, it also has a natural home among technical document authors, marketing, customer relations, QA, and many other aspects of business that exist and flourish in open source communities.

[+] delinquentme|14 years ago|reply
I've run into this exact issue.

However, when you've got managing parties who have degrees, they seek out justification for their expenditure through self-serving activities, consciously or subconsciously.

Additionally when grad students are willing to work, and have been trained in the shadow of those who are managing, all the gears in the system work as expected.

The issue comes into play when you've got kids like myself. We are ITCHING to learn, but fail to see the benefit in 4 years of formal learning. And most of all:

We know the fastest way to the bleeding edge of research is NOT through the "tried-and-true" channels. It is most efficiently attained through jumping right into the fray and working with said researchers.

THIS is what will create that innovation that we seek. The degree system simply is the easy answer and a way to continue with the entrenched system, by providing slave labor.

[+] Jach|14 years ago|reply
I'm skeptical that this is going to happen on its own any time soon. While the legal frameworks are in place to prevent discrimination on sex/race/age/etc., I think we should put similar mechanisms in place for formal education. Especially when so many degrees are worthless as a measure of skill, so they've become irrelevant to the job at hand just like a person's race.

Make it so that employers can't ask for education, just like they can't ask for age, nor make a degree a job requirement. Of course, when an applicant comes in for an interview, their race and relative age quickly become apparent, so it's not so much a matter of information hiding as removing a more-and-more irrelevant filter. There's also nothing stopping an applicant from explicitly exposing their age/education/etc. on a resume or during the interview, and I'd still want to mention an MIT education if I had one. At the very least, you would want to talk about school projects since you may not have any other experience, but that's up to the applicant. The question is "What things have you made? How did you do it?", not "Did you take a data structures course at an accredited university?" I'm not even sure it would create that much extra burden on HR departments, since I hear they're already swamped with applicants matching degree requirements.

On the other hand, a free-market approach may be to just leave it alone and let the tech companies that require CS degrees, or black people only, lose out to the companies that care about skill alone. I'm pretty okay with that too as a practical outcome. The question there becomes really philosophical: whether you want a big government to slim down in an inconsequential way or to continue its historical path of trying to enforce certain moral directives on supposedly less enlightened people.

Downvoter(s): would appreciate a discussion on which idea(s) is/are most offensive to you. There's the additional filter of "this person made it through a 4 year program and may therefore be determined/have long-term goals/etc.", but really I don't find that a very compelling or useful filter for many jobs.

[+] aidenn0|14 years ago|reply
The problem with this is that universities get away with screening processes that would be problematic at a company. For example, screening applicants based on SAT scores is a legal gray area for a lot of companies, but the college you went to is a proxy for your SAT score.

I'm convinced that more than half of the value of hiring someone from a top-tier college is who they admit, rather than what they teach.

Frightening true story: someone I know at a government contractor startup was hiring a Fortran programmer. As part of the interview he was giving a simple written Fortran test. The company lawyers found out and had him stop. Apparently tests that haven't been vetted for cultural/racial biases are a potential source of liability for government contractors.

[+] crs|14 years ago|reply
So very true. At my company (a very large defense contractor) we have to ask every applicant the same questions. We can't ask follow-up questions or deviate from the pre-defined question list; it was deemed legally unfair to ask different applicants different questions. That makes it harder and harder to distinguish the good from the bad.
[+] shmageggy|14 years ago|reply
I suspect there are countless CS graduates who can describe Binary Search theoretically but couldn't hand-roll a binary search implementation to save their lives

If this is the case (and I doubt it often is), then your CS program has failed you miserably.

[+] timtadh|14 years ago|reply
I believe Knuth says in his section on binary search in The Art of Computer Programming that it took some sixteen years from when binary search was first described (1946) until a bug-free implementation for arrays of arbitrary size was published (1962).

I find it highly likely that the majority of programmers would produce a buggy version of binary search on their first attempt. Why? History indicates that programmers often make small mistakes even when writing simple algorithms. A survey of 26 papers on variations of binary search found that 6 of the papers had serious errors in the published algorithms.[1] Four of the errors were "design" errors; that is, the algorithm as designed had a fault. Two were implementation errors (one in assembly, one in COBOL). All of the errors appeared in peer-reviewed publications, so even peer review does not always spot errors in a "simple" binary search algorithm. Why would you expect recent graduates to do any better?

[1] http://comjnl.oxfordjournals.org/content/26/2/154.abstract
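To make the point above concrete, here is a minimal iterative binary search sketch (my own illustration, not taken from the survey or from Knuth). The midpoint computation is where one of the classic bugs hides: in languages with fixed-width integers, the naive `(lo + hi) / 2` can overflow for very large arrays.

```python
def binary_search(a, key):
    """Return the index of key in sorted sequence a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:               # invariant: key, if present, lies in a[lo..hi]
        mid = lo + (hi - lo) // 2  # overflow-safe form of (lo + hi) // 2;
                                   # the naive form overflows in fixed-width-int
                                   # languages (Python ints are unbounded)
        if a[mid] < key:
            lo = mid + 1
        elif a[mid] > key:
            hi = mid - 1
        else:
            return mid
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # → 3
print(binary_search([1, 3, 5, 7, 9], 4))  # → -1
```

Even this short loop has several places to go subtly wrong (`<=` vs `<` in the loop condition, `mid + 1` vs `mid` when narrowing), which is exactly the kind of off-by-one fault the surveyed papers tripped over.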

[+] CptMauli|14 years ago|reply
In Germany you have been able to become a "Fachinformatiker" (for some reason translated as "specialist" by Google Translate) through an apprenticeship for quite some time (there were predecessors to it under different names).

see http://translate.google.de/translate?hl=de&sl=de&tl=...

or, in German: http://de.wikipedia.org/wiki/Fachinformatiker

so it is nothing new at all.

[+] eiji|14 years ago|reply
As a fellow German, but not a Fachinformatiker, I suspect colleagues without a degree still have a hard time working their way up into more advanced management roles, or even just into the higher income brackets. I wish that were not the case, though.

What I find intriguing: although I'm surrounded by "programmers", there are almost no CS graduates! We have physicists, mechanical engineers, mathematicians, historians, and even English majors.

[+] brackin|14 years ago|reply
Most (or at least a lot of) the startups I've seen look at portfolio and experience more than whether you have a degree, anyway. I'd rather hire an 18-year-old who's got lots of great stuff on GitHub, internships, and other experience than someone who's just finished a Computer Science degree from a random uni with no experience.

Almost all of those I know going to university are just planning to do their course and apply for jobs afterwards, expecting their degree to mean an instant job.