item 9758625

Business Can Pay to Train Its Own Work Force

165 points | petethomas | 10 years ago | chronicle.com

145 comments

learc83 | 10 years ago
The worst class I had in college was Software Engineering. It was the university's attempt to prepare us for the work force, and it was taught by an adjunct who had plenty of industry experience, but it was already 10 years out of date.

Industry processes are mostly fads that change with the wind. CS fundamentals, however, are much more stable. Twenty years from now, knowledge of automata, graph theory, and complexity analysis will still have immense value; a scrum master certification won't.

manyxcxi | 10 years ago
I also had a Software Engineering class (actually two of them) focused on how to build software in real life. This was in '03, and we covered things like the waterfall methodology, requirements gathering, functional specs, etc. If taught the exact same way today it would be woefully out of date, but the time we spent on requirements gathering, where the teacher or TA pretended to be a product owner and deliberately gave really crappy answers so that we had to extract useful information bit by bit, was one of the best pieces of prep I ever received.

All in all, it was boring and tedious, but it certainly wasn't the worst class I ever had in terms of preparing me for a career in technology. I use those lessons to some degree all the time; I rarely directly use all the work I had to do to create my own OS...

mathattack | 10 years ago
The article isn't complaining about CS majors not getting jobs, it's about soft liberal arts majors not getting jobs.

When companies hired people for 30 year careers, they could afford to invest a tremendous amount in training. When they hire people for 2-3 years, they have less time to amortize the costs. And it's up to the employee to convince the companies that they can learn quickly, and on their own if need be.

Any CS major with decent grades and a positive attitude can learn anything in almost any job (certainly CS, consulting, finance, marketing, even some kinds of sales). I can't say the same for liberal arts majors. There are great ones out there, but also a lot of folks who goofed off for 4 years and didn't learn anything.

samirmenon | 10 years ago
You could take that one step further. CS, in its modern incarnation, has only been around for ~90 years. CS as it is taught at the undergraduate level has changed dramatically in recent years, and will continue to change in the future.

Math, on the other hand, has been around for thousands of years, is relatively stable, and unlikely to become obsolete in the way a scrum certification, or even a machine learning algorithm, will.

Of course, 'CS fundamentals' usually end up at a very close intersection with math. I'm just suggesting that mathematics has an even deeper level of the 'stability' you referenced.

genericuser | 10 years ago
My software engineering class spent most of the semester going over design patterns, which are in fact quite useful to learn in school, and maybe a quarter of the class going over the various development methodologies. I agree that a class devoted entirely to methodology would be complete overkill. However, I think there is room for some exposure to it in school, ideally before taking higher-level classes, where knowing existing ways to structure your group work will be beneficial.
brixon | 10 years ago
If a college education is too specific to one company, then you are locking yourself into that company, and you will lose any leverage over salary, or even the ability to leave.
bigger_cheese | 10 years ago
Oh god, yes, I went through the same pain. Our university decided that every engineering student had to take two CS courses during the first year, regardless of whether you were studying Civil Engineering, Electrical, Mechanical, etc. This was alongside all the other first-year courses we were required to take, such as Chemistry, Physics, Calculus, Linear Algebra, Statics, Dynamics, and all that.

First semester we took Intro to Programming and Algorithms (CS1100 or something like that). We learnt everything from binary notation and logic gates through to floating point. I went from not knowing any programming language to having a fair idea of how topics like recursion worked; our last project was to write some code to walk a tree using recursion. I also learnt how to use Unix, and received my lifelong love of Emacs (which we used in our tutorials) from this course.
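A tree walk like that first-semester project can be sketched in a few lines. This is purely illustrative (the course's actual language and code aren't stated); Python is used here, with a hypothetical `Node` class:

```python
# Illustrative sketch of a first-year "walk a tree using recursion" project.
# Node and walk are hypothetical names, not from the course described above.

class Node:
    def __init__(self, value, children=None):
        self.value = value
        self.children = children or []

def walk(node, visit):
    """Visit a node, then recurse into each of its children (pre-order)."""
    visit(node.value)
    for child in node.children:
        walk(child, visit)

# A small tree:  1 -> (2 -> 4, 3)
tree = Node(1, [Node(2, [Node(4)]), Node(3)])
values = []
walk(tree, values.append)
print(values)
```

The whole exercise fits in a dozen lines, which is part of why an intro course built around projects like this feels so much more rewarding than one built around process essays.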

Second semester we had to take Software Engineering (CS 1110, I think). It covered the waterfall model, version control, specifications, unit tests, and more esoteric stuff like loop invariants and formal correctness. Our major project was to write an essay about the Ariane 5 rocket explosion. I really enjoyed the CS 1100 class, but CS 1110 effectively succeeded in boring me to tears. It was somehow supposed to give us a taste of real-world software design. All it really did was discourage me and many others from taking CS electives in later years.

My major was Materials Engineering, and I took mostly physics electives in the last year of my degree, which is kind of ironic now because I work pretty heavily with software: my day job involves writing computer simulations. I learned most of the skills I need on the job (including the C programming language, databases/SQL, etc.). I probably would have benefited from more formal training, but my experience of university CS was so miserable that I actively avoided it.

kartickvad | 10 years ago
I think it's a fallacy to assume that anything not based in CS fundamentals is a fad or has only short-term value.

In college, I didn't learn version control, continuous integration (continuously submitting your work in small changelists or patches), unit testing, making sure you're building the right product before building it, delivering the simplest possible code and design that meets the requirements, code quality, working in teams, untangling dependencies and making as much progress as possible today without waiting for all your dependencies to be resolved, and so on.

I expect that all these skills will be very much relevant 20 years from now. So, don't confuse long-term value with "grounded in CS fundamentals". Programming isn't a hard science like physics.

bduerst | 10 years ago
I guess it depends on the purpose of the education. If you're already getting the CS fundamentals, maybe it doesn't hurt to get up to speed on some of the current industry fads, since most of the students could be looking for an industry job in two to three years.
yummyfajitas | 10 years ago
For the most part, bonding agreements ("you can't leave for X years without repaying us for your training") are considered exploitative and usually not legally enforceable.

As a result, a business can't pay to train its own work force: if a business invests $20k in training and $80k in salary, there is nothing stopping another employer from offering $90k in salary after the training is complete.

If an investment can't be protected, it's pointless to make it. Having employees pay for (and be compensated for) their own training is the most reasonable workaround.

anigbrowl | 10 years ago
For a mathematician you sure like fallacious arguments.

> As a result, a business can't pay to train its own work force

Well, they could offer more pay after the training is completed, or focus on being a nice place to work, such that employees would choose to stay with a company that treated them well. Your argument rests on the implicit assumption that employees will leave the moment they receive a competitive offer and are only interested in maximizing the take-home-pay aspect of their economic advantage. Employees are motivated by a combination of monetary compensation, benefits, and goodwill towards a firm, in the same way that a firm is valued by both its book assets and its goodwill in the marketplace. You assume, without foundation, that employee loyalty towards an employer that provides training will be nil, and you ignore the possibility that employees might look forward to receiving future training (and promotions) with significant economic value, whereas a firm that poaches employees and offers no training of its own presumably won't be offering any in the future either.

> If an investment can't be protected it's pointless to make it. Having employees pay for (and be compensated for) their own training is the most reasonable workaround.

You could equally argue that it's pointless for employees to run up debt buying an expensive education which an employer might then say doesn't quite meet their requirements and therefore shouldn't be rewarded with additional compensation. Considering the non-dischargeability of student loans in bankruptcy, and the financial weakness of employees relative to employers when it comes to negotiating prices for training/education, I would say it's rather irrational for employees to take on all the financial risk involved.

mrbabbage | 10 years ago
https://en.m.wikipedia.org/wiki/Golden_handcuffs

How would training followed by a bonding period be treated differently from other mechanisms that induce employees to stay? Could those other mechanisms also be legally questionable?

E.g. some San Francisco Bay Area technology companies offer large (~$20k) signing bonuses to new uni graduate hires that the employee must return if she leaves within her first year at the company. Similarly, companies offer five-year equity packages that deliver no equity until the twelfth month.

mrxd | 10 years ago
And yet it was the norm only a few decades ago. What's new is the emphasis on speed and flexibility. If you're starting a new initiative, it's better to hire skilled people now than to start a training program and get them 2 years from now. And if the new initiative fails, laying off people you've trained is throwing away the money you spent on their training.
brixon | 10 years ago
Companies have to do stuff like this since employee loyalty was lost when companies started making layoffs commonplace.

Training from a company says that they have a vested interest in the person, but they don't. Companies don't care about you, they only care about the bottom line and do not have any interest in spending effort to teach you something you could use somewhere else.

VLM | 10 years ago
"if a business invests $20k in training and $80k in salary, there is nothing stopping another employer from offering $90k in salary after training is complete"

Doesn't this assume a lifestyle from several generations ago where you get "the training" then you're set for the rest of your working life? I don't think the real world has worked like that since at least the 70s.

If I took that $90K job I'd be OK while I'm there... that's what, on average, just a couple of years? Then I'd be dead in the water and have some explaining to do at the interview for the next $80K job that provides the entry-level training I'd need. Why would any employer hire an untrained employee when there's a perfectly well-trained employee from the $80K job? Meanwhile, the guy who stayed at the $80K job likely got a promotion to $100K, and here I am trying to get in at the ground floor yet again.

How this fits in with stereotypical OTJ training is a mystery. Most formal training is a way to shovel money to middlemen who provide it in an accelerated format for an extremely large fee. Most real-world training is "here's a PC with a web browser, and your boss's boss's boss declared the due date for the project is next month; now figure it out yourself".

Think of the last time you worked on a project with a junk spec that had no relationship with reality. The people who can't spec a fizzbuzz if their lives depended on it are the ones specifying the required training. I wonder if that training spec will be worth the paper it's printed on?
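For context, the fizzbuzz mentioned above is about the smallest programming task there is, which is exactly why failing to spec it is damning. A conventional version (one common phrasing of the exercise; details vary) looks like:

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings:
    multiples of 3 become "Fizz", of 5 "Buzz", of both "FizzBuzz"."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```

A spec that can't pin down even these three rules unambiguously is unlikely to pin down a training curriculum.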

thirdtruck | 10 years ago
> if a business invests $20k in training and $80k in salary, there is nothing stopping another employer from offering $90k in salary after training is complete.

And nothing stops the first business from raising the employee's pay to $91k. Why should they expect to keep paying the same salary to someone they've made more valuable? They already amortized the costs of training over the value added by the employee in the first year, right?

mindslight | 10 years ago
Your argument rests on an assumption that an employee produces nothing of value while learning. If they are performing useful work while learning, then the salary paid to them isn't being wasted.

How true this is depends on the field and subject matter, but any entity will tend to see its own specifics as basic things anyone should already know, even though the field is actually much larger.

emodendroket | 10 years ago
If you don't make the worker want to take off, then it doesn't have to come to that.
s73v3r | 10 years ago
That's not true at all. Business doesn't want to have to actually compete for its employees; that's it.
kartickvad | 10 years ago
I think this logic breaks down when you run out of people who already have the skills du jour.

If you can hire people who already have the specific skills you need, sure, let others pay for the cost of acquiring those skills. But when you run out of those people, the choice becomes paying to train them or suffering the cost of leaving the work they'd do undone. If the value of the work they perform is more than the cost of training, it's logical to train them.

In this example, the market rate for someone already trained in a certain set of skills is $90k, and the cost of training is $20k. Nothing prevents a company from offering people who are smart and capable, but not trained in whatever the company happens to need today, $70k, with a promise (in the employment agreement, not oral) to offer $90k one year from now.
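Using the thread's hypothetical figures, the arithmetic behind this offer checks out:

```python
# Hypothetical figures from the thread: market rate for an already-trained
# hire, the cost of training an untrained one, and a discounted
# first-year salary for the untrained hire.
market_rate = 90_000
training_cost = 20_000
first_year_salary = 70_000

# First-year cost of hiring someone already trained at market rate:
hire_trained = market_rate

# First-year cost of hiring untrained at a discount and paying to train:
hire_and_train = first_year_salary + training_cost

print(hire_trained, hire_and_train)
```

The salary discount exactly covers the training cost in year one, so the company is indifferent on cost and the candidate is trading one year of pay for a marketable skill.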

Everyone benefits from this arrangement: the candidate, who now has a better job than they otherwise would have (which is presumably why they're taking the offer in the first place), and the company, which gets a capable employee without spending more on training.

So, the bottom line is that you can always structure your incentives in a way that makes the training worthwhile for all parties involved, without asking the employee to pay for it.

kenko | 10 years ago
"As a result, a business can't pay to train its own work force."

I believe that in Germany, consortiums of businesses in the same industry subsidize training. That seems reasonable.

In any case, your observation would seem to be refuted by the fact that businesses in general used to do precisely that, and many businesses still do.

"Having employees pay for (and be compensated for) their own training is the most reasonable workaround."

For the business, maybe.

sheepmullet | 10 years ago
"As a result, a business can't pay to train its own work force - if a business invests $20k in training and $80k in salary, there is nothing stopping another employer from offering $90k in salary after training is complete."

Except for the massive information asymmetries in the market.

Even ignoring the information asymmetry, hiring a new employee is expensive. You can easily end up paying $40-50k to hire a good developer (as a combination of the recruitment process and ramp-up time).

So really, company B is paying salary + $40-50k in recruitment costs, which means company A can easily match company B and provide training as well.

switch007 | 10 years ago
> For the most part, bonding agreements ("you can't leave for X years without repaying us for your training") are considered exploitative and usually not legally enforceable.

Extremely common in the UK :(

DoggettCK | 10 years ago
I remember some motivational poster from my old tech recruiter's FB feed that seems appropriate here:

"What if we invest in training and our employees leave?"

"What if we don't, and they stay?"

wisty | 10 years ago
There's no reason why they would pay $80k in salary if they're not getting enough expected value.

As long as a new programmer is making more than minimum wage, there's always a point at which companies can employ them for less, and hope their value (plus the expected value from the ones who stay on at a higher wage) will be more than the cost of training.

Also, training is often domain specific. Employees are generally worth more to their employer than a competitor (though there are exceptions).

analog31 | 10 years ago
> As a result, a business can't pay to train its own work force - if a business invests $20k in training and $80k in salary, there is nothing stopping another employer from offering $90k in salary after training is complete.

If the training has increased her market salary, then the simple solution is to raise her salary to meet the market.

jackmaney | 10 years ago
> if a business invests $20k in training and $80k in salary, there is nothing stopping another employer from offering $90k in salary after training is complete.

Then that business will have to pay better.

You want me to stick around? Fuck you, pay me[1] (or give me some other reason that is compelling--to me, not to you--to stay).

[1]: https://www.youtube.com/watch?v=jVkLVRt6c1U

zrail | 10 years ago
The whole for-profit code school thing has been giving me the creeps since I started hearing about it years ago, precisely because it's the potential employees paying for hyper-specific training. I wonder if there would be fewer outcries about a talent shortage if companies were somehow incentivized to hire these more junior people with the explicit goal of training them up to a productive level.
skylark | 10 years ago
It's a tough situation: it's hard to identify people who have no background in computer science but who still have tons of potential if given the opportunity.

Coding bootcamps exist to bridge that gap - in a sense, you can think of them as recruitment agencies and not as learning institutions. The goal isn't to replace a 4 year degree with 3 months of intense learning. That's simply not possible. The goal is to find smart, motivated people who can learn quickly, give them a skill which will let them hit the ground running, and present them to companies.

Companies benefit because they get a low cost hire who has the potential to grow tremendously if given the right environment. Employees benefit because they can typically enter the workforce at a higher salary than they could command from self-study alone. Win-win all around.

The key takeaway is that for-profit code schools are not schools. Code schools are primarily recruitment agencies designed to find high-quality, non-traditional talent. In that regard, I think the best ones are an absolute success.

linkregister | 10 years ago
My opinion is divided about the "pay-to-play" model of workers paying for their own training. On one hand, they assume all the risk and financial hardship of training. On the other hand, compared to other developed economies (British, Australian), they have more opportunities to change careers because the risk of training has already been assumed by the worker. I've found that American hiring is far more flexible than the Commonwealth tradition of jockeying for an apprenticeship. And older workers have far less opportunity for apprenticeships.

It's hard to defend the code schools, though. We hear a lot of horror stories about them on HN, from both the students' and the hiring managers' perspectives.

pjungwir | 10 years ago
Today's code schools may be new, but not code schools in general. Riding on the T in Boston 2000-2005 I saw plenty of advertisements for courses in Java, XML, etc. I always assumed those were low-quality classes for low-quality programmers. Nowadays the code schools teach Node.js and TDD, and their branding is more hip. I hope the quality has increased (of both instruction and students), but I still see it as a continuation of the training classes that have always been out there. Not the same people (I think), but the same market need.
tertius | 10 years ago
This should be less risky for large corps like MS, IBM etc. I believe this may already be done?
walterbell | 10 years ago
The Uber robotics talent raid on CMU took piratization to a logical extreme: http://www.theverge.com/transportation/2015/5/19/8622831/ube...

"They took all the guys that were working on vehicle autonomy — basically whole groups, whole teams of developers, commercialization specialists, all the guys that find grants and who were bringing the intellectual property," recalls a person who was there during the departures. ... Uber snatched up about 50 people from Carnegie Mellon, including many from its highest ranks.

... the deal includes a "transition period" that keeps some of the departed staffers around ... "The work of these employees is very incestuous and loose," says the same NREC insider. "They are given free rein of the facilities as part-time CMU employees, but there are absolutely no checks on the work that they are doing or what [intellectual property] they are taking. Is it for CMU? Is it for Uber? None of us here know."

Edit: could CMU have gotten better IP licensing terms and ROI for the university if they had spun out the entire team (with private financing) and held an open auction of RoboticsResearchCo to the many companies investing in this field?

cgearhart | 10 years ago
I think this is a better example of dysfunction in academia than of retaining or training a workforce. In particular, previous articles suggest that most of these folks saw their salaries doubled, with six-figure incentive checks to lure them away. From what I hear, it's not so much that the new salaries are unusually high for industry researchers and engineers as that the old salaries were unusually low (anywhere except academia).

CMU could probably have gotten a better return if they hadn't made it a point to underpay them so much relative to their market value.

wavefunction | 10 years ago
Perhaps this can serve as a cautionary tale to the public about the perils of private-public partnering. Especially when the "partnership" is so slanted in favor of the "private" side of things that such a situation can occur.
meatysnapper | 10 years ago
Football coaches are often the highest-paid employees at big schools. If I were a world-class researcher, I'd be pissed as hell about that.

That said, Uber has pretty much shown that they can't be trusted (if anybody still trusts them at all).

nateabele | 10 years ago
> The trick is to relabel it as education, then complain that your prospective employees aren’t getting the right kind.

Well, I guess if already-educated workers are the norm in your industry, companies are going to change their hiring practices accordingly. It's okay not to like it, but arguing against it on the basis that 'it didn't used to be like this' just makes you look entitled and whiny.

> Bemoaning the unpreparedness of undergraduates isn’t new. Today, however, those complaints are getting a more sympathetic hearing from the policy makers who govern public higher education.

Yeah, well, when policy makers are responsible for all the cheap money flowing into the system (without which the system would probably collapse at this point), I guess that'll happen.

cgearhart | 10 years ago
> ...if already-educated workers are the norm...

That's the point -- there's a difference between education and training. It is only an issue because so much of the education looks very similar to the desired training.

It is not the purpose of an education to produce new employees, but to provide a broad basis for experiencing and making sense of the world. It just so happens that you also learn transferable skills: a basic familiarity with a particular field, along with some tools and techniques that help you organize and solve problems. A proper education is not an extended code bootcamp, nor should it be.

paulpauper | 10 years ago
One possible solution is cognitive screening: the use of tests such as the Wonderlic, SAT/ACT, or Wechsler to find prospective employees who can learn quickly and have good critical thinking skills (and who would therefore benefit the most from on-site training for technical tasks, which obviously costs money). Unfortunately, something called 'disparate impact' makes this difficult to implement, so employers instead have to let colleges do the screening, turning an advanced degree into a very overpriced, time-consuming 'IQ test'. Some people are more concerned about hurt feelings than about providing equal opportunities. The 'logic' is that if the tests expose a reality that isn't politically correct, we must do away with the test, so the result is more student loan debt, a worse labor market, and more credentialism.
dragonwriter | 10 years ago
> The 'logic' is if the tests expose a reality that isn't politically correct, we must do away with the test

Untrue. The logic is this: if a test has a disparate impact against a legally protected class, and you allow discrimination using the test without any demonstration of relevance, it becomes an easy, obvious, and effective tool for those looking for cover for discrimination on an illegal basis. To avoid that, you simply require those who wish to use the test as a basis for discrimination to actually demonstrate that the test is meaningful to the job, and that it is applied as a discrimination factor in a manner consistent with the way it is meaningful to the job.

If they've actually done the kind of analysis that would let them know the test really is useful, this is trivial. It does, however, prevent adopting a test with a discriminatory effect against a protected class merely on the basis of intuition or conventional wisdom.

The type of analysis involved may be costly, but if those wishing to use such tests really have high confidence that they would be valuable for their business, that analysis would be worthwhile to pay for. The reason it is difficult is that none of the people who like to talk about how useful these things would be, when talk is cheap, want to put their money where their mouths are.

theodorewiles | 10 years ago
As a lib arts graduate, I might be biased, but I believe that it's still the best education for the type of decision-making that's most useful in real-world business decisions: ambiguous and incomplete information from a variety of sources with competing interests.

That being said, if you're not in a leadership position that kind of decision making isn't what you're doing: you're most likely just optimizing on your own little anthill, so a "profit-centered" education like the article is against might make sense for the worker bees.

Another thing that I think is usually missing from the "train your workers" debate is how much variance there is in productivity, and how actual productivity is usually unknowable unless you have 2-3 months of project data for a given worker (these are assumptions). So hire unskilled contract workers, fire 80% of them, and then train the remaining 20%. They won't have the credentials to work elsewhere, and you've been able to identify the true all-stars using data on their actual work product.

laurentoget | 10 years ago
Expecting schools to do all the training your employees will ever need would not be a good bet from a business point of view either. If your employees have the exact same skills as the competition's employees, how do you expect the business to differentiate itself?
wwweston | 10 years ago
It's true that business should pay to train its own work force, but I'm not convinced it can.

Businesses that would like to but that operate in a sector where a competitor can successfully externalize that cost will be at a competitive disadvantage.

And businesses that subscribe to managerialism -- the idea that it's primarily management/leadership skills that differentiate a business, rather than domain knowledge -- may not know how to train employees at all, as it takes someone with domain knowledge to know how that can be done...

nitwit005 | 10 years ago
Companies are often eating the cost of training employees without admitting to it.

Pick some company at random, look around, and there will often be piles of people who have no official training or certifications to speak of, and no relevant degree, but who are perfectly competent at their jobs. Somehow, magic happened and they got trained, despite there being no training money in the budget.

A lot of it is just done semi-officially. The boss points at some guy he trusts and tells him to fill them in, and lower productivity is accepted for some period.

joshu | 10 years ago
I thought tech internships were more about recruiting than about training.
vinnyc | 10 years ago
Students aren't dealing with the kinds of issues that arise in a real workplace. Many professors don't know what has changed in industry since they left it, so students learn principles that aren't fully applicable to many business jobs today instead of learning how to deal with corporate incompetence. If companies train employees in how to deal with personnel, those employees will pick up more of the technical details on the job, under a hopefully competent supervisor.