I find threads like these funny: one group complains about not enough software engineering, such as not knowing design patterns or the details of a Java implementation. On the other hand, a different group complains about the lack of 'real' CS, which is mostly maths.
Two opposing views, yet each side thinks the argument is in its favour.
There is also a massive backlash against universities in the UK at the moment. I am seeing many 'apprentice' software developers and freelance developers skipping university. They tell me they know more than the average university graduate. So I ask them about complexity? Nope. Predicate logic? Nope. These are things taught in even the lowest polytechnic school and they don't know them. They don't know the bounds of their own ignorance, because they think CS is just knowing a few programming languages and whipping up a few apps and websites. Do they know more languages than the average graduate? Maybe. Can they do software engineering? The easy business stuff. Do they know CS? No. I think they have never been exposed to CS and don't know how deep it goes.
I've seen the accredited software developer apprenticeship curriculum, and it's mostly Java and design patterns.
I think the focus on apprenticeships at the moment is creating a generation of software developers who don't know what NP-Complete means.
Computer science is not whipping up applications with fancy design patterns; that's why you're better than them. Give them a few years and their CS background plus software engineering experience will begin to shine.
I attribute that confusion to the fact that the term "Computer Science" is used as a facade for both Computing Science (a branch of mathematics) and the skills necessary to create and deploy software systems (which most of the time don't even require "engineering" maths).
It's like grouping "Industrial Design" and "Theoretical Physics" under the term "Physics", if "Industrial Design" also included "Product Management".
> I ask them about complexity? Nope. Predicate Logic? Nope. These are things taught in even the lowest polytechnic school and they don't know them.
Depending on how the university works, students can cheat on homework, then barely pass the exam (thanks to easy, formulaic questions that they can cram for the night before), and still get decent marks. The university will compensate with harder homework and an easier exam the next year.
I wonder whether this trend will stay contained to the UK going forward. With MITx et al., this could, in my opinion, be the beginning of an interesting trend. Technology has only begun to affect education.
> An easier way is to water down the educational system to a lower standard and then peg the university income to the number of students accepted while reducing the funding per head. In that way universities are given the happy choice of losing money and enforcing redundancies or watering down their requirements.
This has also become true at many US universities.
You can't water down the requirements and maintain placement stats at the same level. Many companies will simply pull the plug on recruiting and hiring once they have a bad experience with lame recent grads.
The really sad thing: the universities take the students' money, and then leave the kids unemployed at the other end. All that student debt is not dischargeable in bankruptcy.
I don't know how elite US universities were. In the UK, very few people went, and a degree pretty much guaranteed a middle- or upper-class life. One of the reasons we boosted university numbers in the UK was that the USA was sending a lot of people to university. However, it ended the guarantee of a middle- or upper-class life.
Universities were the education path of the elite, but we let a few smart working-class people in via the grammar schools (elite state schools) to keep the majority happy. Grammar schools selected at 11. The other way in was private schools, if you had the money. My old school is now an ex-grammar school because selection was banned for state schools. The people who did not pass selection went on to vocational training at 11.
Note - working class in the UK is roughly the same as lower middle class in the US.
The university system in the UK is still one of the best; we're a small country and have many universities in the top 100. However, it has gone down.
See also this recently published book for more on the problem of publishing in academic journals that nobody reads: "Planned Obsolescence: Publishing, Technology, and the Future of the Academy".
"Here are two ideas Fitzpatrick proposes to kill for good: Peer review is necessary to maintaining the credibility of scholarly research and the institutions that support it; and publishing activity in peer-reviewed journals is the best gauge of a junior professor’s contribution to knowledge in her field."
"Little in graduate school or on the tenure track inculcates helpfulness,” she writes, “and in fact much militates against it."
"But to the extent that individual academics continue in their lust for “power and prestige” by vying for exclusive spots in elite journals, they should not be surprised to find themselves as irrelevant and moribund — indeed, zombie-like — as print monographs have already become, warns Fitzpatrick."
“If we enjoy the privileges that obtain from upholding a closed system of discourse sufficiently that we’re unwilling to subject it to critical scrutiny, we may also need to accept the fact that the mainstream of public intellectual life will continue, in the main, to ignore our work,” she says. “Public funds will, in that case, be put to uses that seem more immediately pressing than our support.”
This aligns with my experience as a British student who received a first class degree from a fairly well-respected program.
The reality was that the teaching was uniformly mediocre, I remained pretty clueless about the subject material, and I produced so-so quality work. I should have worked harder for my own curiosity, but there was a complete lack of external motivation because the academic standards were so low.
To paraphrase someone's quote: I wouldn't recommend a club that had me as a member.
I think one of Groucho Marx's other quotes is probably more applicable to the sentiment of the article; "Those are my principles, and if you don't like them... well, I have others."
I'm in a similar situation. I graduated a few years ago from a good British university and got a first, but I frequently look back and wonder how I managed it.
I distinctly remember arriving and working hard for the first semester. I got 80-something percent in one course and 90+ in the other, and had this awful realisation that I could glide through the first two years (of a four-year program) where only a 50% passing grade was needed.
By the third year the courses did become much harder, but only relatively. Whilst money doesn't seem the right way to restrict entry, I do feel they could have enforced higher entry standards, especially when only ~30% of my fellow students made it to graduation.
I have experienced two degrees in two different universities in the UK. First was a standard undergraduate course finishing in 1992, a joint BSc in Economics and CS. Course content was fine (complexity, algorithms, compilers, database design etc).
However, the student's attitude was poor. As an undergrad straight from school I did not self-learn and did not study - I was generally not motivated. I left in a recession and, with poor CS skills, could not find work, so I delivered pizzas, worked the call centres, manned beach car-park huts - all fun, however.
The second degree was a one-year MSc in Software Engineering - this time focusing on the softer skills around methodologies, design, large-systems modularity, etc.
This time the student was motivated. I read, studied, wrote C++ and Java programs, delivered tasks on time, and used my previous economics knowledge to build a dissertation on neural-network analysis of wholesale electricity prices. I left and went straight into a C/C++ job, helping to build a mobile-phone network planning system.
On both occasions the tools, environment and time were available. Just the student attitude differed. I saw what I wanted and went for it - just not the first time.
Not just in Britain. Grade "scaling", or inflation, is rampant in the US, but it's not nearly as bad as the insane focus on research. The number one job of a professor should be to teach the students of the college.
Research is teaching; teaching the graduate students how to do research.
I agree with your sentiment that college profs should care about teaching, but to be honest too much is asked of them. They're expected to be excellent researchers, inexhaustible grant writers, engaging teachers, inspiring mentors, and part-time entrepreneurs. Those who can achieve four of the five are still impressive, and more often than not something's got to give.
Then none of the people in top universities would be there. Talented researchers go to top-tier universities for their PhD because they want to work on research professionally. They do not wish to teach, they are forced to do it because academia is, for the most part, the only place left where someone can be a professional researcher. Although there are still corporate research labs around, the days of Shannon developing Information Theory at Bell Labs are long gone.
Quite frankly, I'm fine with this setup. If you want to have a good undergraduate education provided to you by the professors, go to a school where they have that as a priority. If you want to get involved in pushing the boundaries of knowledge, then go to a place like MIT or Stanford and get involved in an undergraduate research program.
1) Professors are responding to institutional and student incentives: institutions reward research, so professors prioritize research over teaching.
2) Students are, for the most part, choosing easier degrees; as a result, those degrees are prospering, and many non-easy disciplines are watering down their criteria to attract remaining students.
3) The only real impetus for change that I can see comes from two areas: a) the amorphous group of employers who want better outcomes—but they have very little leverage and b) graduates who are unhappy to discover they have a great deal of debt and few marketable skills. Again, however, they have little leverage.
In the US the tying of funding to results is accomplished by market forces (which also favour, say, campus facilities) whereas in Britain it's administered directly by the government (which also favours "research", that is, publication count).
But since both place a substantial weight on "graduation rate", the same effect is seen vis-à-vis teaching quality.
There are two types of school I have seen in the U.S.: diploma mills and failure mills. I went to a fail-as-many-as-you-can school. Almost every class I walked into was designed for 50-70% to fail. I learned a lot, mainly by studying on my own. The focus on research vs. teaching was just as strong.
Professors will do or become whatever is required of them to achieve tenure. The number one job of a professor is therefore to meet the tenure requirements of the institution - which at most places means "do lots of research". Change the reward system and you can change how things work.
"I've never worked out whether I was, in American terms, an assistant professor or an associate professor."
Generally speaking, British lecturers are equivalent to associate professors, and assistant professors are untenured.
Most post-docs in proper academic departments are assistant professors, but this is not a hard and fast rule; post-docs in tenure-track programs nearly always are assistant professors.
I graduated from one of the top 5 British universities in the mid 90s with a 2:1 degree in History, and I have to say that it was one of the easiest things I ever did. I had class and lecture time of 4-6 hours a week, and spent another 8 hours reading and writing essays. The essay requirement, which contributed almost half of the degree grade, was only about twelve 3,000-word essays a year. I had so much free time that I turned my part-time job into a full-time job, working 35 hours a week.
Of course, I should have worked harder, and I would have learned more if I had, but I was 19 and 20 years old and it was just so easy to get a good degree without working at all. I know I was far from alone in this.
Not every degree course is the same, and no doubt others may have worked long hours to earn theirs, but my own experience has left me with no respect for UK degrees, to the extent that when I read CVs from candidates I consider a third-level degree irrelevant.
The variability in UK university courses is pretty extreme. I have three degrees: a bachelor's in mathematics ("pass"), a bachelor's in accountancy (2:1), and a master's in statistics ("pass"; though I averaged a "merit", I missed out because I was inconsistent), all at different universities. While I couldn't even manage a lowly third for my mathematics undergrad, I worked far harder for it than for the other two, despite the fact that it was the only one I studied for full-time.
Before starting the masters in statistics I took some undergraduate (first/second year) maths/stats courses at the same university (while still studying for the accountancy degree and working 35 hrs/week) and the exams and assignments were probably easier than anything I saw in the first two weeks of my maths degree. Based on my performance on those courses and what I saw of the 3rd year undergrad courses (the masters students shared one with the undergrads, though we had a different exam) I would have cruised to a first without breaking a sweat, a pretty different experience to when I was struggling to pass my degree at all!
It does make it almost impossible to judge anybody's degree and grade unless you are already familiar with the course: the toughest courses are genuinely difficult, but the easiest courses are a walk in the park for anyone of reasonable ability.
Two and a half years ago I made the decision not to go to University, and so far I think it was one of the best decisions I've made. I have learnt a lot more through experience (I'm a freelance iOS developer). I have also learnt about life quicker. After living with some of my friends who went to University, I quickly realised how easy they have it: a few hours of classes a week (depending on the course) and very little studying outside of class. They also have everything paid for them through student loans and grants. On the other hand, I had to work hard and ensure my business succeeded or I wouldn't make rent.
University in the UK has been something that is just 'what people do'. Most people coming out of A-levels wouldn't even consider not going to University, especially because High Schools push it so hard (as it makes them look good). There needs to be more education in High Schools about the option of not attending University.
Learning how to program for iOS is such a small part of software engineering, and probably does not even constitute computer science. Simply put, you don't know what you don't know.
You're right, and I think you made the right choice. I'm beginning to regret my decision to go to university. The intellectual calibre of my colleagues is shocking: a significant proportion can't even write a paragraph properly, and I'm constantly picking up all the slack on group projects.
I'm a final year student on an Information Systems degree, and one of the modules I'm currently taking, "Web Application Development", is nothing more than an introduction to HTML and CSS. In fact we don't even have to produce anything aside from a few snippets. I feel like I've wasted my time and been ripped off; if I'm ever in a position to hire, I would never employ someone based on a degree alone.
This was my experience doing Computer Science at a British University, too.
Most of the modules which I took were so watered down that they were absolutely useless to me. I knew this and I was pretty depressed at the time. I'm not very good at doing mindless work: some of the lectures I just stopped going to and other times I completely ignored the vacuous assignments I was being asked to do.
Looking back I wish I had dropped out and gone straight into a job with the programming skills I was teaching myself. (But I guess if I had done this I might not have learnt about fundamental CS concepts.)
I actually love learning but like to do it in my own way, on my own accord. I'm thinking of taking those online Stanford classes which are starting soon -- I guess the only thing that is missing from these is human conversation. I wonder if one day people might informally meet for coffee to discuss the online courses they're taking together. ;)
This gives me a better insight into why my CS degree is a piece of shit and why I learned little more than I already knew or use on a regular basis.
I'm Canadian not British, but I do relate to everything that was said in this article. I completed my degree in 2004.
This is, in my opinion, entirely the government's fault.
The government decides how much to fund universities based on publication quality, which they rate based on the journals the papers are accepted into.
There is almost no benefit to teaching students better, and there are huge advantages to passing students who would otherwise fail. This is because universities have a strict limit on the number of students they can accept, and these are not replaced if students fail their first year.
So, to maximise income universities have to keep hold of students, while getting as many papers as possible into high quality journals.
I graduated in Computer Science in 2004 from a very well respected British University. I graduated with a first. There is usually a very clearly signposted path to getting a "good degree" without necessarily having to know all that much core computer science.
My course was a four-year course. The total weighting across the years was 10-20-30-40 (years 1-4). The first two years had non-optional, core CS modules (algorithms, logic, discrete maths, etc.) and the final two years had a lot of electives. If you could muddle through the first two years, you could take a series of electives in the final two years (foreign languages, accounting and finance, etc.) that were arguably much easier.
I got mediocre grades in the first two years, but good grades in the final two years, resulting in a first class degree overall.
I regret my choices, but as a lazy undergrad, I took the path of least resistance to achieve my target (a first class degree). I was not the only one who did this. The problem is that people like me made the University look good, so I think they made it very easy to game the system. The only things I worked really hard on were the programming assignments and projects. The exams were easy to pass because they had a very clearly laid-out pattern, and questions tended to be repeated year after year. If you could solve the exam questions from the 3-4 years before your final exam, you would probably ace it.
You know, I'm getting pretty sick of all of these "doom and gloom" stories about the modern higher education system.
Yes, the modern higher education system is not ideal. But what, in life, really is? That's not to say that we shouldn't pursue a better system, but we shouldn't give up on a system just because it's not ideal.
And with all of these doom and gloom stories, I have yet to see anyone offer an alternative. Yeah, there's a lot of paperwork involved in being a professor, and you get evaluated on criteria that don't quite line up with the ideal for being a great professor. But what would be better? How can we create a better system? And if there's such an obvious better answer, why doesn't someone do it?
If there is some obviously better system, I'd love to see it. If such a thing exists, it should be quite competitive with the current higher education system. No one wants to hire incompetent new graduates. No one wants to be one. So we should see something better, something that indicates there is some better way of doing this.
Instead, we see a steady stream of technological progress. I can do things that no one could do before, like carrying a device around that allows me to pinpoint my precise location, stream maps down to me, find directions to wherever I want to go, and read those directions aloud in a synthesized voice, all for the price of 2% of median yearly income (including hardware, software, and the service). And that, of course, is not to mention all of the other things that are available to me.
Now, maybe I'm living in a bubble, built by people who got a proper education before all of this grade inflation and other nonsense. But really, this article is complaining about the last 20 years. A large portion of the people who are doing work in technical fields finished school within the last 20 years. And yet, we're still seeing significant progress; we are still living in a world that is tumbling into the future at a high rate.
So I want to know two things. For one, why are we still progressing so quickly, despite these apparent problems? How are we managing to innovate, if our educational and academic foundation is so unsound?
And for another, what is the solution? What do you propose we do better? If it's so much obviously better, why don't we do it? Or why doesn't someone, somewhere do it, and show significantly better results?
I think one reason is that when you are doing work in the top few percent of human ability, you look around and realize how ordinary it is. Even at the top, everyone has their flaws. No one is perfect. Systems designed to prevent people from cheating also prevent some people from doing amazing work. But overall, it isn't a few geniuses at the level of Mozart that we need; it's a lot of people doing work at a high level, but not what some might consider "genius". If you are immersed in it, it seems somewhat boring, but when it all adds up, it winds up opening new possibilities that were simply not available 10, 20, or 30 years ago.
What's worse is that the article is stuffed full of hyperbolic crap about Maoism and passes over some of the more valid criticisms of the UK education system (the transformation of technical colleges offering effective vocational training for non-academically-inclined people into degree-awarding bodies, which felt the need to adjust their course content to match; the objective of the previous administration to get 50% of school-leavers into university, which inevitably resulted in the reintroduction of fees to pay for it, skewing the entrance pool) in favour of arguments that are dubious at best. In effect, he's blaming the government for flaws that sit squarely on the shoulders of the academics themselves.
If you believe the author, the worst thing that has happened to the education system is the requirement that lecturers actually produce academic output (due to the "envy" of the taxpayer subsidising them). As the author points out, some of this is less than seminal, but frankly I don't find convincing the implicit argument that the same academics would miraculously produce more valuable contributions to the world if not shackled by the requirement to actually justify the money being thrown in their direction. I've read some decidedly mediocre papers written before academics were obliged to get things published on a regular basis.
He also complains about modularisation, because apparently higher education students aren't smart enough to choose their own areas of specialization. Sorry, but if the University of Leeds' course on web design in the 1990s was too easy, it's because the academics running the course were slow to embrace and understand the potential complexities of the web, not the fault of the bleddy gubment. Just be glad they didn't make it a "core" subject that everyone gets high marks on.
I've been hanging around in higher ed for around a decade. I recently interviewed for an assistant prof position. I didn't get it, and the feedback I got as to why was frankly bizarre. The job itself would have been a really nasty one too: kill you with teaching and expect a proper research output as well. Glad I didn't get it. Don't get me wrong: if the university system decides either that they want me to do a nice job without all the performance-management crap they usually force people to do (actually slightly likely in the near term, in my case), or that they need me more than I need them (a long-term possibility), then I'll be happy to have them. Otherwise, I might do a little bit of teaching to keep library access and go and do proper paid contract work instead for a while.
The higher ed system looks in quite serious trouble to me right now, and I do think it is heading towards a transformative crisis in the medium term.
> "But what would be better? How can we create a better system? And if there's such an obvious better answer, why doesn't someone do it?"
Here's a simple answer: The better system exists, but people overlook it.
As a student, first get a Bachelor's degree.
Second, pick a field and be sure you have learned it well, at least at the Bachelor's level. Do this learning independently if necessary. A Bachelor's degree is supposed to teach you how to do at least this much learning independently.
Third, from that learning about the field you selected, learn some more, to 'the next level'. Likely do this independently.
Fourth, show up at any one of the better research universities and take the Ph.D. qualifying exams based on what you learned.
Fifth, stick around that university and attend some seminars and courses that are introductions to research given by experts in research. Here your work is largely independent.
Sixth, pick a research problem and get some good results, independently or nearly so. If there is any doubt about the significance of your research, then publish it.
Seventh, submit your research as your Ph.D. dissertation.
Congratulations: You are now out of school; you went all the way; you are educated. Done.
I have experienced how poorly CS classes prepare students for jobs in software engineering (I do realize they're not the same thing, but that's obviously the main degree we look for). I have interviewed people holding a master's degree with an emphasis in Java who were unaware of even the simplest details of how the JVM works (implementation details of the String class, JIT compilation).
I felt bad for this person; I wonder if it's too late for them to get a refund on that degree, because it sure as heck didn't increase their earning potential.
I too have a degree in CS, which required me to 'play' with Java. Most students on the course did not grasp the basic concepts of Strings and ints, let alone JIT. Most left with the ability to "avoid programming and the CLI at all costs" and are still working in jobs they could have done without a degree, paying £16-21K with zero career progression and/or training.
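As an aside, the JVM String details mentioned in this sub-thread are easy to demonstrate. A minimal sketch of string-pool interning (the class name is mine; the behaviour is standard JVM semantics):

```java
public class InternDemo {
    public static void main(String[] args) {
        // Compile-time constant strings are pooled: both literals
        // refer to the same object, so == compares true.
        String a = "hello";
        String b = "hello";
        System.out.println(a == b);          // true

        // new String() always allocates a fresh object with equal contents.
        String c = new String("hello");
        System.out.println(a == c);          // false
        System.out.println(a.equals(c));     // true

        // intern() hands back the pooled instance.
        System.out.println(a == c.intern()); // true
    }
}
```

This is exactly the kind of detail that trips up candidates who have only 'played' with the language: `==` on strings compares references, not contents.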
Final year CS student in a British University here.
I can certainly see where the author's coming from, although I don't find the situation this dire. I'm not British so I don't know that much about how Universities worked and were perceived in society in the past and I guess I might have a slightly different mindset. Anyway, let me explain myself.
We have this modular course structure at my school. Yes, you can pick courses varying from "Developing web applications with Java" to hard CS stuff like compiler design and advanced algorithms. As far as I can tell, no course has been dropped because it was too hard - because students like challenges and take them. The same goes for final-year projects: I've seen a student saying that she won't be doing any programming for her project (yeah, WTF), but I also know of more serious engineering projects (like a guy refreshing the electronics and, more importantly, the software of a popular home-built 3D printer - and using the new capabilities to do some stuff I'd really like to have on my 3D printer), and more experimental projects like mining Twitter for medical drug information (perceived effectiveness, side-effects, usage patterns, etc.).
What I'm trying to say is that there might be some easy paths you could take, but the students who always pick the easiest choice are usually the ones who end up failing or dropping out. Sure, some of them graduate, and I have mixed feelings about having the same degree as some of my fellow students. The author says that 'By pre-1990 standards about 20% of the students should have been failed.' Well, in my school about 20% of the students are failed - each year.
Another topic is grade scaling. Yes, most lab and coursework grades are scaled in the first year, and some in the second year. Exams are never scaled! But here's the thing: scaling is always down. It can be argued that labs are too easy if you need to scale the grades down, and it is frustrating to do a perfect job and end up with a 70-something percent mark. But grades are never scaled up to 'turn a fail into a II'.
Finally, some people argue that a formal CS education is useless and out of touch with reality. I definitely don't agree. Knowing algorithms and data structures can give you an edge even for simple programs; knowing that certain research areas and approaches even exist helps you avoid a lot of easy mistakes; labs help you design better and faster, because you develop your own process and get to know common pitfalls; reports and presentations train you to communicate better using the proper domain language; and having a clear image of how computers work from the ground up is great when you're debugging.
TL;DR: Formal CS education still useful, just more chances to shoot yourself in the foot. (Yes, I consider getting a first without learning too much to be shooting yourself in the foot.) Study the fundamentals and the hard stuff and you should be better off than a self-taught person.
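The "edge even for simple programs" from knowing data structures, mentioned above, fits in a few lines. A hypothetical sketch (class name and sizes are mine): the same membership test is a linear scan on a list but an expected constant-time lookup on a hash set:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class MembershipDemo {
    public static void main(String[] args) {
        int n = 1_000_000;
        List<Integer> list = new ArrayList<>();
        Set<Integer> set = new HashSet<>();
        for (int i = 0; i < n; i++) {
            list.add(i);
            set.add(i);
        }

        // Worst case for the list: the element we want is at the end,
        // so contains() walks all n entries. The set does one hash lookup.
        long t0 = System.nanoTime();
        boolean inList = list.contains(n - 1); // O(n) scan
        long t1 = System.nanoTime();
        boolean inSet = set.contains(n - 1);   // O(1) expected
        long t2 = System.nanoTime();

        System.out.println("list lookup: " + (t1 - t0) / 1_000 + " us");
        System.out.println("set lookup:  " + (t2 - t1) / 1_000 + " us");
        // Exact timings vary by machine; the gap grows linearly with n
        // for the list and stays flat for the set.
    }
}
```

Nothing here is beyond a first-year course, which is rather the point: the self-taught often reach for the list by default.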
It's more than 20 years since I graduated with a CS degree and I think my advice these days to anyone considering the subject would be to avoid CS unless you want to go on and do research level work in academia or industry. Of course, that's what I wanted to do before going to University and I did end up working as a researcher in academia for six years before co-founding a startup.
People keep thinking that a CS degree is a vocational training program for developers - they didn't use to be, and if that's what they have turned into, then it's no wonder that they are doing a terrible job.
[+] [-] UK-Al05|14 years ago|reply
Two opposing views, but they think the argument is both in their favour.
There is also a massive backlash against universities in the UK at the moment. I am seeing many 'apprentice' software developers, and freelance developers skipping university. They tell me they know more than the average university graduate. So I ask them about complexity? Nope. Predicate Logic? Nope. These are things taught in even the lowest polytechnic school and they don't know them. They don't know the bounds of their own ignorance because they think cs is just knowing a few programming languages, whipping up a few apps and websites. They know more languages the average graduate? Maybe. Can they do software engineering? The easy business stuff. Do they know cs? No. I think they have never been exposed to cs and don't how deep it goes.
I've seen the accredited software developer apprenticeship curriculum and it's mostly java and design patterns.
I think the focus on apprenticeships at the moment is creating a generation of software developers who don't know what NP-Complete means.
Computer science is not whipping up applications with fancy design patterns, that's why you're better than them. Give them a few years and their cs background + Software engineering experience will begin to shine.
[+] [-] demian|14 years ago|reply
It's like grouping "Industrial Design" and "Theoretical Physics" under the term "Physics", if "Industrial Design" also included "Product Managment".
[+] [-] wisty|14 years ago|reply
Depending on how the university works, students can cheat on homework, then barely pass the exam (thanks to easy, formulaic questions that they can cram the night before), and still get decent marks. The university will compensate with harder homework and an easier exam the next year.
[+] [-] wmf|14 years ago|reply
[+] [-] quizbiz|14 years ago|reply
[+] [-] buff-a|14 years ago|reply
Other people never do, do they? It explains everything. You need look no further.
[+] [-] jpdoctor|14 years ago|reply
This has also become true at many US universities.
You can't water down the requirements and maintain placement stats at the same level. Many companies will simply pull the plug on recruiting and hiring once they have a bad experience with lame recent grads.
The really sad thing: the universities take the student's money, and then leave the kid unemployed at the other end. All that student debt is not dischargeable in bankruptcy.
[+] [-] UK-Al05|14 years ago|reply
They were the education path of the elite, but we let a few smart working-class people in via the grammar schools (elite state schools) to keep the majority happy. Grammar schools selected at 11. The other way in was private school, if you had the money. My ex-school is now an ex-grammar school because selection was banned for state schools. Those who did not pass selection went on to vocational training at 11.
Note - Working class in the UK is roughly the same as lower middle class US.
The university system in the UK is still one of the best; we're a small country and have many universities in the top 100. However, it's gone down.
[+] [-] edtechdev|14 years ago|reply
A free version is online: http://mediacommons.futureofthebook.org/mcpress/plannedobsol...
Here are some quotes from an interview w/the author from http://www.insidehighered.com/news/2011/09/30/planned_obsole...
"Here are two ideas Fitzpatrick proposes to kill for good: Peer review is necessary to maintaining the credibility of scholarly research and the institutions that support it; and publishing activity in peer-reviewed journals is the best gauge of a junior professor’s contribution to knowledge in her field."
"Little in graduate school or on the tenure track inculcates helpfulness," she writes, "and in fact much militates against it."
"But to the extent that individual academics continue in their lust for “power and prestige” by vying for exclusive spots in elite journals, they should not be surprised to find themselves as irrelevant and moribund — indeed, zombie-like — as print monographs have already become, warns Fitzpatrick."
“If we enjoy the privileges that obtain from upholding a closed system of discourse sufficiently that we’re unwilling to subject it to critical scrutiny, we may also need to accept the fact that the mainstream of public intellectual life will continue, in the main, to ignore our work,” she says. “Public funds will, in that case, be put to uses that seem more immediately pressing than our support.”
[+] [-] frou_dh|14 years ago|reply
The reality was that the teaching was uniformly mediocre, I remained pretty clueless about the subject material, and I produced so-so quality work. I should have worked harder for my own curiosity, but there was a complete lack of external motivation because the academic standards were so low.
To paraphrase someone's quote: I wouldn't recommend a club that had me as a member.
[+] [-] tankenmate|14 years ago|reply
[+] [-] adam-a|14 years ago|reply
I distinctly remember arriving and working hard for the first semester; I got 80-something percent in one course and 90+ in the other, and had this awful realisation that I could glide through the first two years (of a 4-year program) where only a 50% passing grade was needed.
By the third year the courses did become much harder, but only relatively. Whilst money doesn't seem like the right way to restrict entry, I do feel they could have enforced higher entry standards, especially when I saw only ~30% of my fellow students make it to graduation.
[+] [-] rbreve|14 years ago|reply
[+] [-] 0684|14 years ago|reply
Well then, hypothetically, no institution you attend would be good enough.
[+] [-] sbarlster|14 years ago|reply
However, the student's attitude was poor. As an undergrad straight from school I did not self-learn and did not study - generally not motivated. I left in a recession with poor CS skills, could not find work, so I delivered pizzas, worked the call centres, manned beach car park huts - all fun, however.
My second degree was a one-year MSc in Software Engineering - this time focusing on the softer skills around methodologies, design, large-systems modularity, etc.
This time the student was motivated. I read, studied, wrote C++ and Java programs, delivered tasks on time and used my previous economics knowledge to build a dissertation on neural networks analysis of wholesale electricity prices. I left and went straight into a C/C++ job helping to build a mobile phone network planning system.
On both occasions the tools, environment and time were available. Just the student attitude differed. I saw what I wanted and went for it - just not the first time.
[+] [-] cdcarter|14 years ago|reply
[+] [-] rubidium|14 years ago|reply
I agree with your sentiment that college profs should care about teaching, but to be honest too much is asked of them. They're expected to be excellent researchers, inexhaustible grant writers, engaging teachers, inspiring mentors, and part-time entrepreneurs. Those who can achieve four of the five are still impressive, and more often than not something's got to give.
[+] [-] tansey|14 years ago|reply
Quite frankly, I'm fine with this setup. If you want to have a good undergraduate education provided to you by the professors, go to a school where they have that as a priority. If you want to get involved in pushing the boundaries of knowledge, then go to a place like MIT or Stanford and get involved in an undergraduate research program.
[+] [-] jseliger|14 years ago|reply
1) Professors are responding to institutional and student incentives: institutions reward research, so professors prioritize research over teaching.
2) Students are, for the most part, choosing easier degrees; as a result, those degrees are prospering, and many non-easy disciplines are watering down their criteria to attract remaining students.
3) The only real impetus for change that I can see comes from two areas: a) the amorphous group of employers who want better outcomes—but they have very little leverage and b) graduates who are unhappy to discover they have a great deal of debt and few marketable skills. Again, however, they have little leverage.
[+] [-] waqf|14 years ago|reply
But since both place a substantial weight on "graduation rate", the same effect is seen vis-à-vis teaching quality.
[+] [-] Lost_BiomedE|14 years ago|reply
[+] [-] pmb|14 years ago|reply
[+] [-] chalst|14 years ago|reply
Generally speaking, British lecturers are equivalent to associate professors, and assistant professors are untenured.
Most post-docs in proper academic departments are assistant professors, but this is not a hard and fast rule; post-docs in tenure-track programs nearly always are assistant professors.
[+] [-] john_flintstone|14 years ago|reply
Of course, I should have worked harder, and I would have learned more if I had, but I was 19 and 20 years old and it was just so easy to get a good degree without working at all. I know I was far from alone in this.
Every degree course is not the same, and no doubt others may have worked long hours to earn theirs, but my own experience has left me with no respect for UK degrees, to the extent that when I read CVs from candidates I consider a third level degree irrelevant.
[+] [-] nick_dm|14 years ago|reply
Before starting the masters in statistics I took some undergraduate (first/second year) maths/stats courses at the same university (while still studying for the accountancy degree and working 35 hrs/week) and the exams and assignments were probably easier than anything I saw in the first two weeks of my maths degree. Based on my performance on those courses and what I saw of the 3rd year undergrad courses (the masters students shared one with the undergrads, though we had a different exam) I would have cruised to a first without breaking a sweat, a pretty different experience to when I was struggling to pass my degree at all!
It does make it almost impossible to judge anybody's degree and grade unless you are already familiar with the course; the toughest courses are genuinely difficult, but the easiest courses are a walk in the park for anyone of reasonable ability.
[+] [-] tomaskafka|14 years ago|reply
[+] [-] waqf|14 years ago|reply
[+] [-] k-mcgrady|14 years ago|reply
University in the UK has been something that is just 'what people do'. Most people coming out of A-levels wouldn't even consider not going to University, especially because High Schools push it so hard (as it makes them look good). There needs to be more education in High Schools about the option of not attending University.
[+] [-] UK-Al05|14 years ago|reply
[+] [-] highace|14 years ago|reply
I'm a final year student on an Information Systems degree, and one of the modules I'm currently taking, "Web Application Development", is nothing more than an introduction to HTML and CSS. In fact we don't even have to produce anything aside from a few snippets. I feel like I've wasted my time and been ripped off, if I'm ever in the position to hire I would never employ someone based on a degree alone.
[+] [-] lhnz|14 years ago|reply
Most of the modules which I took were so watered down that they were absolutely useless to me. I knew this and I was pretty depressed at the time. I'm not very good at doing mindless work: some of the lectures I just stopped going to and other times I completely ignored the vacuous assignments I was being asked to do.
Looking back I wish I had dropped out and gone straight into a job with the programming skills I was teaching myself. (But I guess if I had done this I might not have learnt about fundamental CS concepts.)
I actually love learning but like to do it in my own way, of my own accord. I'm thinking of taking those online Stanford classes which are starting soon -- I guess the only thing that is missing from these is human conversation. I wonder if one day people might informally meet for coffee to discuss the online courses they're taking together. ;)
[+] [-] kmfrk|14 years ago|reply
[+] [-] remyroy|14 years ago|reply
I'm Canadian not British, but I do relate to everything that was said in this article. I completed my degree in 2004.
Great read.
[+] [-] InclinedPlane|14 years ago|reply
[+] [-] CJefferson|14 years ago|reply
The government decides how much to fund universities based on publication quality, which they rate based on the journals the papers are accepted into.
There is almost no benefit to teaching students better, and there are huge advantages to passing students who would otherwise fail. This is because universities have a strict limit on the number of students they can accept, and these are not replaced if students fail their first year.
So, to maximise income universities have to keep hold of students, while getting as many papers as possible into high quality journals.
[+] [-] rluhar|14 years ago|reply
My course was a four-year course. The total weighting for each year was 10-20-30-40 (years 1-4). The first two years had non-optional core CS modules (algorithms, logic, discrete maths, etc.) and the final two years had a lot of electives. If you could muddle through the first two years, you could take a series of electives in the final two years (foreign languages, accounting and finance, etc.) that were arguably much easier.
I got mediocre grades in the first two years, but good grades in the final two years, resulting in a first class degree overall.
I regret my choices, but as a lazy undergrad, I took the path of least resistance to achieve my target (a first class degree). I was not the only one who did this. The problem is that people like me made the University look good, so I think they made it very easy to game the system. The only things I worked really hard on were the programming assignments and projects. The exams were easy to pass since they had a very clearly laid-out pattern, and questions tended to be repeated year after year. If you could solve the exam questions from the 3-4 years before your final exam, you would probably ace it.
[+] [-] lambda|14 years ago|reply
Yes, the modern higher education system is not ideal. But what, in life, really is? That's not to say that we shouldn't pursue a better system, but we shouldn't give up on a system just because it's not ideal.
And with all of these doom and gloom stories, I have yet to see anyone offer an alternative. Yeah, there's a lot of paperwork involved in being a professor, and you get evaluated on criteria that don't quite line up with the ideal for being a great professor. But what would be better? How can we create a better system? And if there's such an obvious better answer, why doesn't someone do it?
If there is some obviously better system, I'd love to see it. If such a thing exists, it should be quite competitive with the current higher education system. No one wants to hire incompetent new graduates. No one wants to be one. So we should see something better, something that indicates there is some better way of doing this.
Instead, we see a steady stream of technological progress. I can do things that no one could do before, like carrying a device around that allows me to pinpoint my precise location, stream maps down to me, find me directions to wherever I want to go, and read those directions aloud in a synthesized voice, all for the price of 2% of median yearly income (including hardware, software, and the service). And that, of course, is not to mention all of the other things that are available to me.
Now, maybe I'm living in a bubble, built by people who got a proper education before all of this grade inflation and other nonsense. But really, this article is complaining about the last 20 years. A large portion of the people who are doing work in technical fields finished school within the last 20 years. And yet, we're still seeing significant progress; we are still living in a world that is tumbling into the future at a high rate.
So I want to know two things. For one, why are we still progressing so quickly, despite these apparent problems? How are we managing to innovate, if our educational and academic foundation is so unsound?
And for another, what is the solution? What do you propose we do better? If it's so much obviously better, why don't we do it? Or why doesn't someone, somewhere do it, and show significantly better results?
I think one reason is that when you are doing work in the top few percent of human ability, you look around and realize how ordinary it is. Even at the top, everyone has their flaws. No one is perfect. Systems designed to prevent people from cheating also prevent some people from doing amazing work. But overall, it isn't a few geniuses at the level of Mozart that we need; it's a lot of people doing work at a high level, but not what some might consider "genius". If you are immersed in it, it seems somewhat boring, but when it all adds up, it winds up opening new possibilities that were simply not available 10, 20, or 30 years ago.
[+] [-] notahacker|14 years ago|reply
If you believe the author, the worst thing that has happened to the education system is the requirement that lecturers actually produce academic output (due to the "envy" of the taxpayer subsidising them). As the author points out, some of this is less than seminal, but frankly I don't find convincing the implicit argument that the same academics would miraculously produce more valuable contributions to the world if not shackled by the requirement to actually justify the money being thrown in their direction. I've read some decidedly mediocre papers written before academics were obliged to get things published on a regular basis.
Also he complains about modularisation, because apparently higher education students aren't smart enough to choose their own areas of specialization. Sorry, but if the University of Leeds' course on webdesign in the 1990s was too easy, it's because the academics running the course were slow to embrace and understand the potential complexities of the web, not the fault of the bleddy gubment. Just be glad they didn't make it a "core" subject that everyone gets high marks on.
[+] [-] singingfish|14 years ago|reply
The higher ed system looks in quite serious trouble to me right now and I do think is heading towards a transformative crisis in the medium term.
[+] [-] HilbertSpace|14 years ago|reply
Here's a simple answer: The better system exists, but people overlook it.
As a student, first get a Bachelor's degree.
Second, pick a field and be sure you have learned it well, to at least the Bachelor's level. Do this learning independently if necessary. A Bachelor's degree is supposed to teach you how to do at least this much learning independently.
Third, from that learning about the field you selected, learn some more, to 'the next level'. Likely do this independently.
Fourth, show up at any one of the better research universities and take the Ph.D. qualifying exams based on what you learned.
Fifth, stick around that university and attend some seminars and courses that are introductions to research given by experts in research. Here your work is largely independent.
Sixth, pick a research problem and get some good results, independently or nearly so. If there is any doubt about the significance of your research, then publish it.
Seventh, submit your research as your Ph.D. dissertation.
Congratulations: You are now out of school; you went all the way; you are educated. Done.
[+] [-] shanemhansen|14 years ago|reply
I felt bad for this person; I wonder if it's too late for them to get a refund on that degree, because it sure as heck didn't increase their earning potential.
[+] [-] ugyuyguy|14 years ago|reply
[+] [-] ig1|14 years ago|reply
[+] [-] RyanMcGreal|14 years ago|reply
Nitpick: this is not a Chinese saying.
http://en.wikipedia.org/wiki/May_you_live_in_interesting_tim...
[+] [-] noderivative|14 years ago|reply
[+] [-] lgeek|14 years ago|reply
I can certainly see where the author's coming from, although I don't find the situation this dire. I'm not British so I don't know that much about how Universities worked and were perceived in society in the past and I guess I might have a slightly different mindset. Anyway, let me explain myself.
We have a modular course structure at my school. Yes, you can pick courses varying from "Developing web applications with Java" to hard CS stuff like compiler design and advanced algorithms. As far as I can tell, no course has been dropped because it was too hard - because students like challenges and take them. The same goes for final-year projects: I've seen a student saying that she won't be doing any programming for her project (yeah, WTF), but I also know of more serious engineering projects (like a guy refreshing the electronics and, more importantly, the software of a popular home-built 3D printer - and using the new capabilities to do some stuff I'd really like to have on my own 3D printer), or more experimental projects like mining Twitter for medical drug information (perceived effectiveness, side effects, usage patterns, etc.).
What I'm trying to say is that there might be some easy paths you could take, but the students who always pick the easiest choice are usually the ones who end up failing or dropping out. Sure, some of them graduate, and I have mixed feelings about having the same degree as some of my fellow students. The author says that 'By pre-1990 standards about 20% of the students should have been failed.' Well, in my school about 20% of the students are failed - each year.
Another topic is grade scaling. Yes, most lab and coursework grades are scaled in the first year and some in the second year. Exams are never scaled! But here's the thing, scaling is always down. It can be argued that labs are too easy if you need to scale the grades down and it is frustrating to do a perfect job and end up with a 70 something percent mark. But grades are never scaled up to 'turn a fail into a II'.
Finally, some people argue that a formal CS education is useless and out of touch with reality. I definitely don't agree. Knowing algorithms and data structures can give you an edge even for simple programs; knowing that certain research areas and approaches even exist helps you avoid a lot of easy mistakes; labs help you design better and faster because you develop your own process and get to know common pitfalls; reports and presentations train you to communicate better, using the proper domain language; and having a clear image of how computers work from the ground up is great when you're debugging.
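To make the "edge even for simple programs" point concrete, here's a toy example of my own (Python, function names made up): the same deduplication loop lands in a different complexity class depending on one data-structure choice.

```python
def dedupe_list(items):
    """Remove duplicates, preserving order. O(n^2): each
    membership test scans the 'seen' list from the start."""
    seen, out = [], []
    for x in items:
        if x not in seen:   # O(n) linear scan
            seen.append(x)
            out.append(x)
    return out

def dedupe_set(items):
    """Same result, O(n): membership tests against a hash
    set are amortised O(1)."""
    seen, out = set(), []
    for x in items:
        if x not in seen:   # amortised O(1) hash lookup
            seen.add(x)
            out.append(x)
    return out
```

On a few thousand items the difference is invisible; on a few million, the list version crawls while the set version finishes in seconds. A first-year data-structures course is exactly where you learn to see that before it bites.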
TL;DR: Formal CS education still useful, just more chances to shoot yourself in the foot. (Yes, I consider getting a first without learning too much to be shooting yourself in the foot.) Study the fundamentals and the hard stuff and you should be better off than a self-taught person.
[+] [-] arethuza|14 years ago|reply
People keep thinking that a CS degree is a vocational training program for developers - it never used to be, and if that's what degrees have turned into, then it's no wonder they are doing a terrible job.