item 2849849

Lost skills: What today's coders don't know and why it matters

61 points | jfruh | 14 years ago | itworld.com

74 comments

[+] schrototo|14 years ago|reply
Kris Rudin, Senior Developer and Associate Partner at digital marketing agency Ascentium, says, "One 'lost skill' that I see all the time with new developers -- how to debug without an integrated debugger. When I started programming (in 1986, just after punch cards, using a mainframe & dumb terminal), we didn't have any IDEs or debuggers; we had to put trace statements in our code to track values and execution.

"Today," says Rudin, "there are occasionally times when you can't use the integrated debugger in your IDE (usually with some weird web application frameworks and server configurations), and younger programmers are at a loss as to what to do, and resort to hack-and-slash coding, trying to fix a bug by guesswork. Me, I just calmly put in some code to display output values on the web page, find the bug, and fix it."

Oh come on. What he's describing isn't some "lost skill". If you can't figure out how to use print statements for debugging, you're not much of a programmer in the first place.

Most statements in that article describe really basic, common sense stuff.

[+] narcissus|14 years ago|reply
I work a lot in PHP, so it's kind of funny that I find that new developers in PHP are exactly the opposite: most of them don't know how to use IDE debuggers with PHP. Instead they have print statements all through their code and they end up veering away from any form of OO or 'complex' code, as it becomes hard to 'debug' this code when there's a problem.

Don't get me wrong: 'debugging by echo' is useful at times, but I would say that 9 times out of 10, a proper debugger is more valuable. In my case. YMMV. etc.
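For what it's worth, "debugging by echo" doesn't have to stay ad hoc. A minimal sketch in Python (the helper name and output format are made up for illustration):

```python
import sys

DEBUG = True  # flip off before shipping, or drive it from an env var

def trace(label, value):
    """Print a tagged trace line to stderr and pass the value through,
    so it can wrap an expression without changing program behavior."""
    if DEBUG:
        print(f"TRACE {label} = {value!r}", file=sys.stderr)
    return value

# Usage: wrap the sub-expressions you suspect.
total = trace("subtotal", 40) + trace("tax", 2)
```

Tagged lines like these are easy to grep for and easy to strip out once the bug is found.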

[+] k4st|14 years ago|reply
I think that it's useful to distinguish between code that one has an intimate knowledge of / intuitive sense of and code that one does not fully understand.

For example, I find printf-like debugging and using stack traces from Valgrind to be very useful for debugging my own code; however, at work, when I have no idea where the problem is, something like gdb or the debugger of Visual Studio really comes in handy.

[+] cpeterso|14 years ago|reply
In the proprietary, NIH world of embedded software, vendors often ship development tools whose quality ranges from flaky to totally broken. With tight deadlines to ship devices that will be obsolete within months, investing time and people to support robust dev tools across multiple platforms is deemed a low priority. printf() debugging is often the only reliable (and portable) option. :(
[+] skrebbel|14 years ago|reply
I'm sorry, but an article that starts with an authority argument as follows:

> Bernard Hayes, PMP (PMI Project Mgmt Professional), CSM (certified Scrum Master), and CSPO (certified Scrum Product Owner).

makes me not want to continue reading much further.

To clarify: you become a CSM and a CSPO by taking a 2-day course each (well, OK, there's an online exam, but you can look up the answers on Wikipedia and pass). PMP is the only certification in that list that takes some effort, but it's totally unrelated to software and, thus, to most of this article's subject matter.


[+] quanticle|14 years ago|reply
>Other low-level skills that today's engineers aren't getting, according to Carl Mikkelsen: "programming tight loops, computational graphic approximations like a circle algorithm, machining cast iron, designing CPU instruction sets, meeting real-time constraints, programming in the inefficient zone - recovering some of the last 30% of potential, and analog design, such as audio amps."

What, do they also expect us to cut, polish and etch our own silicon wafers as well?
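(Aside: the "circle algorithm" mentioned in the quote is presumably something like the midpoint/Bresenham circle algorithm, which rasterizes a circle using only integer arithmetic. A rough Python sketch, for the curious:)

```python
def midpoint_circle(cx, cy, r):
    """Integer-only circle rasterization (midpoint/Bresenham style).

    Computes points for one octant and mirrors them eight ways;
    the error term tracks how far we've drifted off the true circle.
    """
    points = set()
    x, y, err = r, 0, 1 - r
    while x >= y:
        # Mirror the octant point into all eight octants.
        for dx, dy in ((x, y), (y, x), (-y, x), (-x, y),
                       (-x, -y), (-y, -x), (y, -x), (x, -y)):
            points.add((cx + dx, cy + dy))
        y += 1
        if err < 0:
            err += 2 * y + 1
        else:
            x -= 1
            err += 2 * (y - x) + 1
    return points

pts = midpoint_circle(0, 0, 5)
```

No floating point, no trig: exactly the kind of trick that mattered when multiplies were expensive.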

[+] mechanical_fish|14 years ago|reply
For the record: Everyone who calls themselves an engineer should have tried to machine cast iron (or some kind of metal; better to start with aluminum) at least once. And everyone should know how silicon is made, hands-on if possible (which it generally isn't; fabs are expensive).

Why? Because these things are awesome, and form the foundation of our culture.

Now, having said that: No, having hands-on experience with these things doesn't really help with programming. ;)

[+] ColinWright|14 years ago|reply
It quotes our own bensummers:

    Ben Summers, Technical Director at ONEIS, a U.K.-based
    information management platform provider, points out
    that "habits learned when writing web applications for
    14.4kbps dial-up telephone modems come in rather handy
    when dealing with modern day mobile connections. When
    you only had a couple of Kbytes per second, and latencies
    of a few hundred milliseconds, you were very careful to
    minimize the size of the pages you sent, and just as
    importantly, minimize the amount of back and forth with
    the server."

    With today's mobile connections, says Summers, "the
    latency is much worse than using a telephone modem
    connection, and that's compounded by error rates in
    congested areas like city centers. The fast 'broadband'
    headline speeds are pretty irrelevant to web applications.
    It's the latency which determines how fast the response
    time will feel, and tricks learned when phone modems
    ruled the world come in awfully handy. As a bonus, when
    someone uses your app on a fixed connection, it'll feel
    as fast as desktop software!"
[+] bensummers|14 years ago|reply
I was shocked to learn that I was an "industry veteran and/or seasoned coder".
[+] dspillett|14 years ago|reply
One thing I find coders who have not been through a formal course like University (and even some that have...) lack is an understanding of basic complexity scaling issues.

And by basic I mean not understanding the different performance implications of a table scan, an index scan and an index seek in an SQL query plan. And also why "it takes ages first time, but is quick after that (when everything is already in RAM)" is usually not acceptable (every time could be the first time around if the query isn't run often or RAM is limited).
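The scan-versus-seek distinction can be shown in miniature. A Python sketch (a sorted list plus bisect standing in for an index; real query planners are far richer):

```python
import bisect

# A pretend table of 10,000 rows, sorted by id.
rows = [(i, f"user{i}") for i in range(10_000)]
ids = [rid for rid, _ in rows]  # pretend this "index" is prebuilt

def table_scan(key):
    """O(n): touches every row until it finds the key."""
    for rid, name in rows:
        if rid == key:
            return name
    return None

def index_seek(key):
    """O(log n): binary search on the sorted index."""
    i = bisect.bisect_left(ids, key)
    if i < len(ids) and ids[i] == key:
        return rows[i][1]
    return None
```

Same answer either way; the difference only shows up at scale, which is exactly the point about complexity.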

Some of the stuff that article lists is just not needed at all by a coder, really. Some are strictly hardware issues. Others are oddly specific: "programming tight loops" is part of the complexity-theory thing: understanding how a process will behave at relevant scales and optimising accordingly.

[+] Ixiaus|14 years ago|reply
I agree with this - but it is more than that too; I've undertaken a self-study of machine fundamentals as well. That is something almost all web application programmers lack (those that don't come from EE/CS, that is). By machine fundamentals, I mean how instructions are executed on the processor, how memory works, how different kinds of work can be optimized either for the CPU and its cache or for memory, what the difference is between a 64-bit bus and a 32-bit bus, what's so special about the PCI-Express and AGP buses, and how I/O actually works and what random seeks on the disk are (yay for SSDs).

The biggest gap, IMHO though, is a lack of knowledge about big O: what quadratic behavior is and why it can be bad, &c... That goes hand-in-hand with a lack of knowledge of algorithms. Why is bubble sort considered bad? What's a generator? Why does everyone keep saying to use xrange() in Python? Why is it bad to use list concatenation?
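Those last two Python questions illustrate the big-O point directly. A sketch (note: in Python 3, range() is already lazy, which is what the Python 2 xrange() advice was about):

```python
def build_by_concat(n):
    result = []
    for i in range(n):
        # Allocates and copies a fresh list on every iteration: O(n^2) overall.
        result = result + [i]
    return result

def build_by_append(n):
    result = []
    for i in range(n):
        # Amortized O(1) per element: O(n) overall.
        result.append(i)
    return result
```

Both produce the same list; only the asymptotic cost differs, which is invisible until n gets large.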

[+] kstenerud|14 years ago|reply
This smacks of a "kids these days" article.

I remember similar articles in the 80s and 90s bemoaning how "programmers these days" didn't know how to use a protocol analyzer or logic probe, or didn't know that xor made for a faster register clear operation (or moveq for 68k fans), or any other number of esoteric trivia that, while useful in context, did not usually contribute significantly towards a programmer's ability to get the job done.

My first debugger was an in-circuit emulator for a Z80. It was the size of a small television set, had a crappy UI, and limited functionality. Today's debuggers can be hosted on the system itself, and have become so powerful that most people don't know how to use them to maximum effect (myself included). IDEs check your code as you type. No more writing something in vi, compiling, tracking down the cryptic error messages your compiler spat out, and trying to figure out where the REAL error is because the compiler is dumb. You're shielded from the ugliness underneath, and for 99.9% of cases that's more than enough.

Do "kids these days" really need to know the sound of a hard drive dying? The last drive I heard going bad was in the 90s. Since then drives have become so quiet that you'd need a stethoscope to even hear the arm thrashing (which is why I use RAID). And how useful is the knowledge that you can open up a frozen drive and spin it up with your finger going to be as disks are replaced by SSDs?

We live in the future, where things have gotten a LOT better. Do the new batch of developers really need to know assembly language? After moving to "fluffy" languages, I only twice found need to use it (once to disassemble a stack dump from a JNI crash, and once to monkey patch a buggy device driver). Twice in all my years since using Java, Python, PHP, Objective-C, Scheme, COBOL, VB, and C#. Was it damn handy to have the right skill at an opportune time? Hell yeah. Does EVERYONE need this skill? Hell no.

How about bit packing? Memory has become so cheap and plentiful that even routers come with 16MB or more. Beyond low level networking and peripheral protocols, what use is there in packing up bits and coming up with clever encoding schemes that make for complicated (and potentially buggy) codec routines? Saving one byte in a packet header is hardly the triumph it once was.
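For readers who never had to do it, here's the kind of thing "bit packing" means, sketched in Python (the field layout here is invented for illustration):

```python
def pack_header(version, flags, length):
    """Pack three small fields into a single byte:
    version in bits 7-6, flags in bits 5-4, length in bits 3-0."""
    assert 0 <= version < 4 and 0 <= flags < 4 and 0 <= length < 16
    return (version << 6) | (flags << 4) | length

def unpack_header(b):
    """Recover the three fields by shifting and masking."""
    return (b >> 6) & 0x3, (b >> 4) & 0x3, b & 0xF
```

Clever, compact, and a magnet for off-by-one-bit bugs: exactly the trade-off the comment is describing.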

All the "kids these days" need to know is their algorithms, profiling, debugging, multithreading issues, and how to write well structured, maintainable code in their paradigm of choice. The rest is usually industry specific, and can be learned as-you-go.

Don't worry about the kids. The kids are alright.

[+] zwieback|14 years ago|reply
"Assembly language, interrupt handlers, race conditions, cache coherence," says Jude Miller, long-time system consultant and industry curmudgeon.

This is what I do every day at my job and there are other people who know how to do this, plenty of them. Most of them are EE's, though, and consider SW/FW development as their secondary job.

It isn't easy to hire embedded programmers. Graduates with computer engineering, EE, or CS degrees will have some knowledge, but it's experience more than anything that will help you acquire these skills. When I look at resumes I look for experience building small circuits, Arduino or PIC. If I don't see anything like that, or if the skill list starts with Java, PHP, ... it would be kind of unfair to expect detailed knowledge about how to use "volatile" in C or how to use a scope to find race conditions.

[+] cube13|14 years ago|reply
>"Assembly language, interrupt handlers, race conditions, cache coherence," says Jude Miller, long-time system consultant and industry curmudgeon.

The funny thing about this is that the CS degree I got covered all of these to a fairly decent extent. I wasn't an expert on any of those topics when I graduated, but I had enough of a background working with them that I wasn't completely in over my head when I ran into all of that at work (I work on high-throughput, low-latency financial messaging APIs).

It's interesting work, but it's a very technical skillset, because you do need to understand all the various issues that can pop up.

[+] jmaygarden|14 years ago|reply
I agree with all your points, but just want to add that I've seen very ugly code from EE types (I'm an EE). An outstanding board design and audio/video engineer I worked with avoided loops whenever possible because they "made the code confusing." However, his FPGA code was immaculate. I chalked it up to a parallel-versus-sequential thought process that mirrors the differences between hardware and software.
[+] arohner|14 years ago|reply
"I see poor understanding of the performance ranges of various components"

I'm totally guilty of this. I write new code on new hardware, and have very little intuition about how fast it should go. Is 10k ops a second good? 1M? I just don't know. Of course, then I pull out the profiler and think about my algorithm, but it takes a lot of second-guessing to decide how close to the limit I am.

For example, I was writing some Clojure code to write to a SQL database. I'm relatively new to the JVM stack. I was writing to the DB at 1 MB/s. I thought, "well, that's not great, but not bad. Maybe after network traffic and DB constraints, and writing to a laptop disk drive, I suppose that's alright." Nope: I replaced the JDBC connection pooling driver, and the same code now writes at 8 MB/s.

It'd be nice if there were a web resource for general guidelines on what it takes to max out hardware. Basically, benchmarks for real-world tasks.
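Short of such a resource, a crude first-pass number is easy to get. A sketch in Python (deliberately rough; for serious work, reach for a real profiler or timeit):

```python
import time

def ops_per_second(fn, min_seconds=0.2):
    """Call fn in a loop until min_seconds elapse; return calls per second.
    Crude on purpose: just enough for an 'is this sane?' baseline."""
    count = 0
    start = time.perf_counter()
    while time.perf_counter() - start < min_seconds:
        fn()
        count += 1
    return count / (time.perf_counter() - start)

# Usage: measure a toy workload to calibrate your intuition.
rate = ops_per_second(lambda: sum(range(1000)))
```

Comparing numbers like this before and after a change (say, swapping a connection pool) at least tells you which order of magnitude you're in.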

[+] thirdstation|14 years ago|reply
"It'd be nice if there were a web resource for general guidelines on what it takes to max out hardware. Basically, benchmarks for real-world tasks."

Hear, hear!!

I had the same thought when reading the first two pages of the article. I'd love to be able to better intuit performance (or heck, troubleshoot slow systems - which I do more often). The problem I encounter is a lack of accurate and understandable information about the underlying hardware and the various layers between my program and the hardware (especially important for me lately, as more of my stuff runs in a VM).

It seems like you need to be lucky and find a mentor willing to teach this esoteric material.

[+] KirinDave|14 years ago|reply
It's unfortunate the folks who edited this couldn't differentiate the real aces from the people pining for the days of the waterfall model.
[+] nadam|14 years ago|reply
I love speed optimization more than the work I am mostly paid for at my workplace (usual business applications). I am also better at what I love. Unfortunately, no one really wants to pay me for optimizing the hell out of an algorithm. They want me to maintain their boring Java enterprise applications. I've searched for such tasks here at Hacker News also; no one was interested. Not a single company. So I don't think it is a skill which is really in demand today.
[+] iam|14 years ago|reply
At most jobs I've been at so far, "optimizing code" involved digging through the entire codebase with profilers and instrumentation for weeks at a time trying to find performance bottlenecks. Then maybe, if you were very lucky, you got to fix it; but more often than not it was an architectural bottleneck and you'd be SOL. Best case, a domain expert would get to fix it.

Then again this is dealing with system-wide performance. Application-specific performance should be significantly less of a problem to fix.

I think the jobs that are simply "here is this function, make it 10x faster" would be pretty rare, since usually people don't know what part of the code is going slow. A lot of the time they'll guess "X, Y, Z is making it slow," but without a real performance analysis, patching stuff all over the place just doesn't pan out.

[+] mattmanser|14 years ago|reply
Try embedded systems programming, speed is usually pretty crucial there.

I've met a guy who runs a company of about 6 programmers doing this, has more work than he can handle and has difficulty finding good enough programmers. I think they're mainly C++ but were recently trying to find a C# guy.

So it's in demand, but you've got to know where to look.

[+] jswinghammer|14 years ago|reply
I would be happy if more than 1 in 10 interview candidates managed to pass my most basic programming questions. I consider my standards too low for the type of jobs I interview people for, and I'm still disappointed routinely.
[+] kahawe|14 years ago|reply
How basic are those questions if you don't mind me asking for details?
[+] ominous_prime|14 years ago|reply
This still comes back to the basic guideline: "learn your fundamentals". You don't need to learn them all (designing CPU instruction sets??), but you do need to know the primary layers you interact with. A web programmer generally doesn't need to be aware of the machine code generated by his application, but should be knowledgeable in networking (layer 3 and up), caching, databases, and so on.
[+] StavrosK|14 years ago|reply
I'm a web programmer, and, although it wasn't necessary per se, designing and implementing a CPU was one of the most fun things I ever did.
[+] ff0066mote|14 years ago|reply
I was caught off guard by the first line, where the author uses PHP as one of the examples of an environment out-of-touch with hardware issues.

I started to learn how to program in PHP. Back then there was a sentiment that PHP and similar high level scripting languages weren't real programming.

With the web so ubiquitous today, I didn't think that sentiment had survived, but here it is.

[+] ominous_prime|14 years ago|reply
Simply because you may be an exception, it doesn't disprove the general case. Many self-taught programmers (which PHP brought a rush of) lack programming fundamentals. They are blissfully unaware of the implications of the von Neumann architecture, and don't understand how their actions translate down through the various layers that support their code. If this had been written 10 years ago, the author would likely have cited Perl or VB.
[+] mvanga|14 years ago|reply
I agree with some of the points the author was trying to make although I don't think he really expressed them with the right arguments.

I think our culture was and will always be based on exploration and innovation and this is simply moving to higher levels of abstractions today. There is nothing wrong with this.

However, I personally am not satisfied with simply being able to use an abstracted interface. I have a strong curiosity about how things work under the hood, a desire to tinker with things to make them do something new, and even to try to rewrite them in simpler forms.

I think a different kind of hacker evolves when you have a basic understanding of the entire technology stack. This breed is inevitably going to fade with the increasing complexity of this entire stack (breadth and height) paired with the current speed of innovation.

In the end, we can lament all that we are losing or work towards everything that lies ahead unexplored :)

[+] numeromancer|14 years ago|reply
People like this author and the people whose complaints he propagated convinced Socrates that the Oracle was right--because he was the only person who was aware of his own ignorance.
[+] alexk7|14 years ago|reply
I disagree. If you are able to achieve your goals without going low-level, please do it. If you need to understand low-level stuff to get the job done, please start learning. But please don't fall for the "every programmer should know this" meme.
[+] MaxPresman|14 years ago|reply
Feels like the author of this article (and the people he quotes) are super bitter about the "new generation". Perhaps things did not work out quite so well for them, but there is no need to blame it on the kids : /
[+] grantismo|14 years ago|reply
Today's hardware isn't as much of a bottleneck as it used to be. That's the simplest explanation for fewer programmers with low-level knowledge.