Is there a book somewhere that tries to set out all of the things that experts know about computing that they don't remember learning?
(In Zed Shaw's conception, this might correspond to "learn computing the hard way".)
I see his examples and other examples here in this discussion, and it makes me wonder about the value (or existence) of a very thorough reference.
I've also encountered this when working with lawyers who wanted to have a reference to cite to courts about very basic facts about computing and the Internet. In some cases, when we looked at the specifications for particular technologies or protocols, they didn't actually assert the facts that the lawyers wanted to cite to, because the authors thought they were obvious. I remember this happening with the BitTorrent spec, for example -- there was something or other that a lawyer wanted to claim about BitTorrent, and Bram didn't specifically say it was true in the BitTorrent spec because no BitTorrent implementer would have had any doubt or confusion about it. It would have been taken for granted by everyone. But the result is that you couldn't say "the BitTorrent spec says that" this is true.
Another example might be "if a field is included in a protocol and neither that layer nor a lower layer is encrypted with a key that you don't know, you can see the contents of the field by sniffing packets on the network segment". It might be challenging to find a citation for this claim!
So we could also wish for a "all our tacit knowledge about computing, programming, and computer networking, made explicit" kind of reference. (I'm not sure what kind of structure for this would be most helpful pedagogically.)
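To make the sniffing example concrete, here is a minimal Python sketch of what "seeing the contents of the field" looks like in practice (it assumes a Linux machine with raw-socket support and root privileges, and it just dumps whatever arrives rather than parsing any particular protocol):

```python
# Minimal packet-sniffing sketch (Linux only, needs root).
# Any field that isn't encrypted at this layer or below shows up
# as plain bytes on the wire; here we just dump whatever arrives.
import socket

ETH_P_ALL = 0x0003  # capture every protocol on the interface

with socket.socket(socket.AF_PACKET, socket.SOCK_RAW,
                   socket.ntohs(ETH_P_ALL)) as s:
    frame, _ = s.recvfrom(65535)
    # The Ethernet, IP and TCP/UDP headers, plus any unencrypted
    # payload fields, are all readable right here.
    print(frame[:64].hex(" "))
```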
I would find some resource of assumed knowledge useful in my career field of chemistry and biology, but what really resonated with me from zedshaw's piece was something even more basic to the field. The biggest hurdle I had to learning to code was finding a text editor. The next big hurdle I had was finding somewhere to host my one-page HTML site. I ended up asking a friend who suggested Sublime Text and Amazon AWS. At that point I'd been reading HN for two years and had heard of these things, but not understood their central utility (or the utility of similar such general services) to doing anything with code. This is the level of beginner that I would hope zedshaw's efforts would target, someone like me from last year.
I want to emphasize that while learning abstract functions, and understanding that code syntax is an abstraction for electrons moving through logic gates, are fundamental concepts for an early programmer, learning those concepts was less frustrating for me than finding somewhere to write the text I had learned from Codecademy.
I am a chemist by formal training, I took an extremely abstract multivariable calc course in college that taught me Big and Little O, and functionalized concepts most people learn by rote "drills," and I consider learning new concepts my strongest career skill. I don't mean to humblebrag here, but rather to refute the only-somewhat-popular sentiment I've seen on HN that non-coders "can't get shit done." No. I am a non-coder that does shit.
In moving from chemistry to neurobiology and biophysics, there are basic skills that can't be found in a textbook, like _this is a pipette_, and _this is a flask to grow cells_, and if you don't know those things you won't be able to do experiments, and you'll fail the informal subtext of an interview.
The best resource I've found (in three years of reading HN anonymously) for analogous tool-teaching in code has been Michael Hartl's book on learning Rails, so thanks again mhartl! The first two chapters of that resource were more treacherous (but ultimately well-guided and successful) than teaching myself d3.js. A true, zero-level, adult beginner's guide to some code (manipulating an Excel sheet in Python, writing an API-interacting tweet bot) would be a great boon to people like me.
Charles Petzold's book 'Code' has a lot of this very basic, low-level information. It basically builds up from basic information theory to computers. It's not going to have everything you're looking for but I was surprised how much of it I "knew" without recalling where I learned it or how it connected to other things.
> Another example might be "if a field is included in a protocol and neither that layer nor a lower layer is encrypted with a key that you don't know, you can see the contents of the field by sniffing packets on the network segment". It might be challenging to find a citation for this claim!
The term of art here (as used in patent law, for instance) would be "person having ordinary skill in the art". Something like that (if you don't encrypt something in a network protocol, it can be sniffed) is "obvious to a person having ordinary skill in the art".
But yeah, that does make it difficult to cite sources for them. And in particular, it's difficult to throw the book at someone who doesn't take those things as obvious when there's no such book.
Another "obvious" thing that I don't know a citation for: it's impossible to store a piece of information in a binary such that it cannot be read by a person who has a copy of that binary. (Practical corollary: there's no such thing as a piece of information that's too secret to include in an Open Source driver, but that can be included in a binary driver.)
I don't know of such a book that's targeted to experts, but you can get a good idea of those things by skimming the first 2 or 3 chapters of Zed's [Learn Python the Hard Way](http://learnpythonthehardway.org/book/ex0.html), and Appendix A.
I was recently working on a case and I needed a definition of "software library". Couldn't find one anywhere. So many of the basics of programming are undefined that it makes arguing about programming incredibly difficult -- see every discussion of what a functional language is, or strong vs weak typing.
> things that experts know about computing that they don't remember learning?
After a certain point, you can learn a lot about something by teaching it, because doing so forces you to re-evaluate things which you've internalized and forgotten.
This is good. I've had problems that were somewhat related to what the author talks about.
When I was learning C# I was already quite fluent in C/C++, and I had a big problem with the C# type system. I'd been reading guides that were in the first category the author mentions, e.g. "not really a beginner, but new to this language".
I was trying to retrieve the bytes that a certain string represented. I was looking for ages, and everywhere everyone mentioned that "this shouldn't be done", "just use the string", etc. A Stack Overflow answer mentioned a way to use an 'encoding' to get the bytes, and this seemed to be the only way.
How strange, I thought: I just want access to a pointer to that value, why do I have to jump through all these hoops? None of the guides I was reading provided an answer, until I found a _real_ beginner's book. This book, helpfully starting at the real beginning of every language, the type system, finally gave me the answer I was looking for:
.NET stores and handles all strings in one internal text representation, and to get bytes out of a string you have to go through an encoding. It turned out that the whole notion of 'strings are only bytes' that I carried over from C++ does not work in C#. All those other helpful guides gleefully glossed over this and started right in at lambdas and integration with various core libraries, instead of focusing on the basics first.
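The same text-versus-bytes split exists in Python, and it makes the point compactly: a string is a sequence of characters, and you only get bytes by choosing an encoding (this snippet is just an illustration, not the C# case itself):

```python
# A string is text, not bytes; the bytes only exist once you pick an encoding.
s = "héllo"

print(len(s))                      # 5 characters
print(s.encode("utf-8"))           # b'h\xc3\xa9llo'  (6 bytes)
print(s.encode("latin-1"))         # b'h\xe9llo'      (5 bytes)
print(len(s.encode("utf-16-le")))  # 10 bytes, two per character

# Which is why "just give me the bytes of the string" is not a
# well-posed question until you say which encoding you mean.
```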
This has nothing to do with learning a programming language and everything to do with learning how to process text in a computer. Being a C programmer doesn't mean you have only a PDP-11's understanding of text ("it can be ASCII or EBCDIC, and I know how to convert between the two!").
When I learned C# (in 2003?), I learned that String is an array of Char, that Char is 16-bit, and that .NET used the same encoding as Windows NT (UTF-16 in native endianness).
I knew that both WinNT and Java made the mistake of being designed at a time when people assumed 16 bits are enough and consequently caused the surrogate pairs mess. I knew that Java assumes UTF-16BE and Windows assumes UTF-16LE. I knew what UTF-16 means in C/C++ and how to work with it or transform such data to and from UTF-8 and UCS-4.
When learning a new programming language, I know to look up whether strings are mutable and whether they're sequences of bytes, code units or code points. If they're immutable, I look up how to create a copy instead of a reference when using substring, and when they're not byte arrays I look up what real byte arrays are called in this language.
Should early programmers be taught this? Absolutely. At what stage? I don't know. But they must be taught from the start that this has nothing to do with a programming language and everything to do with how data is represented in memory.
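For anyone who wants to see the code-point/code-unit distinction rather than take it on faith, here is a small Python sketch (Python is used only because it makes the byte-level view easy to print):

```python
# Code points vs. code units, using a character outside the BMP.
ch = "\U0001F600"          # one code point (an emoji), U+1F600

print(len(ch))                           # 1  (Python counts code points)
utf16 = ch.encode("utf-16-le")
print(len(utf16) // 2)                   # 2  (UTF-16 needs a surrogate pair)
print(utf16.hex(" "))                    # 3d d8 00 de -> D83D DE00 little-endian

# Languages whose strings are UTF-16 code units (Java, C#, JavaScript)
# report a "length" of 2 for this one character.
print(len(ch.encode("utf-8")))           # 4 bytes in UTF-8
print(len(ch.encode("utf-32-le")) // 4)  # 1 code unit in UCS-4/UTF-32
```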
I'd actually consider that kind of knowledge pretty advanced. Beginners (and early up to even junior coders) usually don't know much about the internals of their environment; they just use stuff.
I'm always interested in the internals; but it's often surprisingly hard to find information on the internals. There are few books, and you'll often need to read lots of source code and specifications and reverse engineer things to find out how stuff works under the hood.
That's only "basics" if you've got the wrong idea. There are millions of possible mistakes, no beginners' guide can explicitly address every one. People told you to just use the string - wasn't that a good enough answer?
I've been teaching coding to beginners for the past year now...and even after having done coding workshops/tutorials for many years previous, I've found I can never overestimate how wide the knowledge gap is for new coders.
Yesterday I was talking to a student who had taken the university's first-year CS course, which is in Java...she complained about how missing just one punctuation mark meant the whole program would fail...While I can't passionately advocate for the use of Java in first-year courses (too much boilerplate, and the OOP part is generally just hand-waved-away)...I've realized that the exactness of code must be emphasized to beginners. And not just as something to live with, but something to (eventually) cherish (for intermediate coders, this manifests itself in the realization that dynamic languages pay a price for their flexibility over statically-typed languages).
Is it a pain in the ass that missing a closing quotation mark will cause your program to outright crash, at best, or silently and inexplicably carry on, at worst? Sure. But it's not illogical. Computers are dumb. The explicitness of code is the compromise we humans make to translate our intellectual desire to deterministic, wide-scale operations. It cannot be overemphasized how dumb computers are, especially if you're going to be dealing with them at the programmatic level...and this is an inextricable facet of working with them. It's also an advantage...predictable and deterministic is better than fuzziness, when it comes down to doing things exactly right, in an automated fashion.
I think grokking the exactness of code will provide insight into the human condition. While using the wrong word in a program will cause it to fail, we perceive human communication as being much more forgiving of not-quite-right phrasing and word choices. But is that true? How do you know, really? How many times have you done something, like forget to say "Please", and the other person silently regards you as an asshole...and your perception is that the transaction went just fine? Or what if you say the right thing but your body (or attire) says another? Fuzziness in human communication is fun and exciting, but I wouldn't say that it's ultimately more forgiving than human-to-computer communication. At least with the latter, you have a chance to audit it at the most granular level...and this ability to debug is also inherent to the practice of coding, and a direct consequence of the structure of programming languages.
A good analogy I've heard to explain this is how you'd request a glass of water from the kitchen from a friend versus a computer. You can simply tell your friend "get me a glass of water" and they'll understand what you're asking. With a computer though, you must be completely explicit with your instructions, for example: walk to the kitchen, open the top left cabinet, take out a glass, put it underneath the faucet, turn the faucet on until the glass is 80% full... etc.
Thinking about the OA, your student, the people who don't know how to find the | character, and about what people tinker with now.
The OA refers to more experienced programmers forgetting that they typed programs from computer magazines and soaked up the basics that way - that was a specific point in history, wasn't it: the early 1980s, with BASIC listings to type in and try to save to cassette. I'm older and did BASIC exercises line by line on a teletype connected via an acoustic coupler and a modem (19" rack with a dial on the front) to a remote mainframe. And yes, we got line noise sometimes. And we learned about the need for exactness.
Have we reached a point in history where the machines are so shiny there is no way in? Should we give people recycled laptops with a command line linux install and suggest that they have to assemble their own UI out of bits and pieces of old school window managers? A prize for the most way out desktop?
"Is it a pain in the ass that missing a closing quotation mark will cause your program to outright crash, at best, or silently and inexplicably carry on, at worst? Sure. But it's not illogical. Computers are dumb. The explicitness of code is the compromise we humans make to translate our intellectual desire to deterministic, wide-scale operations. It cannot be overemphasized how dumb computers are, especially if you're going to be dealing with them at the programmatic level...and this is an inextricable facet of working with them. It's also an advantage...predictable and deterministic is better than fuzziness, when it comes down to doing things exactly right, in an automated fashion."
And yet so much code has so many bugs, defects and errors in it. If writing code was as mechanical and deterministic as you think it is (and should be?), then why does so much production code suck?
This is pretty ridiculous, man. I don't think I know a beginner programmer who would be so stuck on "every character matters." (Which isn't even true, to some degree, in many languages - semicolons in JavaScript and Python? Whitespace in languages besides Python?)
The way I would explain it is to have them imagine writing a tokenizer and interpreter for a simple language themselves. That's what the intro CS class I took at Berkeley, 61A, had us do with a subset of Lisp, with a lot of help, of course. I don't think we needed to know how to use anything but strings, functions, and arrays, although it did involve recursion. This problem will never come up again once they realize there's code reading their code. Of course it's arbitrary. (The project, in case you're curious: http://www-inst.eecs.berkeley.edu/~cs61a/fa14/proj/scheme/)
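For flavor, here is a very stripped-down sketch of that kind of project in Python: a toy tokenizer and evaluator for a tiny Lisp-ish arithmetic language (nothing like the scope of the real 61A project, just enough to show "there's code reading their code"):

```python
# A toy tokenizer and evaluator for a tiny Lisp-like language.
import operator

def tokenize(src):
    # Surround parens with spaces, then split: "(+ 1 2)" -> ['(', '+', '1', '2', ')']
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    tok = tokens.pop(0)
    if tok == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(parse(tokens))
        tokens.pop(0)  # drop ')'
        return expr
    return float(tok) if tok.replace(".", "", 1).isdigit() else tok

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul, "/": operator.truediv}

def evaluate(expr):
    if isinstance(expr, float):
        return expr
    op, *args = expr
    return OPS[op](*[evaluate(a) for a in args])

print(evaluate(parse(tokenize("(+ 1 (* 2 3))"))))  # 7.0
```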
I actually worked on teaching my 71-year-old father Python using this book. One point of difficulty that struck me during that exercise was that I, as a programmer, had completely internalized the idea that an open paren and a close paren right after a function name is the natural way to invoke a function with zero arguments (e.g. exit() exits Python's prompt; exit doesn't). The whiplash of going from finding the questioning of the convention silly to finding the convention itself silly was amusing to feel. Like it makes sense to a parser, but not to a flesh-and-blood, contextual-clues-using human. We don't vocalize "open paren close paren" whenever we say an intransitive verb. We just "know" that it's intransitive. Anyway, great article.
It is a silly convention, really. Algol-60 didn't require them, and neither did any language in the Algol family (Pascal, Ada, Modula-2, etc). It used to be something peculiar to Fortran and C, but today every language imitates C...
But there's a good reason for that convention in any language with first class functions. Otherwise you would have something inconsistent like "no parens required if the call is on a line by itself, but they are required in any other expression" (i.e. assignment or another function call).
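A quick Python illustration of why the empty parens earn their keep once functions are first-class values:

```python
# Why the () matters once functions are values you can pass around.
def greet():
    return "hello"

f = greet      # no parens: f is now another name for the function itself
g = greet()    # parens: call the function, g is the string "hello"

print(f)       # <function greet at 0x...>
print(f())     # hello
print(g)       # hello

# If a bare name on its own line meant "call it", there would be no way
# to say "hand me the function, don't run it yet".
```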
I can still visualize what it's like to know nothing, because when I saw a BASIC program for the first time when I was ten, I thought the = signs denoted mathematical equality (equations). How the heck can X be equal to Y + 1, if in the next line, Y is equal to X - 2?
Later, I tried using high values for line numbers just for the heck of it. Can I make a BASIC program that begins at line 100,000 instead of 10? By binary search (of course, not knowing such a word) I found that the highest line number I could use was 65,000 + something. I developed the misconception that this must somehow be because the computer has 64 kilobytes of memory.
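The more likely explanation is that the interpreter stored line numbers in a 16-bit field; the 64 kilobytes of memory and the 65,000-odd line-number ceiling both fall out of the same number, 2^16 (this is a guess about that particular BASIC, but the arithmetic is easy to check):

```python
# The "65,000 + something" limit almost certainly comes from line numbers
# being stored in a 16-bit field, not from the machine having 64 KB of RAM.
# (Both numbers come from the same place: 2**16.)
print(2**16 - 1)   # 65535, the largest unsigned 16-bit value
print(2**16)       # 65536 bytes = 64 KB, the size of a 16-bit address space
```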
The only important trait I see that matters for either of these groups is a willingness to try things, push buttons, see what happens.
A beginner worries about breaking the computer and doesn't yet understand that any question they have can be typed into a search engine verbatim and will probably be answered with 20 SO posts and 50 blog posts. An early programmer is stumbling down this road.
I don't know that this ethos can be communicated with a book.
I would also recommend that beginners/early programmers learn 1 programming language really well, and ignore the din of people on the internet who claim to effortlessly, expertly jump among 10 languages as part of their day-to-day.
> I would also recommend that beginners/early programmers learn 1 programming language really well
That's a dangerous approach. The first language is very hard to learn, because, well, it's your first. And when you stick to one language, you easily conflate the syntax and the semantics.
So when you learn a second language, you have to unlearn the syntax of the first, in addition to learning the genuinely different concepts. Distinguishing the similar stuff in new clothes from the actual new stuff is hard. Simply put, learning the second language will be very hard as well.
Now your programmer has two data points, and knows in their gut that learning a new programming language is hard. This sets expectations, and will make it harder to learn additional languages. It will take some time to realise that learning a new language, besides a few insane exceptions like C++, is not that hard.
> ignore the din of people on the internet who claim to effortlessly, expertly jump among 10 languages as part of their day-to-day.
Jumping from language to language may not be that easy. But one can certainly be an expert at 20 programming languages. Once you see the commonalities, there isn't much to learn. Really, a good course in programming languages is enough to get you started. The hard part is memorising 20 big programming frameworks, with all their warts, special cases and so on. Still, if you know the concepts, learning the vocabulary takes little time.
Just a side note to your "will probably be answered with 20 SO posts":
I am currently going through Learn Python the Hard Way, and at the end of one exercise I was doing the extra credit stuff (research online all the Python formatting characters). I typed in the question verbatim; one of the top results was from SO, so I went there to check it out, and one of the first answers was:
"So you are going through Learn python the Hard Way and are to lazy to find the answer"
I mean, I understand some of what that person was trying to say, there is some basic etiquette you should follow when asking questions online, but I am very sure a beginner would not know it. And the question wasn't even "Tell me all the python formatting characters" it was asking where to find a list of them. Another answer did point them to the python docs, but I felt that whoever asked the question would be hesitant about using SO again when they have another problem, which is a shame.
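For anyone else hitting that exercise, the kind of thing the docs list looks like this (a handful of the printf-style formatting characters; the full table is in the Python documentation):

```python
# A few of the old-style string formatting characters the exercise asks about.
print("%s" % "text")      # string
print("%d" % 42)          # integer
print("%5.2f" % 3.14159)  # float, width 5, 2 decimal places ->  3.14
print("%x" % 255)         # hexadecimal -> ff
print("%r" % "text")      # repr() of the value -> 'text'
print("%d%%" % 50)        # a literal percent sign -> 50%
```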
> and ignore the din of people on the internet who claim to effortlessly, expertly jump among 10 languages as part of their day-to-day.
The problem with this isn't whether or not that claim is true, but rather that (as you say) it's not the right thing for a beginner. It's perfectly fine (desirable, even, IMHO) for an intermediate-to-expert programmer to know many (different) languages and use a few of them regularly (although outside of JS and whatever you use on the backend, daily is probably a bit much), but this is all distraction for a beginner. Beginners should focus on learning programming concepts, not on learning programming languages (but one language is a requirement, otherwise you can't learn by doing!). Although, Zed's point about learning the basics of four languages still holds - you can do that and then focus on one.
I disagree with this entirely. You can "push buttons and see what happens" for decades without guidance on what buttons you're pushing and why. In addition, newcomers absolutely cannot just "type [their question] into a search engine verbatim" -- this is actually a highly advanced skill that you get after years of learning the correct patterns, abstractions, and jargon you need to get an effective answer for what you're looking for.
For example, a beginner might phrase queries like "jquery how do I make text appear on screen", whereas a more experienced person might query something like "jquery element insert value".
Have you read Zed's books? I think this "ethos" is captured pretty nicely in them. Most chapters end asking the reader to solve a few challenges, some of these are difficult for a beginner, and it is recommended that you spend some time researching the answer.
Zed Shaw is a natural when it comes to teaching beginners. I recommend his "Learn The Hard Way" books to everyone who is interested in learning to code because they make zero assumptions and start at the VERY beginning. It's stupidly hard to find great books for complete noobs.
I'm totally behind this distinction, and I hope more content publishers adopt something like this.
I was bitten by this as well, I thought the book was for an "early programmer" not a total beginner.
Hindsight and all, it seems the book would have been better titled "Learn to Program the Hard Way (using Python)", or "Learn to Program the Hard Way (using Ruby)". A total beginner is really trying to learn how to build a program, not trying to learn a particular language (whether they know that or not).
I think it took me three years to understand what a variable was.
And I still don't know why it took me so long to understand and why I suddenly understood it.
It's not that I didn't know that assigning '1' to 'a' would result in 'a' having a value of '1', but I didn't understand the concept and workings behind it. I just thought it was magic.
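The "concept and workings behind it" can be shown in a few lines of Python: a variable is just a name pointed at a value, and re-pointing the name doesn't touch values that other names already refer to:

```python
# A variable is a name attached to a value; the value doesn't "live inside" the name.
a = 1        # the name a now refers to the value 1
b = a        # b refers to the same value a currently refers to
a = 2        # a is re-pointed at a different value...
print(b)     # ...but b still refers to 1
```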
IMHO, Zed is right. I have been looking for books targeted to beginner programmers so I could recommend them to my friends, but most books unfortunately fail on this point.
A notable exception I found is "Learn You a Haskell for Great Good!". It is as good for beginning coders as it is for early (or advanced) ones.
The author made the effort to describe some relatively basic things, and it was simple enough (okay, with a few calls to me here and there) for an Art major friend of mine to start with programming, and with Haskell. I can't recommend this book enough.
I feel like I'm perpetually stuck between what the author describes as "beginner" and "early". I understand what programming is, I can write a bash script that does what I want it to (granted, I have to read a ton of man pages to make sure I understand what it is I want to accomplish), I can write simple programs in Visual Basic or Python or Javascript that do simple tasks. I understand program flow, logic, and all the basics of high-school level algebra.
The problem is, I can't wrap my head around many of the concepts I read about here in the HN comments and elsewhere on programming blogs and such. No matter how much I try to understand it (and by understand it, I mean fully grasp what the person is talking about without having to look up every other word or phrase), I can't seem to put it all together. Things like inverted trees, functional programming (I've heard of Haskell and I'd love to learn it, but I have no head for mathematics at that level), polymorphism, and so on.
Maybe I need to just practice more; maybe I need to pick something interesting from GitHub and dive into the code to try to understand it better (preferably something well documented, of course). Or maybe I need to just stop, and accept that I can whip out a script or simple web thingy if I really need to, and stick to being a hardware guy, which I'm actually good at.
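For what it's worth, some of those scary-sounding terms are smaller than they look. Here is a Python-flavored sketch of two of them, polymorphism and higher-order functions (only an illustration, not a substitute for really studying them):

```python
# Two of those scary-sounding ideas in a few lines of plain Python.

# Polymorphism: the same operation works on different types of thing.
for value in ["hello", [1, 2, 3], {"a": 1}]:
    print(len(value))        # 5, 3, 1 -- len() doesn't care what it's measuring

# A taste of functional style: functions that take functions as arguments.
def twice(f, x):
    return f(f(x))

print(twice(lambda n: n + 3, 10))   # 16
```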
I have been thinking this for years.... though I would consider myself an "early coder" according to the article.
This stuck out to me as being just the beginnings of the quintessential issue:
> A beginner’s hurdle is training their brain to grasp the concrete problem of using syntax to create computation and then understanding that the syntax is just a proxy for how computation works. The early coder is past this, but now has to work up the abstraction stack to convert ideas and fuzzy descriptions into concrete solutions. It’s this traversing of abstraction and concrete implementation that I believe takes someone past the early stage and into the junior programmer world.
But why stop at just "beginner", "early", and "advanced"? All of the books I have on programming are either truly "beginner" or blankly labeled as a programming guide, when in actuality they are quite "advanced"...nothing in between.
If, as the article states, 4 is the magic number of languages to learn up front, perhaps there should be a fourth level of programming guides...one for the journeyman who knows the syntax, can articulate the complex algorithmic issues that need to be addressed, but isn't quite at that "mastery" or "advanced" level.
I think that's the stage I'm at right now (or getting there). But what is the divide between "junior" and "senior/advanced" programmers? And what would help somebody (me) push across that boundary?
Programming is a frustrating job: you're pretty much doomed to be a beginner forever. It's part of what makes it exciting day in and day out, but it can also be overwhelming.
No, there's definitely an underlying substrate of significant commonality between the various programming languages and technologies. If you're at 10 years in and you still feel like a beginner, you're doing something wrong.
Obviously I can't expect to pick up a brand new technology and instantly expect to be a wizard, but I do expect that I can pick up a new technology and be functioning at a high level in a week or two, tops, because it's almost certainly just a respelling/reskinning of some technology I've used before.
(The whole "young guys who know way more than their old-fogey elders" was, in my opinion, an isolated one-time event when we transitioned from mainframe tech to desktop tech. Despite its recurrence on HN, I think "age discrimination" is receding and will just naturally go away as the people on this side of that transition continue to age, and skill up.)
I think the point of the article is that you are not a beginner forever. You are a beginner when you know absolutely nothing about programming. After years of experience, no one should be an actual "beginner", by the author's standards, because you understand programming conceptually. Being a programmer can be frustrating for a lot of reasons, but not understanding the basics of coding should probably not be one.
Last fall I went through a coding bootcamp in Toronto. It was 9 weeks of hard work sprinkled with lots of frustration and lots of feel good successes.
A main takeaway I had was that everyone comes in with a different background and everyone has a unique approach to learning.
The problem expressed in this article is a fundamental bottleneck of education. The communication between teacher and student is often misinterpreted at both ends and the subject matter is never perfectly conveyed or received.
I feel what's really lacking in the learn-to-code community is teaching people how to actually learn.
Lay down a positive attitude towards failure and a framework for problem solving first; the content and understanding of a language will come after.
No. Just change the name of the book to "Learn Programming The Hard Way (Python Edition)". By putting the language in the title, it sounds like it is for an experienced programmer learning a new language, not for learning how to program.
This is a fantastic article and is another great example to pile on as to why Zed Shaw is the king of programming teaching.
One area I struggle with in tutoring is how to inspire/invoke/detect disciplined motivation. What I mean is, whenever I sit down to show someone something, I'm constantly questioning myself "wait, do they actually want to learn this level of detail, or am I just giving too much information that's going in one ear and out the other?" If someone is definitely motivated to learn that's great (and really inspiring for me as a teacher to do better at explaining things precisely).
If this nomenclature were more understood, I would like to say something like "Sorry, what you're trying to do is more of an early/junior task, and right now you need to stick with the Beginning basics". I just don't know how to phrase that without sounding condescending.
Zed points to a very real problem - it's easy for us to forget what we know. But there's another problem with targeting beginners. They are all over the place in terms of experience.
Computing is so tightly woven into our world now that it's hard to find people who have more than a passing interest in it who haven't found some way to try coding as kids. Even with those who haven't, there's a gulf between people who have tried to do a little HTML editing (and know what a file is) and people who haven't. There's no one place to start. From Zed's description it looks like he's starting from the lowest possible point, but what are the demographics like there? How many people are in that space and are they mostly adults or children?
I think this is one of the main reasons why you don't see much beginner's material.
"My favorite is how they think you should teach programming without teaching “coding”, as if that’s how they learned it."
I often wonder about this. In the UK, with the drive to get every child 'coding', there are a large number of teachers that constantly talk about how the main skill that we should be teaching is 'Computational Thinking'.
I go back and forth over this topic, in a very chicken-and-egg way. However, I usually end up coming to the conclusion that learning computational thinking is great, but you need to know how to code (i.e. learn the basic syntax of a language) before you can possibly learn how to think computationally.
I would be very interested to hear what actual developers think on the topic.
Part of the problem is that there are too many things each language can now do. Every single language wants feature parity with every other language. Every single language wants to do everything.
This means an expert in one language is going to be "Beginner" instead of "Early" in some ways...but "Early" instead of "Beginner" in other ways.
Anecdotally, as a software engineer working with C++, I had to spend a whole month trying to understand the event-driven programming of other languages. I didn't really need tutorials on loops and recursion, but I sure as hell needed to understand how a typical program in that language works.
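A bare-bones sketch of what that shift feels like, in Python rather than any real framework: instead of writing the main loop yourself, you register callbacks and something else decides when to call them (the EventLoop class here is made up for illustration):

```python
# Bare-bones sketch of the event-driven style: you don't write the main loop,
# you register callbacks and the framework calls you when something happens.

class EventLoop:
    def __init__(self):
        self.handlers = {}

    def on(self, event, callback):
        self.handlers.setdefault(event, []).append(callback)

    def emit(self, event, data):
        for callback in self.handlers.get(event, []):
            callback(data)

loop = EventLoop()
loop.on("click", lambda data: print("button clicked at", data))
loop.on("key", lambda data: print("key pressed:", data))

# In a real framework these events come from the OS / browser / network;
# here we just fire them by hand.
loop.emit("click", (10, 20))
loop.emit("key", "q")
```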
I am an instructor for Software Carpentry[1]. The goal of these workshops, in my experience, is to try to help mostly scientists get started on the journey to becoming early programmers.
In the biological sciences, with more and more data becoming available, the expert blindness Zed speaks of is a major problem. We need to invent better systems and actually take heed of research-based teaching methods, as Software Carpentry does, if we wish to improve this situation.
[1] https://software-carpentry.org/