There are plenty of 'non-programmers' I have met who know how to write SQL and write small embedded programs for Microsoft Excel. Why did they know this? Because knowing how to do these things was accepted as a 'plus' (by whomever they worked for). (I imagine employees being empowered to do this cut down on the number of IT guys the company would be forced to hire for menial tasks others didn't want to or couldn't do.)
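To make "non-programmers writing SQL" concrete, here's a minimal sketch of the kind of one-off question an office worker might answer themselves (the table and values are invented for illustration; Python's stdlib sqlite3 stands in for whatever database they'd actually query):

```python
import sqlite3

# In-memory database standing in for a company's sales data
# (table name and values are hypothetical, for illustration only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("East", 120.0), ("West", 80.0), ("East", 30.0)],
)

# The kind of question a non-programmer answers with SQL instead of
# asking IT to pull a report: total sales per region.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('East', 150.0), ('West', 80.0)]
```

That's programming by any reasonable definition, even if the person writing it would never call it that.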
If these same people had the same expectations placed on them to know how to use emacs, vim, git, a terminal emulator, etc. to write scripts (or whatever) for what their job entailed, I firmly believe that they would have been able to do that just as well.
In their minds they would still not classify themselves as programmers (even though they would be programming). So is programming literacy actually down? Or is the perception of what programming literacy is simply off?
Programming literacy would have to be geared towards personal empowerment rather than vocational training in a 'public education' scenario. Which is interesting, because most programmers in the industry are simply people who came to enjoy the empowerment of programming (many times years before they ever had their first programming job), while other people (who don't enjoy that empowerment/activity as much) tend not to work in the industry (or even if they wanted to, they tend not to get hired).
This is what makes the world of vocational programming more a world of culture rather than strictly a world of able professionals (not to say that the people working in the industry aren't able). Come on, you don't honestly think that the Hollywood depiction of "hackers" is based on nothing, do you?
The industrial gearing towards a 'programming culture' is also what keeps certain people away. Although I think a strictly 'neckbeard' culture (or however you want to define it) is slowly fading away over time.
All that being said, not everyone reads. And many people (to this day) have no desire to use that empowerment even if they are able (as in use the literacy they have to take the time to go out and learn/read about new/important things).
Logo is all the evidence you need that anyone can program. When I was in 4th grade our computer lab had a bunch of machines that ran an implementation of Logo. We were required to perform exercises in the program and then we could draw whatever we wanted with the time left in the session. By the end of the hour, there wasn't a kid in the room who wasn't imagining up something to generate with those crufty commands.
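For readers who never met Logo: the commands drove a "turtle" around the screen. A toy interpreter (a sketch to give the flavor, not any real Logo dialect) shows how little there was to it:

```python
import math

def run_logo(commands):
    """Interpret a tiny Logo-like command list and return the
    turtle's final (x, y) position. Angles are in degrees."""
    x, y, heading = 0.0, 0.0, 0.0  # start at origin, facing 'east'
    for op, arg in commands:
        if op == "forward":
            x += arg * math.cos(math.radians(heading))
            y += arg * math.sin(math.radians(heading))
        elif op == "right":
            heading -= arg
        elif op == "left":
            heading += arg
    # Round away float noise; adding 0.0 normalizes -0.0 to 0.0.
    return round(x, 6) + 0.0, round(y, 6) + 0.0

# Walk a square: four sides, four right turns -> back at the origin.
square = [("forward", 50), ("right", 90)] * 4
print(run_logo(square))  # (0.0, 0.0)
```

A fourth-grader can hold that entire mental model in their head, which is exactly why it worked.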
One potential problem when trying to execute that vision is that programming and writing are, in fact, two very different activities.
When you are writing, you can make spelling mistakes, grammatical mistakes, replace words you don't know by other words or even pictograms, etc. Writing is a fluid medium, and it is ultimately interpreted by other human beings, who react very well to that fluidity.
When you are programming, however, you are addressing a computer. Computers deal very poorly with ambiguity or approximation.
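A one-character example of that gap: a human reader glides right past a misspelling, but a program stops cold on the same kind of slip (the function here is purely illustrative):

```python
def greet(name):
    return "Hello, " + name

print(greet("Ada"))  # Hello, Ada

# A typo a human editor would shrug off is a hard failure here:
try:
    print(gret("Ada"))  # misspelled function name
except NameError as e:
    print("The computer refuses to guess:", e)
```

The computer won't meet you halfway; the writer's fluid medium becomes a brittle one.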
Perhaps things like IFTTT are the closest we'll get to "everyone in the world can program". Developing for the web requires understanding stacks and stacks of abstractions, because if you want to interact with the web in a way that is more meaningful than posting comments on YouTube, you have to know how those stacks interact with each other.
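What makes IFTTT approachable is that it collapses all of that to wiring a trigger to an action. The whole model fits in a few lines (a toy sketch of the idea, not IFTTT's actual API):

```python
# Each rule is just (trigger predicate, action). The user never sees
# the loops, types, or web stack underneath; they only pick the pairs.
log = []
rules = [
    (lambda event: event["type"] == "new_photo",
     lambda event: log.append("save " + event["name"] + " to backup")),
    (lambda event: event["type"] == "rain_forecast",
     lambda event: log.append("send umbrella reminder")),
]

def handle(event):
    for trigger, action in rules:
        if trigger(event):
            action(event)

handle({"type": "new_photo", "name": "cat.jpg"})
handle({"type": "rain_forecast"})
print(log)  # ['save cat.jpg to backup', 'send umbrella reminder']
```

Everything hard (authentication, polling, the services themselves) is hidden behind the predicates and actions, which is both the appeal and the ceiling.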
The notion of "task-oriented tools" is not new - but how many non-programmer Mac users have ever used AppleScript? If the author of the post had shown his friend how to use AppleScript to batch-rename pictures etc., instead of Ruby, she wouldn't have asked "But what would I use this for?". But would she have used it? The closest we got to that would maybe be something like HyperCard.
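For concreteness, the batch-rename task in question is only a few lines, shown here in Python rather than AppleScript or Ruby (the file and prefix names are made up for the demo):

```python
import os
import tempfile

def batch_rename(folder, prefix):
    """Rename every file in `folder` to prefix_001.ext, prefix_002.ext, ...
    in sorted order - a sketch of the post's batch-rename task."""
    for i, name in enumerate(sorted(os.listdir(folder)), start=1):
        ext = os.path.splitext(name)[1]
        new_name = "%s_%03d%s" % (prefix, i, ext)
        os.rename(os.path.join(folder, name), os.path.join(folder, new_name))

# Demo in a throwaway directory:
folder = tempfile.mkdtemp()
for name in ("IMG_4512.jpg", "IMG_4513.jpg"):
    open(os.path.join(folder, name), "w").close()
batch_rename(folder, "vacation")
print(sorted(os.listdir(folder)))  # ['vacation_001.jpg', 'vacation_002.jpg']
```

The task is trivially small; the point of the article is that everything around it (finding a tool, a terminal, an interpreter) is where people bounce off.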
I'm not convinced that this gap can be avoided, even though I would very much like it to not be there.
That being said, you can make the argument that people already know how to use computers: architects use CAD tools, photographers use Photoshop, accountants use Excel, musicians use Fruity Loops etc. In all those examples, they're using tools to manipulate abstractions that they could have never had 50 years ago. So maybe that's the form "everyone can program" takes.
These are good points. However, one of the main things that I've seen drive people from programming is frustration. Not just with setup tools but with their own code.
So I would add ineffective "error handling" as one of the main "gaps" that prevent people from trying anything beyond basic programming.
This makes sense. How learnable is an application that gives you unintelligible feedback when you use it incorrectly? Would your web app be as popular if it just showed users raw server errors when form inputs were invalid?
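Here's the contrast made concrete: the same invalid input, surfaced raw versus translated into something a beginner can act on (the field and messages are invented for illustration):

```python
def parse_age_raw(value):
    # Invalid input surfaces as a bare ValueError traceback -
    # the "raw server error" experience.
    return int(value)

def parse_age_friendly(value):
    try:
        return int(value)
    except ValueError:
        # Translate the machine-level failure into actionable feedback.
        raise ValueError(
            "Age must be a whole number, but got %r. "
            "Try something like 34." % value
        )

try:
    parse_age_friendly("thirty-four")
except ValueError as e:
    print(e)  # Age must be a whole number, but got 'thirty-four'. Try something like 34.
```

Most programming environments give learners the `parse_age_raw` experience by default, which is exactly the frustration described above.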
Unfortunately, this is a very tough thing to fix - it's difficult to explain to someone with some arbitrary level of knowledge why an insanely complex abstracted system isn't behaving in the way they expect it to. I think successful approaches might combine ideas from adaptive learning, omniscient debuggers, fast feedback editors, sophisticated type systems, visual languages, and even "realtime" stackoverflowish peer support networks.
It's possible to do much much better than what we currently have - but it isn't going to be easy!
Sometimes when someone asks me for advice for how to learn to program the conversation goes like this:
Them: I want to learn to code
Me: What do you want to code? A game? A robot?
Them: I don't know, I just want to program.
I think this is a bit like the issue this article talked about, but coming from the learner not the teacher. In a way people who want to learn need to have an opinion too!
My experience with this as a kid: I wanted to make games, but I didn't know how to program, so rather than looking for a way to learn programming, I looked for a way to make games without programming. The tool I used taught me programming anyway eventually, but that was my mindset.
...except that, with literacy, we don't talk about "task-oriented reading" or "no-fuss setup". The intrinsic value of literacy is widely recognized. Where resources permit, people are willing to spend time learning to read and write, and societies are willing to allocate significant resources to teaching these skills.
Why should programming be different? Is it really that much to ask for someone to invest (gasp!) hours of time setting up an environment? Is it unreasonable that a person might have to seek help getting started?
Reading is inherently "task-oriented" and has "no-fuss setup": you're enjoying a story or learning something.
The equivalent of the problems in this article for reading would be trying to teach someone about the details of typesetting, ligatures, etc. or what sounds letters make, without applying it to reading real sentences.