
Bill Gates on the future of education and programming

54 points | wx196 | 12 years ago | gigaom.com

31 comments

[+] zanny|12 years ago|reply
Now this is completely hive mind, unnecessary, off topic, and not really related to the OP, but:

> I frankly don’t see that much of a downside.

I think Windows taking over the world from 1995 to 2008 has destroyed an absurd amount of hacker interest in computing devices. I'm talking about those born around 93 - 96 who grew up during complete Microsoft dominance. There is no (good) terminal, until recently the development environments and toolchains were behind paywalls or simply not included, and terrible habits like "reformat when something breaks" emerged because of how undocumented and opaque a lot of DOS/NT's behavior was. By raising a generation on closed platforms, Microsoft kept them from ever realizing the inherent mutability of these devices' internal systems, and I think this promoted a huge amount of the computer illiteracy we see rampant today. Microsoft did give people what they wanted - brainless, easy computing that takes no thought and was effectively consumable and disposable - but at the cost of a lot of engineering potential, compared to what might have been if they had distributed a tinker-able sandbox rather than a black box. Rather than being knowledgeable about the workings of their devices (which are more and more taking over their lives), people are dependent on them but know nothing about them beyond how to smack the keyboard or tap the Facebook button.

Bill has done a lot of good in education outside this, but the undercurrents of the Microsoft takeover of consumer electronics for two decades will have lasting negative implications on computing for probably an entire generation. We don't know what the alternative might have been, but I know from my peers (I'm 21) that there is an absurd amount of illiteracy about and apathy toward these devices, because they were raised on Microsoft products and expect a device to work or just get replaced, rather than hacking on it to fix it. This doesn't even start on how the majority of web devs seem to be 25 - 40, explicitly because they grew up on Netscape, telnet, etc. and not IE. I see a firm line right around XP's release, when the entire browser space collapsed into IE: everyone I know who is currently 15 - 20 had a significant drop in web tech interest as a result.

> “Anybody who thinks getting rid of [patent law] would be better … I can tell you, that’s crazy,” Gates said. “My view is it’s working very well.”

Patents seem to still work (due to their short duration), so I'm not arguing against patents, but copyright has destroyed a supermajority (I see estimates in the ballpark of 95%) of the media and content created over the last hundred years, because it all died and all copies were lost while still outside the public domain. There is a reason all modern media takes its roots from 16th - 19th century works - that is the only place you can draw on without landing in a lawsuit minefield.

However, I see no reason at all why all this nonsense can't be abolished and culturally we could move towards a systemic crowdfunding approach where people propose ideas, everyone invests in the creation of their ideas, and the result is inherently public domain. The creator eats, the public benefits from any idea someone may have, and we don't end up with a huge fraction of culture and innovation lost under a rug of time.

I love Bill Gates for the good he does with his money, but I'm not going to blindly agree with him just because he's a genius or because he's rich and popular. I think Microsoft did a lot of systemic societal damage, and that IP law is completely out of control and unnecessary in this day and age.

[+] derefr|12 years ago|reply
You are making a classic economic error here: seeing a company with good marketing that grew huge by selling people something, and assuming that it is that particular company's fault for causing the market to want that something. A much simpler explanation for the evidence is that people already wanted that thing but had nowhere to get it, and that the company grew huge by responding to untapped demand.

People, by-and-large, like magic. People want appliance-like technology that they don't have to understand. It is not Microsoft's fault (or Apple's, or Google's, or IBM's, or Nintendo's, for that matter.) People have wanted appliance-like computers since before there were computers.

The computer-like devices in speculative fiction novels from the early 20th century (written by, and for, the nerdiest of nerds) almost never resembled "general-purpose computers"; they mostly envisioned something roughly combining Skype, a wiki, and Siri. Engelbart's Mother of All Demos is a good example of what people were dreaming about: fluid communication and information sharing/retrieval. "Programmability" was never really a feature given much concern, even by the most idealistic futurist. People just don't want programmability. They don't lust after it like they do other whiz-bang features. They don't dream of a world where your tablet can be used to write programs that run on your toaster.

As for why...

As far as I've observed, programming a computer is, to most people, a mind-numbing consideration; a complex task, with a lot of rules to understand, facts to memorize, considerations to hold in one's head, and requiring heavy rigor. It's Engineering, in other words.

Most people aren't engineers; there is a specific mindset required, and most people don't seem to like being in that mindset, let alone staying in it long enough to get good at using it. It seems to come down to "logical thinking" (or "System 2" thinking[1], or "Formal operational" thinking[2].) From what I've seen in trying to teach people to program, for most people it hurts--possibly to the point of causing an actual headache--to engage in rigorous, conscious logical analysis. People both try to avoid it for as long as possible, and try to escape it as quickly as possible once they've started.

We--I don't know which "we", maybe "hackers" or "engineers" or "nerds" or some other term--we here don't seem to have this problem. Logical thinking comes easily to us; we get jobs requiring almost constant use of it, and then we engage in it in our off-hours as well. We might never stop using it. We are a minority.

This is why I say that "programming literacy" is a bad metaphor for becoming skilled at programming: "literacy" is something that (excepting learning disabilities) everybody is equally capable of attaining. Imagine if 90% of the population--forced to practice reading for years--could still only manage to read five or six words per minute, and doing so for 20 minutes or so caused them to need to take a walk outside to "clear their head." That's what happens to the "bottom hump" of the bimodal distribution in a CS101 class--and they're the ones prequalified by being in a CS101 class; the population at large does even worse.

So, there is no reason the average computer needs (serious) programming tools pre-installed on it, since there is no reason to expect the average person to want to be a programmer. Sure, programming tools should be made available to anyone who wants them, from a young age--as should supplies for any other craft a child might want to explore (paints, a keyboard, clay, a chemistry set, etc.) But every computer these days has the internet--and most of them even have app stores!--so it's easy enough to find, and then download, something like Shoes or Scratch if you want to play around and learn. I could even see including a learning environment like that (or maybe something more like Hypercard; if only~) with the OS, for the same reason Windows ships with MSPaint, and OSX ships with Garage Band.

But shipping Visual Studio, or XCode, or even gcc, with a computer, is more like giving everyone AutoCAD than giving them MSPaint. It's silly and overkill: only 0.1% of the population would even know what it was if they accidentally managed to get it open. Of those, most of them wouldn't be at a level to understand it anyway; they would need something easier to start with.

(All of this is separate from the "the system relies on the compiler toolchain to build or update parts of itself, so it needs to be there" argument. You might need to ship gcc with your OS, and might even use it to compile packages the user installs (e.g., in Gentoo), but that doesn't mean the user should be aware of--or care--that compilation is going on. Nobody notices that the .NET framework ships with a compiler (ngen.exe) which is then used to locally compile all the .NET stdlib modules in the background after installation; it just happens. You don't think "I have a CLR compiler on my system" when you install the .NET framework; you only think that once you've installed MSVC, because that gives it a user-accessible UI.

The same goes for "system runtimes"--the only thing anyone is recommended to do when their OS includes a copy of Ruby (usually for the sake of executing system scripts written in Ruby, not for the user) is to install a fresh copy through RVM and avoid the "system Ruby" like the plague. If it's so distasteful to the language's programmers, why are we as OS builders even exposing it to them? Stick the binaries in sbin instead of bin; that's what it's for.)
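The same user-copy-versus-system-copy split exists in other runtimes, and Python makes it easy to see from inside the interpreter. A minimal sketch (Python rather than Ruby, purely for illustration): a virtual environment's interpreter reports its own prefix, distinct from the base installation it was created from, which is exactly the isolation RVM gives Ruby users.

```python
import sys

# Inside a virtual environment, sys.prefix points at the venv while
# sys.base_prefix still points at the underlying installation.
# Outside any venv the two are equal, so this flag is False.
in_venv = sys.prefix != sys.base_prefix

print("running inside a virtual environment:", in_venv)
print("interpreter prefix:", sys.prefix)
```

The point of the pattern is the same as the parent comment's: the user-facing copy and the OS's internal copy are kept apart, and only the former needs to be visible.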

[1] http://en.wikipedia.org/wiki/Dual_process_theory

[2] http://en.wikipedia.org/wiki/Piaget's_theory_of_cognitive_de...

[+] johansch|12 years ago|reply
I disagree.

As a counter-example: I am 36 years old and have worked professionally creating software since I was 20. Sure, I tinkered with BASIC programming on my first computer (the fabulous ZX81), but I first really got into programming in the early 90s with Turbo Pascal/Turbo Assembler in MS-DOS. In my world, Microsoft was completely dominant at this time.

My own little pet theory on why younger people don't seem to do as much programming:

Being online, with such easy access to so much information, entertainment, and communication, has generally made people more impatient and less willing to commit to the kind of single-minded focus that is needed to get into, and really develop, programming knowledge.

[+] dlitz|12 years ago|reply
I agree.

Since the era of Microsoft dominance, we've done a fairly good job of selling appliances and single-purpose software & services, but I think we've failed to sell general-purpose computing to the public. We've barely even tried.

It's a shame. Not only are most people missing out on the amazing power that general-purpose computing brings, but we're now in a situation where people are willing to cede that power to copyright maximalists.

Arguably, "the coming war on general-purpose computing" wouldn't even be a possibility right now if the industry (led by Microsoft) hadn't raised such high barriers between "developers" and "users".

[+] Someone|12 years ago|reply
"We don't know what the alternative might have been"

I can make a guess: if Microsoft had refused to give people what they wanted, somebody else would have. After all, Windows more or less started out as a "copy the Macintosh, but keep backwards compatibility" exercise. IBM could build that, too (and did, with OS/2, which was even more closed than Windows), and chances are that Digital Research [edit: make that Novell; they bought DR in 1991] could have, too. What could Microsoft have done about that? Stubbornly selling something that users do not want would, IMO, just have led to a different winner.

I also fail to see why you would ever expect the average computer user today to be as knowledgeable about computers as the average user of 20 years ago. The demographics of 'computer user' have changed enormously over that time period.

Similar things have happened everywhere. We have people washing clothes who could not build a washing machine, people doing a bank teller's job with the help of an ATM who know nothing of double entry bookkeeping, people driving cars who don't even know what engine knocking is, people heating their home who don't even know how to make a fire _with_ a match, etc.

And yes, I think people who don't really know how their system boots or how a file system works miss out on a lot, but then, I don't really understand the underlying physics either.

[+] paganel|12 years ago|reply
> I think Windows taking over the world from 1995 to 2008 has destroyed an absurd amount of hacker interest in computing devices.

I'm 32, so not really falling into your description when it comes to age, but if it weren't for Linux I wouldn't be here today writing code for a living. I was around your age and I was almost illiterate when it came to programming (but very good at maths and all), until I installed a Mandrake Linux distribution on my PC and then discovered Python.

Like you said, even having access to a decent terminal can make a huge difference: on the one hand, just typing "python" and having its REPL; on the other, struggling to set and export some crazy PATHs in DOS (and to this day I'm sure I discovered copy/paste in the DOS terminal by accident).
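That low barrier is easy to underrate. As a small sketch, here is the kind of first-session exploration the REPL invites, written as a script; typed interactively, each line gives immediate feedback with no build step and no PATH gymnastics:

```python
# Immediate feedback on every expression: the hook that pulls
# a curious beginner from "user" toward "programmer".
numbers = [3, 1, 4, 1, 5, 9, 2, 6]

print(sorted(numbers))               # built-ins work out of the box
print(sum(numbers) / len(numbers))   # 3.875

# Introspection is one call away: list the methods a type offers.
print([name for name in dir(list) if not name.startswith("_")])
```

Nothing here needs to be installed, configured, or compiled, which is exactly the contrast with the DOS experience described above.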

It's good that Bill Gates does the things he does now and I understand why people would look up to him, but we should also think of the huge opportunity costs that Microsoft's policies imposed on the rest of the world.

[+] jiggy2011|12 years ago|reply
Microsoft shipped development tools with its OS for as long as I can remember using them - typically tools with very low barriers to entry, too; they just worked out of the box and usually came with sample code.

DOS shipped with QBasic, Windows 98+ shipped with VBScript. Win 98 even included a small web server with ASP support and databases via Access.

Not to mention Visual Basic, which, yes, was technically not free, but I don't remember it having meaningful copy protection, and pirated copies were handed around in geek circles the way Photoshop is today.

There were also a bunch of third party drag and drop game creation tools available, many of which were scriptable in some capacity.

[+] cheesylard|12 years ago|reply
I disagree completely. I fit the demographic that you say is "computer illiterate" (I'm 19), and I have been interested in programming since I was 12. Microsoft wasn't the only influence during that time; Valve was too.

I learned most of my programming through the wonderful tools that Valve gave its users for the Source engine. If it weren't for Valve, I probably wouldn't be interested in programming today. My family has no computer background at all. I learned this all on my own. My dad didn't even know what Windows was until I explained it to him.

The point I'm trying to make here is that sure, Microsoft sucks, but you can't blame a single corporation for a trend this massive. There are other platforms. There WAS a good terminal: the in-game Counter-Strike: Source console. Valve's Hammer editor is and was incredible for level editing. And have you seen Garry's Mod's modding infrastructure? It's all open source (using Lua), and the community is super supportive and helpful. You're just looking in the wrong places.

[+] Arun2009|12 years ago|reply
I think what we need today are competent generalists - super-teachers, if you will - who have deep (i.e., at least post-graduate level) proficiency in what are currently considered separate fields such as physics, philosophy, mathematics, or biology. There's a goldmine of new breakthroughs waiting to happen at the intersection of disparate areas of knowledge. Generalist teachers/professors at the first/second-year university level can help the next generation become more adept at recognizing the connections between fields of study. Unfortunately, you are currently not recognized at the higher rungs of academia unless you super-specialize and churn out papers. We really need to acknowledge generalization as well, even when it comes at the cost of original contributions.

[+] helloTree|12 years ago|reply
This.

This is also an argument against doing a PhD, where you are expected to focus on a very narrow field. The problem is that we want to measure academic progress by numbers, in the form of papers, so scientists need to publish a lot or they run out of funding. But with this constant pressure to publish, there is less time to step back and see the big picture.

[+] zdw|12 years ago|reply
> "A skeptic might say that’s like robbing from the not-so-rich to give to the poor."

More like "indoctrinating 90% of the population in one sphere to demand the substandard, then profiting massively, then flailing around wildly with charitable work to try to cover your shame."

[+] calibraxis|12 years ago|reply
Personally, I just see that we're in a world where super-elites decide when and how to allocate resources. Many don't have a right to life, except to the extent that it fits in a framework acceptable to elites.

For instance, with healthcare, Gates explicitly supports intellectual property regimes, like medical patents. This is in keeping with his decades-long ideologies, and vital to how he went beyond his wealthy parents. (http://newint.org/features/2012/04/01/bill-gates-charitable-...)

Or with education, helping push privatized schools. (http://en.wikipedia.org/wiki/Bill_%26_Melinda_Gates_Foundati...)

I'm not really discussing Gates himself. (Despite him putting his name in the forefront of his actions and press releases.) I imagine his charity is far better than how the Koch brothers allocate their wealth. Rather, these points apply to depending on the benevolence of elites in general.

[+] aristidb|12 years ago|reply
Kind of funny that Bill Gates would have reversed his position on patents. (But maybe Gigaom is misquoting him there; maybe he is referring to "intellectual property" in general.)