masak | 10 years ago
It's a bit more thrilling to think of it that way, as a matter of scale. When space and time resources are scarce, thinking of the computer as a big piece of RAM that we mutate in-place makes the most sense. As we push upwards and outwards into enough resources and a bigger need for parallelism, it suddenly makes more sense to switch perspectives and reason in terms of binding and substitution.
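The two perspectives can be sketched side by side. A minimal example (names are mine, not from the comment): the same sum written as in-place mutation of one cell versus a fold where the accumulator is rebound at each step rather than overwritten.

```rust
// In-place mutation: the "big piece of RAM" view.
fn sum_mutating(xs: &[i32]) -> i32 {
    let mut acc = 0; // one cell, repeatedly overwritten
    for &x in xs {
        acc += x; // mutate in place
    }
    acc
}

// Binding and substitution: each step binds a fresh value of `acc`;
// nothing observable is ever overwritten.
fn sum_binding(xs: &[i32]) -> i32 {
    xs.iter().fold(0, |acc, &x| acc + x)
}

fn main() {
    let xs = [1, 2, 3, 4];
    assert_eq!(sum_mutating(&xs), sum_binding(&xs));
    println!("both views agree: {}", sum_binding(&xs));
}
```

The second form is what makes substitution-style reasoning (and parallel evaluation) easy: any subexpression can be replaced by its value without worrying about who else is watching the cell.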
marssaxman | 10 years ago
You can see the same pattern with IO models. We have programming languages descended from 50s-era concepts of computing in which the program drives the machine, but that just isn't true at all. A computer is a passive machine, not an active one: it reacts when you poke it with an interrupt, runs until it reaches a steady state, and settles down again. Operating systems go to a great deal of trouble to simulate the kind of top-down flow-control environment our imperative tradition wants to think it is operating in, but to get any real performance out of these systems, we all end up building asynchronous, reactive layers on top of the simulated batch job anyway.
We'd all be better off if we flipped the paradigm, imagining the process of building a program not in terms of writing instructions for the computer to perform, but in terms of creating a structure which will react appropriately in response to whichever events may be brought to its attention.
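That flipped paradigm can be sketched very simply: the "program" is a structure holding state plus a reaction function, and the outside world (interrupts, an OS event queue, a test harness) is what drives it. This is my own illustrative sketch; the `Device` and `Event` names are invented.

```rust
// Events brought to the structure's attention by the outside world.
enum Event {
    ButtonPressed,
    TimerTick,
}

// The "program" is not a script; it is state plus reactions.
struct Device {
    presses: u32,
    ticks: u32,
}

impl Device {
    fn new() -> Self {
        Device { presses: 0, ticks: 0 }
    }

    // One event in, one state transition out, then back to quiescence.
    fn react(&mut self, ev: Event) {
        match ev {
            Event::ButtonPressed => self.presses += 1,
            Event::TimerTick => self.ticks += 1,
        }
    }
}

fn main() {
    let mut dev = Device::new();
    // The event source drives the machine; the structure merely responds.
    for ev in [Event::ButtonPressed, Event::TimerTick, Event::ButtonPressed] {
        dev.react(ev);
    }
    assert_eq!((dev.presses, dev.ticks), (2, 1));
}
```

Note there is no main loop that the application owns: control flow belongs to whoever delivers the events, which is exactly how the hardware already works.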
Instead, we'll probably still be starting young programmers out with for-loops counting from 1 to 10 and printing the result on an imaginary console for decades to come, even though none of that is even remotely relevant to what's actually happening anymore. Once they've crossed that hurdle, they'll promptly begin unlearning all of it in order to start getting real work done.
On an unrelated note, I spent a while last fall/winter playing around with the idea of a very low-level functional language - could we use these techniques in memory-constrained environments too? I didn't find the model I was looking for, but I think it's probably out there, and it would be very interesting to develop a functional/type-safe/immutable language suitable for microcontroller programming, with no garbage collection or implicit allocation. It may be that the Turing model is a better fit for computing in the small, but I suspect that dataflow and functional composition will turn out to be useful at all scales. That way of looking at the world has some profound philosophical strength, after all - "you cannot step twice into the same river" and all that.
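To make the constraint concrete, here is a rough sketch of what "functional style, no GC, no implicit allocation" could look like: pure functions composed over fixed-size, stack-allocated buffers, with the input never mutated. The pipeline stages (`scale`, `clamp`) and the buffer size are my own assumptions, just to show the discipline a microcontroller target would force.

```rust
// Pure, allocation-free pipeline stages.
fn scale(sample: i32) -> i32 {
    sample * 3
}
fn clamp(sample: i32) -> i32 {
    sample.min(100)
}

// out[i] = clamp(scale(input[i])); everything lives on the stack,
// sizes are fixed at compile time, and the input is left untouched -
// "you cannot step twice into the same river".
fn process(input: &[i32; 4]) -> [i32; 4] {
    let mut out = [0; 4]; // stack-allocated result buffer
    for (o, &s) in out.iter_mut().zip(input.iter()) {
        *o = clamp(scale(s));
    }
    out
}

fn main() {
    let readings = [10, 20, 30, 40];
    let processed = process(&readings);
    assert_eq!(processed, [30, 60, 90, 100]);
    assert_eq!(readings, [10, 20, 30, 40]); // original is unchanged
}
```

The interesting open question the comment raises is whether a language could make this the default - composition and immutability at the surface, with the compiler guaranteeing that no heap or collector ever appears underneath.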