
Unix and Flow-Based Programming

49 points | nprincigalli | 10 years ago | groups.google.com

25 comments

[+] SixSigma | 10 years ago
> Then Tanenbaum came out with his book and it didn’t really take off.

> Then Torvalds open-sourced linux and that is what started to gain popularity.

cough BSDi cough

[+] nickpsecurity | 10 years ago
To support your point:

https://web.archive.org/web/19990224090656/http://www.bsdi.c...

Seeing the list, now I'm sure which BSD SCC likely used in their Sidewinder firewall. They modified it to have mandatory access controls to contain breaches of subsystems. This was before SELinux, etc. Never used BSDi myself, though, so I can't say much about it except that its users were getting plenty of mileage out of it, given the reputation of products based on it. At least in appliances.

[+] bitwize | 10 years ago
Well yeah, but that's not really how complex systems are developed anymore. Modern software development uses the object as the modular unit of granularity, not the process -- and APIs, perhaps augmented with message queues, for communication rather than pipes, sockets, and file descriptors. This is because objects with well-defined APIs are much easier to reason about statically, and they can be composed more flexibly given the appropriate fabric: interacting locally or across process, user, or system boundaries.

This model dates back at least as far as Smalltalk but what really caused it to take off was -- wait for it -- Windows COM. So modern development has moved on from the Unix philosophy and embraced the Windows philosophy.

[+] chubot | 10 years ago
Uh, this is crazy wrong. Maybe in the desktop era objects were more important, but processes have come back with a vengeance now that everything is a distributed system.

I'm not sure what you mean about objects being composed across system or process boundaries. Name a successful system where that's true. Who uses DCOM? Distributed objects failed. I occasionally see people trying to bring back this way of thinking (e.g. they want methods instead of RPCs and protobufs), but they have not succeeded, for fundamental reasons.

In reality it's not either-or. You need both ways of thinking. Objects are static (they exist as an agreement between the compiler and programmer, and are usually thrown away at runtime); processes are dynamic. Many programmers that think only in terms of objects learn the hard way that their programs are brittle and inefficient. System operators think in terms of processes, and this way of thinking is essential for resilient systems.

In addition, objects are losing importance in distributed systems because state in a single machine's memory is not very useful. In real distributed systems you need replicated/resilient state with varying consistency guarantees. An object doesn't help you with any of those things.

[+] teddyh | 10 years ago
> I was going to mention Unix “cat” but forgot.

> Wildly simple - it takes at most one argument.

That is asinine. Why would a program be called ‘cat’ if it can’t concatenate multiple files?

Also, his “grash” program is seriously deficient in the handling of signals – it completely ignores the issue. There are many subtle issues with signals which have to be handled correctly when writing a shell: http://www.cons.org/cracauer/sigint.html

[+] throwaway999888 | 10 years ago
> That is asinine. Why would a program be called ‘cat’ if it can’t concatenate multiple files?

Because he only thinks it's the "echo file to terminal" command.

The main use of cat was discussed in the “Program Design in the UNIX Environment” paper.

> The fact that cat will also print on the terminal is a special case. Perhaps surprisingly, in practice it turns out that the special case is the main use of the program. [...] But what about -v? That prints non-printing characters in a visible representation. Making strange characters visible is a genuinely new function, for which no existing program is suitable. [...] The answer is ‘‘No.’’ Such a modification confuses what cat’s job is - concatenating files - with what it happens to do in a common special case - showing a file on the terminal.

http://harmful.cat-v.org/cat-v/unix_prog_design.pdf
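For reference, this is what the disputed -v flag actually does: it maps non-printing bytes into a visible caret notation, e.g.:

```shell
# cat -v renders control characters visibly: byte 0x01 becomes "^A".
printf 'a\001b\n' | cat -v
# prints: a^Ab
```

Pike's objection is not that this is useless, but that it belongs in a separate filter rather than in cat.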

[+] epistasis | 10 years ago
"cat" sounds like the abbreviation of "catenate" not "concatenate". Perhaps its possible that later versions of cat allowed more than one argument?
[+] yarrel | 10 years ago
cat takes as many files as you pass it, yes.
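A quick demonstration of the concatenating behavior (the file names here are made up for the example):

```shell
# Two input files...
printf 'first\n'  > /tmp/a.txt
printf 'second\n' > /tmp/b.txt

# ...concatenated to standard output, in argument order:
cat /tmp/a.txt /tmp/b.txt
# prints: first
#         second

# The classic use: concatenate several files into one.
cat /tmp/a.txt /tmp/b.txt > /tmp/combined.txt
```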