arakyd's comments

arakyd | 16 years ago | on: Lisp is sin

Yeah, he says that Lisp is too hard for the common man, and then sort of implies that this is because it doesn't use ALGOL-derived syntax. I don't think that's the issue. It's not Mort who's fallen in love with ALGOL syntax, it's the CS-graduate Elvises who cling to their expensively acquired "skills" (priesthood memberships) with religious fervor (he says, as a CS student).

I program in C#, and lambdas and delegates are nice (I can't use expression trees because the company I work for standardizes on .NET 2.0). But the system is too gnarly, still verbose as fuck, and nowhere near as nice as Lisp (as Krishnan freely admits). This just increases the need for sophisticated developer tools and makes things more obscurantist, not less. C# is just another variation on Java, i.e. a way to drag Elvis types 10% closer to Lisp without depriving them of their oh so precious ALGOL syntax.

Fuck the priesthood, seriously.

arakyd | 16 years ago | on: Being poor at modeling is essential to OOP (comment on LtU)

The thing about OOP is that it is described as a way to make programming easier because objects are so "intuitive," and the examples for this are all toy simulations with naive object "models." What the post describes, and what everyone who actually does a significant amount of OOP programming eventually figures out, is that the only things OOP makes intuitive are stupid approaches. It's possible to program well in OOP languages, but there's nothing intuitive about doing anything non-trivial well, in any language.

I blame Alan Kay for inventing a new paradigm and then not explaining how it was supposed to be used (it's like Lisp meets biology!), thus allowing it to be taken over by others who filled the vacuum with a multitude of their own (usually half-baked) ideas, bastardized implementations, and "software architect" positions.

arakyd | 16 years ago | on: Hacking a Google Interview - MIT's guide to Google interviews

My favorite book of this type is Thinking Forth, available online at http://thinking-forth.sourceforge.net/. You will get a large dose of the Forth mindset, but it's mostly higher-level design tips and not as specific to Forth as, for example, How to Design Programs (read: Functional Algorithms in Scheme). It's a fun read, and a great way to get a perspective on design that is different from what you get from the usual object-oriented or functional language centric sources.

arakyd | 16 years ago | on: Chuck Moore (Inventor of the language FORTH)'s blog

Is that Charley the Charley Shattuck mentioned in the 7/21 entry at http://www.ultratechnology.com/blog.htm? Looks like he's still working with Chuck. Lucky guy.

The best thing about Forth is that it's the distilled result of decades of iteration on a complete programming environment by a great programmer. I don't know of any other examples of this. The worst thing about Forth is that it will probably die with Chuck and be mostly lost.

arakyd | 16 years ago | on: For Today’s Graduate, Just One Word: Statistics

Second the idea that a lot of statisticians don't know what they're doing. The first mistake is to make things easier by glossing over, or mangling, the mathematical details. Many statistics textbooks do this, and the only real protection is to have a good math background. The second mistake is to treat statistics like a bunch of techniques to be learned and applied with little regard for the philosophical problems inherent in every attempt to model the real world.

The Cartoon Guide to Statistics is an excellent way to go from zero to a good overview of the basics with a minimum of hard math. After that, if you're mostly interested in applying basic techniques to your own stuff, you want a good undergrad textbook. I don't have any good recommendations here, unfortunately. If you have a good math background (or are motivated to get it) and you want to keep going, Statistical Models: Theory and Practice by David A. Freedman (http://www.amazon.com/review/R2XUNM92KYU7BB) has the math, the philosophy, the hands-on analysis of studies, and the exercises to put you in a better position to evaluate statistical research than some people who produce it.

arakyd | 16 years ago | on: Just give me a simple CPU and some I/O ports

It's not a fundamental difference, it's a consequence of the fact that hardware people are at the bottom of a very big stack and have a massive financial incentive to be as solid and predictable as possible. Higher up the stack everyone prefers to use relatively cheap programmers and build stuff quickly.

The problem is not having to deal with software APIs, the problem is the sheer size of the stack and the sheer number of accumulated assumptions that are built into it. Moving more pieces into hardware might improve the stack's overall integrity and reduce bugs, but it won't do much to reduce the size.

The real issue, IMHO, is that no one wants to admit that the general reuse problem is hideously, horrifyingly difficult. The biggest problems it causes are diffuse and long term, and in the short term everyone can do things faster by hacking together their old code with someone else's 95% solution library, so that's what everyone does. Putting enough thought into each new application to really do it right tends to be hard to justify on a business level, and most programmers have neither the inclination nor the skill to do it anyway. It's so ingrained that even people who are frustrated with the way things are think that a different operating system or language would solve the problem. It wouldn't - it would only start the process again, with at best a percentage reduction in stack size and corresponding percentage increase in time to frustration. I think it boils down to the fact that code reuse is basically a 2^n problem, and the bigger and more opaque the stack gets the harder it is to cheat that 2^n.
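A toy sketch of the 2^n claim (the framing and component names are mine, not the comment's): if a stack contains n reusable components and any subset of them might be combined in a given deployment, the space of configurations you'd have to reason about doubles with every layer added.

```python
# Illustrative only: treat each stack component as optionally present.
# Every subset is a configuration someone might actually ship, so the
# number of combinations to reason about is 2^n -- adding one more
# component doubles it, which is the "can't cheat that 2^n" point.
from itertools import combinations

components = ["libc", "runtime", "framework", "orm", "http_lib"]  # hypothetical names

def configurations(parts):
    """Enumerate every subset of components as a possible configuration."""
    return [set(c) for r in range(len(parts) + 1)
                   for c in combinations(parts, r)]

n = len(components)
print(f"{n} components -> {len(configurations(components))} configurations")  # 5 -> 32
```

A percentage reduction in stack size (dropping one component) only halves the space; the exponential shape of the problem is untouched, which is the comment's point about new operating systems and languages merely restarting the clock.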

The only potential solution I've seen is what Chuck Moore is doing with Forth chips. He's now at the point where he can design and manufacture chips that are cheap and simple in design but very good at running Forth very quickly. Of course the tradeoffs are (perhaps necessarily) as horrifying as the reuse problem itself: it demands a lot more from programmers in general, and in particular requires them to learn a radically different way of doing things than they are used to, while at the same time strongly discouraging reuse at the binary and source levels. In other words, he's spent decades designing a small core language and hardware to run it, and that's really all you should be reusing (along with data transfer protocols). Needless to say, no desktop or web or server programmer (or said programmer's boss or company) is ever going to go for this unless problems with reuse become far worse than they are now. (Even then the supply of cheap programmers might grow fast enough to keep masking the problem for a long time.) Most programmers are not very good, managers like it that way, and most of the smarter programmers are nibbling around the edges or looking for a silver bullet.

In short, there are no easy solutions. If you don't like the direction software is going, think about becoming an embedded systems programmer.

arakyd | 16 years ago | on: Confessions of a Math Idiot

Yeah, I tend to say "computer science" when I mean "theoretical computer science." To me there is a clear split between the stuff that's math/logic and the rest of it which I'm happy to lump under the catch-all of engineering (and don't feel comfortable labeling as any sort of science).

arakyd | 16 years ago | on: Confessions of a Math Idiot

It's not "different strokes," it's a change of perspective causing massive confusion all 'round. Computer science isn't a knock-off approximation of mathematics, it's a subfield of mathematics. Programming, which is what the post is talking about, is a third thing. The derivative is the inverse of the indefinite integral (and has the same arity), but the function he refers to computes a definite integral, which is not the same thing. Etc.
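The operator-versus-number distinction can be spelled out explicitly (my notation, making the comment's point concrete):

```latex
% Differentiation and the indefinite integral are inverse operators of
% the same arity: both map a function to a function.
\frac{d}{dx}\left[\int f(x)\,dx\right] = f(x)
% A definite integral is a different kind of object: fixing the limits
% a and b maps the function f to a single number.
\int_a^b f(x)\,dx \in \mathbb{R}
% The bridge between the two is the fundamental theorem of calculus:
\int_a^b f(x)\,dx = F(b) - F(a), \quad \text{where } F' = f.
```

So a routine that computes a definite integral is not "the inverse of differentiation"; it composes the antiderivative with an evaluation at two points, which is why the post's comparison confuses the two.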

Confusion about this stuff is a sign that you didn't really understand it the first time. Believe me, I've been (am still) there...

arakyd | 16 years ago | on: What You Learned About How Planes Fly Was Probably Wrong

It's true that the usual explanation with the Bernoulli effect is wrong, but the idea that the Coanda effect is important for normal flight is also wrong. The relationship between wing camber, angle of attack, and lift is a little more complicated than the post, or the presentation it references, implies. See http://av8n.com/how/htm/airfoils.html#sec-other-fallacies for more details.

A few planes have been built with the Coanda effect in mind (http://en.wikipedia.org/wiki/Boeing_YC-14), but in general it isn't very important.

arakyd | 16 years ago | on: Nobody Hates Software More Than Software Developers

I don't understand your analogy. Is the Wizard of Oz repulsed because he thinks the illusion is bad, or because he knows how it's made? Would "This is the worst hunk-a-junk implementation of an illusion I've ever seen?" be an accurate interpretation of what he means?

Jeff's post and your comment both confuse me on this point - I can't figure out when either of you are talking about quality of implementation (when I read "code" I think implementation) and when you're talking about quality of user experience. I'm not even sure if you and Jeff are saying the same thing. Jeff spends most of his post talking about bad user experiences caused by programs written by people who are probably not experts, but then he ends by saying that all programmers should hate their own code. I, and most people on HN apparently, interpret this to mean that we should all hate the quality of our implementations. Not only is this weird on its face (surely many people who read Jeff's blog, let alone HN, have seen worse code than what they usually write themselves), but it doesn't seem to follow from the rest of the post. My reading of your comment leads me to believe that you dislike all code/implementations on general principle, but I can't tell if you even refer to the user experience at all. The second to last sentence seems to, but I can't figure out how the rest of the comment relates to it because the rest of the comment seems to be purely about implementation (except for the reference to the Wizard of Oz, which is completely ambiguous in that regard as far as I can tell).

My experience seems to be different from yours. I might have thought of software as magic when I was exposed to computers for the very first time, but as soon as I learned a little bit about how they worked none of it was really "magic" any more. A good user experience might be smooth, or quick, or intuitive, or even delightful, but I don't see it as magic. Occasionally I see something and I don't know how it was done, and in that case I want to know. I want to look under the covers, just like I wanted to know how computers worked. The connotations of "magic" hardly arise, because I immediately move past the "I can't believe that works!" stage to the "how does it work?" stage. I assume (or in the case of computers, know) there is machinery under the covers even before I see it, and I want to see it. There is nothing necessarily repulsive about the machinery to me, and in fact the machinery has its own beauty that is largely independent of the user experience. My ideal piece of software delivers a fantastic user experience with beautiful machinery, but I understand that beauty in one part does not necessarily imply beauty in the other. The user experience is generally more important than the quality of the code, but there is no law that says code has to be repulsive. We seem to disagree about that.

I would not say that my own code was the worst I've ever seen, but I am always dissatisfied with it and with the user experience it delivers because I always think it could be better. I completely agree with Jeff's Parnas quote, and I also agree that Jeff is probably an incompetent programmer in the judgement of Parnas and others like him. So maybe I agree with Jeff's intended point. On the other hand, I think Jeff has delivered some pretty good user experiences, and I think many programmers who really don't produce great code and/or create a net increase in programmer demand can and do deliver decent, or even fantastic, user experiences. (This is probably one of the reasons why we have so many incompetent programmers and don't live in Parnas's ideal world. Relatively incompetent programmers can still deliver good user experiences, so in the short term it's expedient to deliver software that way. In the long term, we have a shit ton of deeply stacked shit and all the really good programmers are spending their time making shit shoveling tools instead of advancing the state of the art, but hey. Just another tradeoff.)

Speaking for myself, I know I'm an incompetent programmer who would not have a job in my own ideal world, but my goal is to not be that guy, and I would prefer not to hire that guy, or the guy who has never seen worse code than his own, doesn't make a clear distinction between his own level of skill and the level of skill displayed by crapware developers, does not aspire to write anything terribly ambitious from a technical standpoint, and in fact has nightmares about it. I enjoy programming, and I am more suspicious of people who say they hate software than I am of people who say that their own code is not the worst they've ever seen.

arakyd | 16 years ago | on: Swoopo: The crack cocaine of auction sites

All religious traditions take guidance from more than just holy books (many religions do not even have holy books), and the idea (or assumption) that laypeople can determine religious truth by reading the holy book themselves is pretty much restricted to Protestant Christianity.

The fact that Swoopo promotes itself as an auction/shopping site and lists "strategy" as part of the experience skates uncomfortably close to the "no fraud" clause listed in that link, although offering discounts to losing bidders would seem to give them a solid "it's only gambling if you choose to treat it that way" out.

arakyd | 16 years ago | on: Swoopo: The crack cocaine of auction sites

What about if nearly everyone is worse off? Putting the burden of responsibility on the actor is fine, but Swoopo is not in the same category as Starbucks and EA, and claiming that everyone participating in a capitalist economy does so at the expense of others is pretty disingenuous.

arakyd | 16 years ago | on: Brainfuck compiler in brainfuck

Yeah but that doesn't explain why one would write a brainfuck interpreter in brainfuck.

Day 163: Project complete. Strength nearly gone. Must.. complete... evaluation.... Data structures, godawful... string handling, non-existent... writing a brainfuck interpreter, gigantic pain in the ass.... Conclusion: this better get me laid. Oh sh.. signal lost

arakyd | 16 years ago | on: Brainfuck compiler in brainfuck

Considerably less fun to toy around with now that the only way to get a non completely crippled version is to pay megabucks (or work at a company that develops in it). If I'm wrong, please please clue me in...