Kirby's comments

Kirby | 16 years ago | on: Paul Graham On Two Kinds of Programmers and Painters

Of course, in reality, most people aren't at either of these two extremes, great implementer and great innovator. Everyone has both traits in different amounts, and on a functioning team they can be distributed unevenly.

In this way of looking at things, I'll freely admit to being more of an implementer. If someone asks for what they want, I'll do my best to make the software do it in a way that's efficient and delights the user.

Given vague instructions, I'll do my best, but with mixed results.

But when I team up with someone who always has new ideas, I can actually sort through them, figure out which ones will work and which will meet the goals, and synthesize them into a great product. Most people can't do that with their own ideas.

I think the original quote - and it's short, not a full essay, so this criticism is mildly unfair - elevates the innovator too much over the implementer. In reality, you win when you have both, each knowing their own strengths and limitations and grateful for the other. I produce better work when I pair up with an innovator, and so does he or she. And the two of us will dominate a dual-innovator team.

Also, realize that implicitly in everything Paul Graham says, you can add the words "For a startup technology company." That's what he knows; that's what he values. There's a lot of work out there that needs implementers - work that innovators would be frustrated, unsuccessful, and miserable at. Don't feel threatened if you're not Paul Graham's Ideal Entrepreneur/Programmer. I'm not. I'll most likely never be extremely rich, but I'm happy, good at my job, and valued by my company. I say this because opinions like this caused me a _lot_ of self-doubt in my early twenties, and they turned out not to be the accurate predictor of DOOM that I feared. If you're smart and willing to do a good day's work, success is out there - maybe not at a company run by someone like Graham, but then, he wouldn't find success at a company suited to you either.

Kirby | 16 years ago | on: Why I was tempted to discriminate against women

I'm more tempted to think of people as individuals who make their own life choices, rather than think of all women as 'pregnancy risks'. Especially in technology - a lot of geek women don't have or want kids.

And as an employee, I want at least the illusion that the company cares about us having fulfilling, valuable lives, rather than as producers in a harsh economic calculus. The same way they want me to think of them as something more than a paycheck. This kind of thinking is the fast track to having mercenary workers who constantly jump ship to other companies.

So glad I don't work for this guy.

Kirby | 16 years ago | on: Xkcd - story of our lives

I think there's a reason to post this one in particular - to set up a comments thread on HN about it.

I kind of thought it had harsher implications for Academia - for every self-congratulatory paper of something new, it's already been done a hundred times in the "Real World" in an unheralded source control repository.

Kirby | 16 years ago | on: Are Google employees being discouraged from using Python for new projects?

Also, keep in mind that you have to get _very_ big before this becomes an issue. If you're at Yahoo, Google, MSN - yes, language issues can become a performance design consideration.

If you're at a merely big site, like Ticketmaster, IMDb, or LiveJournal, good software design will let you handle a lot of load with reasonable responsiveness. (All three of those sites are written in Perl, in fact. I've worked for one of them.)

If your page views per day on your project aren't peaking in the billions, you're probably better off optimizing for the language that your team is most competent in.
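To put "peaking in the billions" in perspective, here's a back-of-envelope calculation (my arithmetic, not from the comment) of what page views per day means in requests per second:

```python
# Rough sanity check: convert page views per day into average requests
# per second, to see where per-request language overhead starts to matter.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def avg_requests_per_second(page_views_per_day: float) -> float:
    """Average request rate implied by a daily page-view count."""
    return page_views_per_day / SECONDS_PER_DAY

# One billion views/day averages out to roughly 11,600 requests/second --
# and peak traffic is typically several times the average. A site doing a
# few million views/day is averaging well under 100/second.
print(round(avg_requests_per_second(1_000_000_000)))  # ≈ 11574
```

At four or five digits of sustained requests per second, shaving milliseconds per request pays for itself; below that, team competence dominates, which is the comment's point.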

Kirby | 16 years ago | on: Python language moratorium is accepted

I kind of am, but I think in reality perl and python appeal to a different enough mindset, and solve similar enough problems, that there's almost no real competition between them. Most people will instantly like one and loathe the other based on their opinions on compiler-mandated style. (It's not just that, but it's an immediate thing you encounter and exemplifies the philosophy through the whole language - the One True Way to do things, uniformly and consistently across programmers, or There's More Than One Way To Do It, the language accepting as many alternate ways as possible and adapting to the programmer.)

There exist performance benchmarks, but I tend to think the two languages continue to trend close enough together that performance will never be a more compelling argument than the underlying design principle for people choosing a primary language to work in. The same goes for feature set - comparing modern releases, there's a vanishingly small set of real differences, especially if you're willing to consider the best-practice modules freely available. (I.e., yes, we're sorry about Perl's built-in object model, but really, we fixed that in libraries years ago. Seriously, guys. Years.)

We can band together and agree that we hate Java more than each other, though.

Kirby | 16 years ago | on: Why kids don't program

In reality, I'd wager there are a _lot lot_ more kids programming today than 30 years ago. How many people had a computing device at home back then?

This anecdote makes the point that a very specific kind of thinking was better served by the tools of the late 70s, and maybe that's so - I've never been much inclined towards it, so I dunno.

And what we consider 'programming' is blurrier than it used to be. Is HTML programming? What if you use a fancy editor like Dreamweaver? Only when it becomes dynamic? Does CSS count? A lot of youngsters, when they aren't getting off my lawn, have done web stuff, and the fact that it's a gradual shift from being a user to being a programmer probably makes it even easier.

I find the whole premise of this blog post flawed. (Great Scott! Someone disagrees with someone else on the Internet!)

Kirby | 16 years ago | on: What every developer should know about time

I've had more traumatic code errors due to times than I care to admit.

The bottom line: Learn from Unix. Store everything in epoch time (even if you're on Windows). Do all your math in that. Local time is for display _only_. You won't ever regret it. (Well, hardly ever.)
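A minimal sketch of that rule in Python (the function names are mine, for illustration): storage and arithmetic stay in Unix epoch seconds, and conversion to a local zone happens only at the last moment, for display.

```python
# "Store epoch, compute in epoch, display local" -- a sketch of the rule.
from datetime import datetime, timezone

def now_epoch() -> int:
    """Current time as Unix epoch seconds -- this is what gets stored."""
    return int(datetime.now(timezone.utc).timestamp())

def add_hours(epoch: int, hours: int) -> int:
    """Date math stays in epoch seconds: plain integer arithmetic,
    immune to DST transitions and time-zone differences."""
    return epoch + hours * 3600

def for_display(epoch: int) -> str:
    """Only at render time do we convert to the viewer's local zone."""
    return datetime.fromtimestamp(epoch).strftime("%Y-%m-%d %H:%M:%S")

# Two days after some stored timestamp -- computed without ever
# touching local time until the final display step.
deadline = add_hours(now_epoch(), 48)
print(for_display(deadline))
```

The "hardly ever" caveat is real: future-dated human events (e.g. "9am next March") can shift under you if a zone's rules change, but for timestamps of things that have happened, epoch wins.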

Kirby | 16 years ago | on: Ask HN: What was the worst bug you've ever solved ?

For a different definition of worst:

I started a job recently at an ecommerce company. There was a long-standing bug with the cart display in the upper right of the page always saying that the cart was empty. People would report it all the time, and the quite smart lead programmer said it was something really complicated that he hadn't had time to investigate.

But eventually, after I sort of knew my way around the code, and when I finished up all the tasks on my to-do list, he handed me that as a why-not-investigate sort of thing. He didn't really have any idea, just that the previous long-gone coder had said it was complicated and in the depths of the way the front end code interacted with the order system.

So, I reproduce it, and look at the template code. These two lines, right next to each other:

[% cart_summ = ourdb.cart_summary %]
[% IF cart_cumm.qty > 0 %]

Note that the two variables don't match. And this was broken on the site for _FOUR YEARS_. And nobody looked because someone said he had and it was hard, and nobody had time for a hard problem.

facepalm

Kirby | 16 years ago | on: Yahoo’s $3.65 Billion Mistake: Yahoo Closes GeoCities

Perhaps, but the perception of learning HTML being a barrier to entry was much smaller back then. People were excited to be part of something new, and didn't really balk at having to learn something. Totally different mentality today.

Kirby | 16 years ago | on: Ask HN: Seeking advice from programmers for non-programmers

The typical view of a business type held by a software engineer is not pretty. It gets pretty close to: we do all the work, they do little but interfere with us, and then they take all the money. So it's an uphill battle to have a good relationship. (The _truth_ of this view varies wildly from company to company, but the perception is always there.)

See the movie "Office Space" for details. :)

Things that are a good idea:

* Your technical staff is probably very smart, and not just about their jobs. Listen to them. They're particularly good at working with data. If they say that your market research doesn't sound right, even though that's your job and not theirs, it's worth going through it. Geeks are generally happy to learn things, too, so if you're actually right and can show it, we really like that too.

* Give your geeks the big picture. It really does help, because we're constantly making long-term tradeoffs, and if we know where the company is going, we'll be less likely to need to say, "Uh, we need to do a massive project for that feature, sorry!"

* Try to give clear requirements and make changes early. It's not an easy thing to ask, but a change made in planning phases is cheap. A change made when the project is in beta is expensive if not impossible. A business type giving a list of changes every day late in the project is infuriating.

* As a corollary, ask to see things early. A prototype with partial functionality is good. It's really true that a lot of times a business type just can't reasonably know what they want until they have something to play with (which is fine), so get that sooner, when things can be cheaply redesigned.

* Understand that changes aren't free. Schedules get made based on original requirements. If the requirements change, it will take longer to finish. If the deadline is fixed, you can't ask for something new without giving something up. Business types that understand this are much, much easier to work with than ones that don't, and are by far the exception.

* Programming requires uninterrupted time, like making a painting or writing an essay. If someone asks you a five-minute question, it can take half an hour or more to get back into the flow of what you were doing. Try to send non-urgent questions via email, schedule meetings for the beginning or end of the day, and if you can take "Can I get back to you in 15 minutes?" for an answer, that'd help.

* Understand that these guys can, and do, do the math with respect to compensation. When I had a CEO that asked everyone to put in 60 hour weeks, and then we ended up with 3 figure bonuses when he got a 7 figure bonus at the end of the project - none of us ever respected him. If we're giving more than expected, and we will, show us some of the reward, or watch us leave. Even in a down economy, a competent techie can find greener pastures if it turns into us vs. them.

* Techies are not big on hierarchy and authority. After all, especially in a technology company, we know more about what we do than you do, and if that's the core product, there's often conflict between who is in charge and who knows the most about the product that actually pays the bills. We like to think that we work _with_ the business types, not _for_ the business types. If you present yourselves as doing useful work that we don't want to do (like finding buyers for our product, getting us good press, finding investors - stuff we want to happen but don't want to do ourselves), it'll go a lot better than if you present yourselves as 'leaders' with 'vision' and expect us to do your bidding. It's a team, and while we get that the buck has to stop somewhere and it's going to be the person with the title in the end, prima-donna VPs are just as well liked by us as prima-donna developers are by you.

Someone could write an equivalent set of bullets in the other direction - yes, business people are smart, their jobs are hard, and they're often worth the salary they earn too (and certainly a lot of techies are useless.) But I'm probably not that someone. :) Hope that this is somewhat useful to you, though! I certainly have found that some business people treated technical people well and were good to work with, and others, not so much.

Kirby | 16 years ago | on: Two common mistakes when using databases

For beginners, this is definitely good advice. Particularly the first point - if you're using a relational database, and don't structure your data around its strengths, you'll take a profound performance hit. And sure, don't pass around data structures if you don't know what you're doing, and MySQL is a pretty crummy place to put them.

However, once you're doing real work, sometimes translating to XML and back is extraordinarily expensive.

The best approach is a hybrid. Use the database to store things relationally where possible, and always through defined APIs. If your translation stage is expensive, use something like memcached to make sure you do that translation as infrequently as possible. _This_ is the layer where it's appropriate to store serialized data structures. It's not permanent storage; you can blow it away when you change the internal structures, and nobody external relies on it. And in most programs you end up with even more speed benefit than if you'd stored serialized data in the database to begin with - the initial build can be expensive, but after that you're reading basically from memory. (Not all data is well suited to this approach, but if your data is read frequently and written infrequently, there are few things you can do that increase performance more.)
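The read-through pattern described above can be sketched like this. Everything here is hypothetical - the DB query, the summary structure, and `CacheStub`, which stands in for a real memcached client (same `get`/`set`-with-TTL shape):

```python
# Hybrid storage sketch: relational DB as the source of truth, with the
# expensive row->structure translation cached in a memcached-style layer.
import json
import time

class CacheStub:
    """Minimal in-process stand-in for a memcached client."""
    def __init__(self):
        self._store = {}

    def get(self, key):
        value, expires = self._store.get(key, (None, 0.0))
        return value if expires > time.time() else None

    def set(self, key, value, ttl=300):
        self._store[key] = (value, time.time() + ttl)

cache = CacheStub()

def fetch_rows_from_db(user_id):
    """Pretend relational query -- the expensive joins would live here."""
    return [{"item": "widget", "qty": 2}, {"item": "gadget", "qty": 1}]

def cart_summary(user_id):
    """Read-through cache: serialize the *built* structure, not raw rows."""
    key = f"cart_summary:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)        # cheap path: basically a memory read
    rows = fetch_rows_from_db(user_id)   # expensive path: DB hit + translation
    summary = {"total_qty": sum(r["qty"] for r in rows)}
    cache.set(key, json.dumps(summary), ttl=300)
    return summary
```

The key property from the comment: because the cache is disposable, changing the internal structure just means flushing it, with no schema migration and no external consumer to break.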

Kirby | 16 years ago | on: Why Pair Programming Is Not For the Masses

I, for one, get higher-quality work done when I'm allowed to take breaks. This shop might be heaven for workaholics, but there are a lot of really excellent folks who just wouldn't enjoy their jobs with someone in the room constantly pushing them to keep working. It's great that their culture works for some, but if this were the norm, I'd quit my job and go back to school in something that didn't involve computers, very quickly.

Or go into management. Do they practice Pair Leadership? Didn't think so.

Kirby | 16 years ago | on: In a recession, is college worth it? Fear of debt changes plans

Well, in a very real way, delaying your entry to the job market by four years right now is a very safe bet. It's brutal out there for people without in-demand skills.

But what this article dances around without quite realizing is that college no longer confers in-demand skills a priori. A journalism major who made six figures as a recruiter? That's a combination of personality and luck, with very little element of 'college training'.

If you want to consider college an investment (which is not crazy), you can't just think of it as a checkbox. College, done, now give me a job that pays well! Not realistic. You need an actual skill that people want to pay for. Science and engineering degrees always work well for this. Practical business skills (like accounting), medicine and law, these are the things that pay off.

You know, the ones with _hard_ classes. :)

I don't want to dump on the humanities - if you love them, do it. If you're one of the best in your field (and if you're smart and passionate, you should end up there), you'll do okay. Some will do even better than that.

But if you aren't the kind of person whose eyes light up when someone else at a party wants to talk about French poets, don't study French poetry and then complain about the job market. Passion is fantastic, and following it is great - but most people aged 18-22 don't have one. And if you don't, it's idiotic to take classes in a field with few jobs, where you'll get trounced by those who honestly care about the material above and beyond the grade.

If you aren't following your bliss (and that's okay), go into a hard field that you don't absolutely hate. You might not love your job until you figure things out, but at least you won't be miserable _and_ broke. But for the love of god, don't major in Philosophy and then complain that you didn't get your money's worth. (And I loved my philosophy classes in college.)

Kirby | 16 years ago | on: Diet Soda: The Brain Knows Better

I think this nails it. If you're part of a conscious diet and watching calories, diet soda can be a good choice, because you're _explicitly_ controlling for the effect the study is about - the false sweet causing cravings and ultimately more calorie consumption. It might follow, though, that the diet soda makes the rest of the diet more difficult.

If you're just subbing diet soda in for regular and not paying particular attention to your overall diet, though, this study argues that you'll lose overall: the diet soda increases your cravings for sweets and, left unchecked, you'll gain weight. Seems plausible.

(I don't worry too much personally, since I'm part of the population to which artificial sweeteners are not palatable. I'd rather drink water, and I do quite often.)

Kirby | 16 years ago | on: The Two-Hour Rule

I've gone through periods of life where this was true. There were reasons - truly bad boss, problems in my personal life, etc - but eventually I realized that slacking off had started to become the primary cause of my depression.

So I did what I had to do to actually start getting things done at work. (In my case, I did an end-run around the truly awful boss who was going through a nasty divorce and taking it out on me.) And it didn't take long before I stopped hating life and myself. And I left that job with my head held high. (Because once I was useful, why have a truly awful boss?)

I can't say that I go full steam every hour of the day, but I like to go home at the end of every day knowing I accomplished something real and useful. There are days where the answer is no, but they're the exception. And since I started actually working, I've gotten good, my salary has gone up, and I'm much happier. And I just recently got a great job offer in the middle of a recession!

So, believe in the two-hour rule at your own peril. There's a sweet spot between this, and married to your job. (Workaholics really are a drain on morale too, but that's a different post.)

Kirby | 16 years ago | on: Amazon.com is too powerful

This person is not, and has never been, a software developer.

I take issue with: "Such issues have been largely overlooked in the Amazon.com discussion, which takes for granted that "glitches" are inevitable and self-correcting, that the market will police itself. (We've seen how well that worked on Wall Street.)"

In a very large, very complex application like Amazon.com, glitches _are_ inevitable. They're not self correcting - there are hundreds of highly competent, highly intelligent software engineers that correct them. It's not Wall Street, where people have an incentive to game the system and naturally collude to do so. It's doing something profoundly more complicated than most people really have a conception of, and not being able to execute flawlessly all the time.

Amazon fully deserves the heat for the 1984 debacle, and its communication left a lot to be desired in the politically sensitive - but probably actually a bug - issue that delisted large swaths of gay literature (while embarrassingly leaving anti-gay rhetoric among the top search results).

But this author seems to attribute some sort of malice to the entire culture of software engineering because we accept some level of imperfection and bug-fixing in a major project. Expecting flawless execution is just not reasonable, and it does nobody any good. Some companies may not spend enough on QA and testing (and I'm sure more than half of you just thought of the same company), but this stuff is hard, there will be bugs, and we'll try to fix them - and we really don't need some knucklehead from the L.A. Times speculating about a process that's usually too complicated for the managers at Amazon itself to fully have a handle on.

Kirby | 16 years ago | on: Maker's Schedule, Manager's Schedule

Paul Graham can be hit or miss, IMO, but this one is a direct hit. I don't mind meetings per se - in a good company, they have merit - but their effect on the work day can be disastrous. If you don't work somewhere that understands that programmers need large uninterrupted chunks of time, well, you're not at a developer-friendly place, and there are few worse problems. And non-makers often don't have any idea that it's not just the standard 'coders are cranky' going on. (We, as a group, _are_ cranky, mind you.)

Kirby | 16 years ago | on: Poll: Do you own a working television?

People like to claim that TV is only full of crap. This is grade A, Microsoft-FUD-level, Bullshit.

Like anything, sure, most of it is crap. Just like if you went to a bookstore and picked up 10 random fantasy novels - you might conclude that they're all cookie-cutter pablum (most are), but then you'd miss out on George R. R. Martin's A Song of Ice and Fire, Guy Gavriel Kay's Tigana, Le Guin's Earthsea, and other truly excellent works of fiction.

But like anything, it takes some investment to tell the good from the bad. If you just watch Jon & Kate Plus 8, Entertainment Tonight, and According to Jim, yeah, you're better off without the TV altogether. But I claim that there's more good TV being made now than at any time in history, particularly for niche markets.

Here's some of my recommendations: Mythbusters, Penn & Teller's Bullshit, Supernatural, Dollhouse, The Office, 30 Rock, Lost, The Simpsons (not as great as it was, but still fun), The Daily Show/Colbert Report, Mad Men... and more, there's just a ton of really high quality work being done.

If your argument is that instead, you watch things online/through netflix, that makes sense. If I didn't enjoy live sports sometimes, I'd consider doing the same.

Not everyone has to have a hobby of watching television, or movies, or reading for that matter. But the opinion that it's all crap is extremely ill-informed and snobbish, and saying it makes you sound actively stupid. I strongly encourage you to drop it. It's an anti-pattern.

Kirby | 16 years ago | on: Why are there no Googles, Microsofts, Twitters, Facebooks etc from UK?

IMDb is, well, a worldwide collaboration, but if pressed I'd say the driving force was primarily British. (With help from America, Italy, Australia, Germany, and a few other geeks of the day.) I suppose obsessive data gathering is consistent with the stereotypical British mindset.

(IMDb was bought by Amazon, an American company, long ago, but a large portion of its employees are still in the UK.)

All things considered, though, my guess is that it has a lot less to do with national character, and a lot more to do with the fact that the Valley is an oddity. It's a great place to find exactly the right mix of people to make a startup work, from potential smart employees to investors to experienced folks who can give key advice. A startup in San Jose has a better chance at making it out alive than one in London. And a fair number of entrepreneurs head to the Valley from around the world for just that reason - the statistics are hard to gather, but if you were to figure out what percentage of startup _founders_ were British, of successful companies, the gap would narrow considerably.

I'm currently at an awesome company in the Washington DC area (thinkgeek.com, a Hacker favorite!) and we're having a real hard time finding competent senior Perl people who are willing to live out here. I'm sure we'd have no problems in Palo Alto. (I lived there for most of the 90s. And would again, happily.)
