Thanks especially for the "Not All Sunshine and Rainbows" section. It is all too easy to write about the positive things and leave the negative parts out.
Resource leaks in Haskell are perhaps a bit trickier to track down than in other languages, and I would appreciate hearing more about the issue you were experiencing and how you solved it. Many blog posts have warned against long-running Haskell processes, but you guys seem to have been fairly successful with it.
Also, the problems you were experiencing with Cabal might be fixed by the sandboxes feature, which is now built into later versions of Cabal.
We deal with Haskell resource leaks the same way you would in C++ or Java.
We have production monitors on every host that show basic metrics like memory, disk, and CPU utilization. Atop that, we added a tracker for the number of suspended Haskell threads (that is, threads which are not blocked on I/O, but are also not running).
We found that the machines are usually able to handle requests as soon as they come in, so if the number of Haskell threads goes above 0 for any length of time, the machine is about an hour away from melting down.
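GHC doesn't ship that exact metric out of the box, and any sketch is necessarily approximate: `threadStatus` from GHC.Conc can classify threads you register yourself, but it can't directly distinguish "runnable but not scheduled" from "running". The following is a hypothetical tracker in that spirit, not IMVU's actual monitor:

```haskell
import Control.Concurrent
import GHC.Conc (ThreadStatus (..), BlockReason (..), threadStatus)

-- Classify a set of registered worker threads and count those that
-- are blocked for reasons other than a foreign (I/O-ish) call.
-- In a monitor loop, a count staying above zero for any length of
-- time would be the alarm signal described above.
countSuspended :: [ThreadId] -> IO Int
countSuspended tids = length . filter suspended <$> mapM threadStatus tids
  where
    suspended (ThreadBlocked BlockedOnForeignCall) = False
    suspended (ThreadBlocked _)                    = True
    suspended _                                    = False
```

A real deployment would sample this periodically and export it to whatever graphs the host monitors already draw.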
We can restart the process without losing any connections, so this leaves us a very comfortable margin of error.
Once we know we have a problem, it's usually pretty simple to run the heap profiler on the process and look at recent commits. We continuously deploy, so there's only about a 10 minute delay before a particular commit is running in front of customers. This makes tracking down regressions really fast.
Even in cases where we can't figure out why a bit of code is leaking, we can almost always identify it and revert it until we understand what's going on.
I don't believe sandboxes fix the particular problem the OP was describing, which is that of reproducible builds across multiple environments (different team members / CI / prod).
A cabal sandbox means once you've got your dependencies to resolve and your app to build, it'll continue to build and use the same versions of its dependencies, when building from that sandbox (which pretty much means "when building in that working directory"). But it gives you no guarantee that if your dev build got version 0.1.2 of a transitive dependency, then your CI server will also get version 0.1.2, and not 0.1.3.
If it turns out that your app works with 0.1.2 but not with 0.1.3, then your dev machine will reproducibly produce working builds, while your CI server will reproducibly produce broken builds.
What's really needed is an analogue to the Gemfile.lock used by Ruby or npm-shrinkwrap.json in the Node world, which is checked into version control, and freezes the exact versions of all transitive dependencies until explicitly updated. I think there's a "cabal freeze" command in development, but I'm not sure what the status is.
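For reference, the lock-file approach in Cabal land (as `cabal freeze` later shipped) pins exact versions into a `cabal.config` file committed next to the `.cabal` file. The package names and versions below are purely illustrative:

```
-- cabal.config (generated by `cabal freeze`, checked into the repo)
constraints: aeson ==0.7.0.6,
             attoparsec ==0.11.3.4,
             text ==1.1.0.0
```

With that file present, every environment's solver is forced onto the same transitive versions until the file is regenerated.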
One Haskell resource leak I've encountered a couple of times has to do with opening large numbers of files combined with non-strict semantics. By default Haskell will open IO handles but not consume them until the contents are needed, and thus, not close them. To read the contents of many files in a directory, the result is opening thousands of concurrent file handles and exhausting the OS's IO handle pool. The solution is to add strictness annotations to force evaluation and relinquish the handles, which isn't fun and isn't pretty.
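A minimal sketch of the difference, assuming plain `String` I/O (`readFileStrict` is an illustrative name, not a standard function):

```haskell
import System.IO

-- Lazy: readFile opens the handle immediately but reads on demand,
-- so the descriptor stays open until the contents are fully forced
-- (or the handle is finalized by the GC). Mapping this over a large
-- directory exhausts the descriptor pool.
readFileLazy :: FilePath -> IO String
readFileLazy = readFile

-- Strict: force the entire contents before withFile closes the
-- handle, so the descriptor is released deterministically.
readFileStrict :: FilePath -> IO String
readFileStrict path = withFile path ReadMode $ \h -> do
  contents <- hGetContents h
  length contents `seq` return contents
```

(In practice the strict `ByteString`/`Text` readers achieve the same thing without the `seq` dance.)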
As for resource leaks, the particular example in the post was a bit unfortunate. The problem in general has solutions, though. You can use functions that confine a resource to a limited scope and clean it up automatically when done (like 'withFile'), and if you want to do more complicated things there are the resourcet and pipes-safe libraries.
Switching a programming language is a major decision for a company. It requires changing the runtime library and possibly re-writing a lot of in-house infrastructure. While personal affinity to a language is important, making a decision for a company might have substantial business consequences.
For a company, overall language adoption and the availability of libraries, tools, and talent (and I'm not talking about training someone to be productive; sooner or later you need real experts, and training an expert is expensive in any language) are extremely important. That's why, when choosing between two languages, assuming both are well suited to the job at hand, a company should always pick the one with the wider adoption unless the other is so significantly "better" as to trump its lesser adoption. And the smaller the adoption, the "better" the language needs to be.
There's no doubt Haskell is an excellent programming language. I'm learning it and enjoying it. But because it has such low adoption rates, it needs to provide benefits that are orders of magnitude higher than other, more popular languages. I guess that for some very particular use cases it does, but I just can't see it in the vast majority of cases.
Hobbyists can afford playing with different languages, and can (and perhaps should) jump from one to the next. Companies can't (or most certainly shouldn't, unless for toy projects); they should pick (wisely) a popular language that's right for them, and just stick with it.
BTW, I think that if a company does want to try a good, though not-so-popular language, it should pick a JVM language, as the interoperability among JVM languages, and the availability of libraries that can be shared among the popular and less-so JVM languages, reduces much of the risk.
> That's why when choosing between two languages, assuming both are well suited to the job at hand, a company should always pick the one with the wider adoption unless the other is so significantly "better" to trump lesser adoption.
I think it plateaus at a certain point. You don't need the largest community, you just need critical mass so you don't have to build everything yourself. In fact, at some point being the biggest ends up diluting the talent pool because of the number of people getting into it for the money, and this is becoming a much bigger problem as the traditional job economy dries up and the demand for programmers increases.
As to being "significantly better", I think Haskell has that in spades. In fact, the things that make Haskell better are probably difficult to appreciate for a lot of younger programmers who are using relatively new languages and frameworks like Node and Rails. Once you start seeing the effects of code rot and programmer turnover on a codebase over time, and you have static checks that are a couple of orders of magnitude more powerful, yet simultaneously less verbose and restrictive, than what most people think of when they hear "static language" (i.e. Java), Haskell really starts to shine.
Personally I think companies that invest in Haskell are going to start seeing major dividends in terms of productivity and agility over the lifetime of the company.
Hi pron, you're absolutely right. Transitioning to a new language and runtime is a huge undertaking. I work with Andy, the author, and he took exactly the right steps in getting Haskell adopted. Nothing was forced through, and he got both engineering and executive buy-in for the transition.
We do not use Haskell exclusively -- we also have a pile of PHP and some Python -- but it's awesome to have such a great tool at our disposal. I've been very impressed at how mature GHC's runtime system is. The Haskell services have run perfectly for months with no problems.
The Haskell type system even lets us say, for example, that you cannot talk to MySQL within a Redis transaction, preventing entire classes of bugs.
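One common way to get that kind of guarantee is to give each backend its own monad, so an action against the wrong store simply fails to type-check inside a transaction. This is a hypothetical sketch with made-up names, not IMVU's actual API:

```haskell
{-# LANGUAGE GeneralizedNewtypeDeriving #-}

-- Illustrative: one monad per backend, both wrapping IO.
newtype RedisTxn a = RedisTxn (IO a)
  deriving (Functor, Applicative, Monad)

newtype MySQL a = MySQL (IO a)
  deriving (Functor, Applicative, Monad)

runRedisTxn :: RedisTxn a -> IO a
runRedisTxn (RedisTxn io) = io

watchKey :: String -> RedisTxn ()
watchKey _ = RedisTxn (return ())

runQuery :: String -> MySQL ()
runQuery _ = MySQL (return ())

ok :: RedisTxn ()
ok = watchKey "cart"

-- Rejected at compile time: runQuery lives in MySQL, not RedisTxn.
-- bad :: RedisTxn ()
-- bad = runQuery "SELECT 1"
```

Because the two monads don't unify, "talk to MySQL inside a Redis transaction" is not merely discouraged; it has no well-typed spelling.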
(I have many complaints about Haskell too, but net net, it's an enormous win over PHP and Python.)
I see it the other way around. It's easier to find experts in esoteric languages because you don't have to sift through the same piles of dullards that use mainstream languages. A person who has chosen to use a language like Haskell is probably not your average brogrammer.
Also, PHP in particular is so bad that becoming an expert in it suggests a willing blindness to its problems, or a general attitude of "living with problems" that converges to a complete shit sandwich over time. When one could have switched to something else long ago, PHP expertise is a point against any claim of being a software expert.
It is just like the traffic in DC: whatever your arguments against doing anything about a bad situation because "we don't have the time" or "we don't have the money", it ain't getting any better in the future. If you don't fix it now, it won't ever get fixed.
I about 50% agree with you here. The factor that you're forgetting is the filter effect: Haskell is not adopted, it's learned after a thankless (there are no jobs) process of deep study. This means that the quality of available talent is of a much higher standard. If you adopt a technology with these characteristics, and you're one of the few who are hiring in this pool then you have a significant advantage!
The only risk in my mind is when the big boys figure it out and start gobbling up Haskellers! :-)
BTW, I think that if a company does want to try a good, though not-so-popular language, it should pick a JVM language, as the interoperability among JVM languages, and the availability of libraries that can be shared among the popular and less-so JVM languages, reduces much of the risk.
I used to agree with that, but I think the interoperability of JVM languages is overstated. For instance, using a Scala library from Java or Kotlin is a painful experience, unless its developers invested time in exporting a Java-friendly API (as, to some extent, Akka does).
The situation for Haskell/C is comparable to that of Scala/Java. It is easy to use Java libraries from Scala, but hard to use Scala libraries from Java. In a similar fashion, it is easy to use C libraries from Haskell, but hard to use Haskell libraries from C.
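The "easy" direction really is nearly a one-liner. For instance, binding libm's `sin` from Haskell via the FFI:

```haskell
{-# LANGUAGE ForeignFunctionInterface #-}

import Foreign.C.Types (CDouble)

-- Bind C's sin(3) directly; no wrapper library involved.
foreign import ccall unsafe "math.h sin"
  c_sin :: CDouble -> CDouble
```

Going the other way, exposing Haskell to C with `foreign export` works, but the C caller has to initialize and manage the GHC runtime first, which is exactly the asymmetry described above.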
My experience with large projects is that it is more important to avoid using a large mixture of languages without clearly defined boundaries. E.g. nowadays it's popular to use a mixture of C, C++, Python, Cython, and perhaps even a little Prolog on the side. And while it all links, it is very hard to maintain, debug or for a new developer to get into. Pick one language/ecosystem and stick to it.
This argument is called "Go home and use Java" in latin.
In fact, by this logic it is impossible for any language to be a competitor to Java, because every language is trumped by Java in community, adoption, and programmer replaceability.
The advantage any other language can offer can never exceed the advantage Java has in those three respects for someone who wants to use Java. Typically, you will find the good Java advocate agreeing that the shiny new language is better -- but, come on, the net advantage is "trumped" by Java's reach.
If anyone has ever convinced a seasoned Java advocate, please share your angle of attack and the language of your choice. I don't think there are any.
My experience is that popularity does matter a lot, but that the value of popularity is logistic rather than e.g. linear. Not enough popularity does make using a language in production unrealistic, but there is a threshold of usability after which increased popularity seems to have rapidly diminishing returns. I would argue that Haskell passed this threshold a couple years ago for most use cases. There is still a lot of work to do on cabal and some of the core libraries, but we're happily using Haskell in production, and for us the benefits very clearly outweigh the occasional difficulties.
It's good to hear stories about how FP is being used in industry and I'd like to read more.
For those who have an interest in this, I recommend the Commercial Users of Functional Programming workshop (http://cufp.org). It's where I first heard about Facebook's use of OCaml to create Hack (http://www.youtube.com/watch?v=gKWNjFagR9k)
While visiting your link (http://cufp.org), I saw Chester Bennington of Linkin Park fame there. I didn't know that he's also a programmer! Programming in a functional manner. Another wow!
It always surprises me to see these imvu.com posts on HN. They are often insightful and interesting, but when you see the website you wonder who pays money for this. Do they sell virtual goods for their own virtual world, similar to Second Life?
I've never heard of IMVU before, but I don't see how it's any different from micro-transactions within an MMO. If anyone plans to make the argument that MMOs are different because they have "gameplay", then you're probably someone who has never played a very social MMO.
As an ex-EVE player, I can say with extreme confidence that at least 80% of our alliance ("guild"/social group) was not there to actually play the game. It's pretty common that after a certain amount of game time, you just see the "game" as a glorified chat room. It was an ongoing joke to point out how most of us paid $15 to CCP every month for the ability to talk to each other over XMPP.
They do. I was once an active IMVU user. Outward appearances can be deceiving, but believe me, they are profitable. If you read Eric Ries' book 'The Lean Startup', you can get a glimpse of IMVU's back story there.
I can't like this post enough. It affirms everything I believe to be true about developing software using modern languages. I wish more developers would invest more time in learning better tools.
It's funny. I want to either give your comment a "+1 Me too!" response, or a huge rant on software quality and the language quagmire we're in with weak (C++/Java) or missing (JS/Python) type systems.
To throw a statement out there without substantiation, and then cowardly run away, I'll say that Haskell feels as easy as Python: where in Python you have to do a little thinking about what types of objects get passed where, often while debugging, in Haskell you have to figure out the types of data and functions while compiling. The result, I feel, is about the same amount of work. The difference is that the Haskell result is substantially, /substantially/ easier to use in larger projects.
Again, that's without substantiation, and with me immediately running away in cowardice!
I guess saying "This." didn't seem verbose enough.
Let me propose an incredibly controversial idea: Haskell isn't a better tool, it's just yet another tool -- one promoted aggressively not so much with actual benefits as by shaming people who prefer other tools, calling them dullards and telling them they don't know what they are doing.
The problem I had with Haskell when trying it out was that every time I wanted to use a package with dependencies I ran into "cabal: Error: some packages failed to install. The exception was: ExitFailure 1.". The solution was most often trying earlier versions of the package until it worked or not use the package at all. Cabal just seemed so broken.
Not to diminish the Cabal problems here, but you run into similar problems with Maven as well. Version management is a really hard problem, especially when there is little awareness of producing compatible successors. This problem in particular made my experiences with Haskell somewhat bitter, but I have the same problem (in real production code) with Maven. Writing a working Maven plugin can be made pretty challenging by it, as the stuff you want to use in Maven plugins is especially burdened with incompatibilities.
It's not really cabal but the libraries that are broken. They tend to define dependencies too liberally with >=, which obviously breaks things when a depended-upon library makes non-backwards-compatible changes.
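A `.cabal` fragment contrasting the two styles (package name and versions illustrative, following the Package Versioning Policy):

```
-- Liberal lower bound only: silently breaks when aeson 0.7
-- makes non-backwards-compatible changes.
build-depends: aeson >= 0.6

-- PVP-style bound: the solver refuses the incompatible major version.
build-depends: aeson >= 0.6 && < 0.7
```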
Also, the primary language implementation itself makes rather large changes sometimes.
Cabal sandboxes have fixed a lot of these problems. Usually this breakage results from having installed all packages globally: cabal then tries to find a version satisfying all global packages and fails. You can also take Ed Kmett's approach and just do --force --reinstall every time. I prefer the sandboxes, though; when something breaks you can always nuke the sandbox and quickly reinstall. I would recommend upgrading to Cabal 1.18.x and trying the sandboxes out.
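For reference, the sandbox workflow (Cabal 1.18+) looks like this:

```
$ cabal sandbox init                  # creates .cabal-sandbox/ in this directory
$ cabal install --only-dependencies   # dependencies go into the sandbox
$ cabal build
$ cabal sandbox delete                # "nuke the sandbox" and start fresh
```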
Cabal has been the focus of a ton of work lately, much of which is still ongoing. It has been improving rapidly, notably with sandboxes in 1.18. It still has a long way to go, admittedly, but I find myself having trouble with it way less frequently now.
So, what's going on here? First you have the declaration of the function - in this case, you have a list of items of type A which can be compared to another one (Ord means "I can use <, <= and so on over this element"), and you'll return a list of type A.
Then you have the base case - if the list is empty, you'll return an empty list.
After that you have the case "item++[list of items]" (note that the second one can be an empty list), where you return [list of smaller items]++item++[list of larger items] (the two lists are defined under the "where" - filter removes elements from a list).
Note however that, instead of "(lesser) ++ item ++ (larger)" you have "(quicksort lesser) ++ item ++ (quicksort larger)" - this will call the algorithm recursively over smaller lists, but I bet you already knew that.
Both pieces of code do the same thing, but I think mine is what you'd expect from a typical piece of Haskell code - the one you have is closer to "see how small I can make this function".
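For readers without the article at hand, the walkthrough above fits the classic short Haskell quicksort, which presumably resembles:

```haskell
quicksort :: Ord a => [a] -> [a]
quicksort [] = []
quicksort (item:rest) = quicksort lesser ++ [item] ++ quicksort larger
  where
    lesser = filter (< item) rest
    larger = filter (>= item) rest
```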
Absolutely not. I'm actually very frustrated that they give that code as an example. First of all, it misses the most important aspect of quicksort: the fact that it can be implemented in place (and yes, you can write a pure in-place quicksort using STArrays).
Second of all, it's bad because it runs through the list xs twice. Instead, they should use the partition function to get both the left-hand and right-hand sides at once.
And thirdly, it's inefficient in its use of the (++) operator, which takes time proportional to the length of its left-hand list. A better (out-of-place) implementation of quicksort would use an accumulator so this wouldn't happen (and actually, this means that we should do the partitioning ourselves rather than using the partition function mentioned above).
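A sketch of those two fixes: a one-pass `partition`, then an accumulator version that builds the result with (:) only (for brevity it still calls `partition` rather than inlining it, as the comment suggests):

```haskell
import Data.List (partition)

-- One pass over xs instead of two filter traversals.
quicksortP :: Ord a => [a] -> [a]
quicksortP [] = []
quicksortP (p:xs) = quicksortP lesser ++ p : quicksortP greater
  where (lesser, greater) = partition (< p) xs

-- Accumulator version: the sorted suffix is threaded through,
-- so no intermediate lists are ever concatenated with (++).
quicksortAcc :: Ord a => [a] -> [a]
quicksortAcc xs0 = go xs0 []
  where
    go [] acc     = acc
    go (p:xs) acc = go lesser (p : go greater acc)
      where (lesser, greater) = partition (< p) xs
```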
Here is an example that makes these improvements with quicksort [1].
It's fairly natural "teaching code". I'd say it's a concise example of one of the ideas behind quicksort, though, as many are quick to point out, it ignores a lot of other very important aspects.
That said, it's not very good practical Haskell code. We can look at the Data.List module from the base package to see some optimized list operations with a well-designed module API [0], though some of the optimization methods will be non-obvious and kludgy looking. Data.Complex shows some fairly natural data structure work [1], if you factor out the CPP headers, which are there so it'll compile on both GHC and Hugs while using all of the newest GHC features.
List comprehensions actually don't tend to get used very frequently, so I would not point to that as a good example of what Haskell code looks like. It's hard to point to one example, as different parts of the code look very different. That is, pure functions look very different from monadic functions, which both look very different from datatype declarations and typeclass instances. I would recommend poking around a whole codebase, one example of which can be found here: https://github.com/xmonad/xmonad
It's a good example of how quickly you can express an algorithm, but in terms of readability, some additional whitespace and possibly some better naming would make it nicer.
Once you understand what bits like "p:xs", "++" and "[x|y,z]" mean though, it will make much more sense.
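Those three bits of syntax in isolation, as toy illustrations:

```haskell
-- "p:xs" pattern-matches a list into its head and tail.
firstRest :: [Int] -> (Int, [Int])
firstRest (p:xs) = (p, xs)
firstRest []     = (0, [])   -- arbitrary default for the empty case

-- "++" concatenates lists (a String is a list of Char).
greeting :: String
greeting = "Hello, " ++ "world"

-- "[x | y, z]" is a list comprehension: generators and guards.
evensDoubled :: [Int] -> [Int]
evensDoubled ys = [2 * y | y <- ys, even y]
```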
I'd say it's good in the sense that it is readable provided you are familiar with the syntax for list comprehensions and appendings, and obviously correct if you know quicksort.
I'd like to hear more about employers' perspectives on hiring for "an uncommon language with a comparatively sparse industrial track record." There was some debate about it in the Clojure 1.6 release announcement. The author of the article didn't find it to be a problem, but then he was apparently only hiring one person.
When I left my last Haskell job, my employer put out ads to the Haskell community and quickly got a pool of 5-10 solid applications. A good number of those were from people outside the U.S. which was a no-go for them. After waiting a few weeks they decided to open it up and not require Haskell experience. I looked through a number of these resumes and there was a massive drop in quality. It was like night and day. None of those applicants were even remotely interesting IMO. They ended up waiting a few more months and finally found a Haskell applicant in the U.S. that they were happy with. If you're willing to accept people working remotely your chances of finding someone are much better.
Not all uncommon languages are made equal. It's very different to hire someone to work on Scala than it is to hire someone to work in PICK.
I believe that when push comes to shove, the language your team uses is not the primary determinant of the quality of your software. I've actually seen good programs in PICK, an ancient abomination that nobody should be using today. The major difference is people. Good developers with good direction will give you good software; bad developers with the same toolset will give you bad software. So, if anything, your language selection is important because it gives you a very different talent pool.
Hiring a Scala developer, for instance, isn't all that easy, but it's definitely not impossible. For the extra pain you pay in finding someone that will use Scala instead of Java, you get, in exchange, a major filter on your applicants, which tends to be a good thing.
With an antique like PICK, the filter works against you, because what you get is the very old that have not moved on, and the desperate.
I do not think that getting more and more exclusive gets you a better filter, even when you are looking at relatively hip, uncommon languages. Is your Clojure developer naturally better than your Scala developer? How about Haskell? It's not really an issue of every filter picking the best, but of picking people that care about their tools.
So if someone told me they were hiring developers and completely lacked a network to pull them from, I'd say it's easier to get a good core of developers in one of those uncommon languages than by just placing an ad for people with 10 years of Java experience. This situation is uncommon, though: if you are hiring, you typically have a network. Then what you do is target some of the best developers you can get hold of, and hire around the technologies they want to use. You'd be surprised by how much money a competitor has to pay to poach a developer away from you if they really like your current toolset but aren't all that fond of the one the competitor uses.
We decided on Scala because our seed team wanted to use it, and we just went out and hired locals who had similar technology interests. We also brought in a few major independent consultants who made sure our core team really understood the language, instead of just starting with good developers who had little experience with their toolset. Overall, our experience was very positive, and we have a happy Scala crew.
I've had experience on both sides, plus I know some other people who have recently been hiring Haskellers. I have a huge problem hiring Haskell people because the company I work for kinda sucks. I don't want to work here, so I can't really be surprised that nobody else does either. Haskell users seem to be pickier about where they work and what they work on, so if you've got micromanagement and "do this wrong because I said so" going on, you'll have a hard time. Whereas you can still easily get hordes of PHP programmers willing to tolerate anything.
On the other hand, someone hiring remote haskell programmers for a company where the culture values quality had 100+ applications for half a dozen openings and really has their pick of a lot of top notch people. I think if you can offer a good job, finding haskell people isn't a problem. But if you can't, then I would stick to popular_language_x.
I'm curious about the claim that it only takes a few days to get people productive working in Haskell. I like the language a lot, but learning it has definitely been slow compared to any other language I've learned. It just seems like there are a lot of changes to how you think.
Is this a feature of the way your code is structured that simplifies what people have to understand about the language before they can contribute?
Haskell is a terrible choice for a production system.
I know well over a dozen programming languages. I have a stronger theoretical background in mathematics, computer science, and compiler design than most web developers. But Haskell has always been utterly mystifying to me.
I've attempted to learn Haskell several times, and I've had monads explained to me several times right here on HN -- but I simply don't seem to be able to wrap my mind around the concept.
To be sure, if I had a year or so to devote to studying the theory behind Haskell's type system, I might indeed become more productive writing web applications in Haskell. And I don't doubt that there are certain problems in logic programming or language theory for which Haskell provides useful concepts and powerful tools, to the point where the solutions to important problems may have trivial implementations.
I've always thought that a major problem with Haskell for web development is: (A) You'll have difficulty finding people who both know Haskell and are interested in web applications, and (B) you'll have difficulty training web developers in Haskell. The author of the article apparently has a training process that gets around (B), and I'd kind of like to know what it is -- a way to learn Haskell without a ton of study.
[+] [-] exDM69|12 years ago|reply
Resource leaks in Haskell perhaps a bit trickier to track than in other languages and I would appreciate hearing more about the issue you were experiencing and how you solved the problem. In many blog posts there have been warnings against long running Haskell processes but you guys seem to have fairly successful with it.
Also, the problems you were experiencing with Cabal might be fixed in newer versions with the sandboxes feature which is now built-in with later versions of Cabal.
[+] [-] implicit|12 years ago|reply
We have production monitors on every host that show basic metrics like memory, disk, and CPU utilization. Atop that, we added a tracker for the number of suspended Haskell threads. (that is, threads which are not blocked on I/O, but are also not running)
We found that the machines are usually able to handle requests as soon as they come in, so if the number of Haskell threads goes above 0 for any length of time, the machine is about an hour away from melting down.
We can restart the process without losing any connections, so this leaves us a very comfortable margin of error.
Once we know we have a problem, it's usually pretty simple to run the heap profiler on the process and look at recent commits. We continuously deploy, so there's only about a 10 minute delay before a particular commit is running in front of customers. This makes tracking regressions down really fast.
Even in cases where we can't figure out why a bit of code is leaking, we can almost always identify it and revert it until we understand what's going on.
[+] [-] samstokes|12 years ago|reply
A cabal sandbox means once you've got your dependencies to resolve and your app to build, it'll continue to build and use the same versions of its dependencies, when building from that sandbox (which pretty much means "when building in that working directory"). But it gives you no guarantee that if your dev build got version 0.1.2 of a transitive dependency, then your CI server will also get version 0.1.2, and not 0.1.3.
If it turns out that your app works with 0.1.2 but not with 0.1.3, then your dev machine will reproducibly produce working builds, while your CI server will reproducibly produce broken builds.
What's really needed is an analogue to the Gemfile.lock used by Ruby or npm-shrinkwrap.json in the Node world, which is checked into version control, and freezes the exact versions of all transitive dependencies until explicitly updated. I think there's a "cabal freeze" command in development, but I'm not sure what the status is.
[+] [-] whatgoodisaroad|12 years ago|reply
[+] [-] klrr|12 years ago|reply
[+] [-] pron|12 years ago|reply
For a company, overall language adoption, availability of libraries, tools and talent (and I'm not talking about training someone to be productive; sooner or later you need real experts and training an expert is expensive in any language) are extremely important. That's why when choosing between two languages, assuming both are well suited to the job at hand, a company should always pick the one with the wider adoption unless the other is so significantly "better" to trump lesser adoption. And the smaller the adoption, the "better" the language needs to be.
There's no doubt Haskell is an excellent programming language. I'm learning it and enjoying it. But because it has such low adoption rates, it needs to provide benefits that are orders of magnitude higher than other, more popular languages. I guess that for some very particular use cases it does, but I just can't see it in the vast majority of cases.
Hobbyists can afford playing with different languages, and can (and perhaps should) jump from one to the next. Companies can't (or most certainly shouldn't, unless for toy projects); they should pick (wisely) a popular language that's right for them, and just stick with it.
BTW, I think that if a company does want to try a good, though not-so-popular language, it should pick a JVM language, as the interoperability among JVM languages, and the availability of libraries that can be shared among the popular and less-so JVM languages, reduces much of the risk.
[+] [-] dasil003|12 years ago|reply
I think it plateaus at a certain point. You don't need the largest community, you just need critical mass so you don't have to build everything yourself. In fact, at some point being the biggest ends up diluting the talent pool because of the number of people getting into it for the money, and this is becoming a much bigger problem as the traditional job economy dries up and the demand for programmers increases.
As to being "significantly better", I think Haskell has that in spades. In fact the things that make Haskell better are probably difficult to appreciate by a lot of younger programmers who are using relatively new languages and frameworks like Node and Rails. When you start seeing the effects of code rot and programmer turnover on a codebase over time, the types of static checks that are a couple orders more powerful and simultaneously less verbose and restrictive than what most people think of when they hear "static language" (ie. Java), then Haskell really starts to shine.
Personally I think companies that invest in Haskell are going to start seeing major dividends in terms of productivity and agility over the lifetime of the company.
[+] [-] chadaustin|12 years ago|reply
We do not use Haskell exclusively -- we also have a pile of PHP and some Python -- but it's awesome to have such a great tool at our disposal. I've been very impressed at how mature GHC's runtime system is. The Haskell services have run perfectly for months with no problems.
The Haskell type system even lets us say, for example, that you cannot talk to MySQL within a Redis transaction, preventing entire classes of bugs.
(I have many complaints about Haskell too, but net net, it's an enormous win over PHP and Python.)
[+] [-] moron4hire|12 years ago|reply
Also, PHP in particular is so bad that becoming an expert in it suggests either a willing blindness to its problems or a general attitude of living with them, one that converges to a complete shit sandwich over time. When one could have just switched to something else long ago, sticking with PHP is a point against any claim of being a software expert.
It's just like the traffic in DC: whatever your arguments against doing anything about a bad situation, "we don't have the time" or "we don't have the money", it isn't getting any better in the future. If you don't fix it now, it won't ever get fixed.
[+] [-] boothead|12 years ago|reply
The only risk in my mind is when the big boys figure it out and start gobbling up Haskellers! :-)
[+] [-] microtonal|12 years ago|reply
I used to agree with that, but I think the interoperability of JVM languages is overstated. For instance, using a Scala library from Java or Kotlin is a painful experience unless its developers invested time in exporting a Java-friendly API (e.g., to some extent, Akka).
The situation for Haskell/C is comparable to that of Scala/Java. It is easy to use Java libraries from Scala, but hard to use Scala libraries from Java. In a similar fashion, it is easy to use C libraries from Haskell, but hard to use Haskell libraries from C.
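The "easy to use C libraries from Haskell" direction really is short: the standard FFI lets you bind a C function in one line. A minimal example, calling `sin` from `math.h` (libm is linked by default with GHC):

```haskell
{-# LANGUAGE ForeignFunctionInterface #-}

-- Bind C's sin() as a pure Haskell function via the standard FFI.
foreign import ccall unsafe "math.h sin"
  c_sin :: Double -> Double

main :: IO ()
main = print (c_sin 0)   -- prints 0.0
```

Going the other way, exporting Haskell to C, requires `foreign export` plus initializing the GHC runtime from the C side, which is where the asymmetry the comment describes comes from.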
My experience with large projects is that it is more important to avoid using a large mixture of languages without clearly defined boundaries. E.g. nowadays it's popular to use a mixture of C, C++, Python, Cython, and perhaps even a little Prolog on the side. And while it all links, it is very hard to maintain, debug or for a new developer to get into. Pick one language/ecosystem and stick to it.
[+] [-] new_guy123|12 years ago|reply
In fact, by this logic it is impossible for any language to compete with Java, because Java trumps everything in community, adoption, and programmer replaceability.
The edge any other language can offer can never exceed the advantage Java has in those three respects for someone who wants to use Java. Typically you will find the good Java advocate agreeing that the shiny new language is better, but, come on, the net advantage is "trumped" by Java's reach.
If anyone has ever actually convinced a seasoned Java advocate, please share your angle of attack and the language of your choice. I don't think there are any.
[+] [-] amirmc|12 years ago|reply
For those who have an interest in this, I recommend the Commercial Users of Functional Programming workshop (http://cufp.org). It's where I first heard about Facebook's use of OCaml to create Hack (http://www.youtube.com/watch?v=gKWNjFagR9k)
[+] [-] Pxtl|12 years ago|reply
Seriously, this is like finding out that McDonalds' restaurants are powered by nuclear fusion.
Although this does make me very curious about learning Haskell - I've been pretty stagnant about learning new languages.
[+] [-] doesnt_know|12 years ago|reply
Clothing, Furniture, Rooms, Avatars.
I've never heard of IMVU before, but I don't see how it's any different from micro-transactions within an MMO. If anyone plans to argue that MMOs are different because they have "gameplay", you're probably someone who has never played a very social MMO.
As an ex-EVE player, I can say with extreme confidence that at least 80% of our alliance ("guild"/social group) was not there to actually play the game. It's pretty common that after a certain amount of game time, you just see the "game" as a glorified chat room. It was an ongoing joke to point out how most of us paid $15 to CCP every month for the ability to talk to each other over XMPP.
[+] [-] lallysingh|12 years ago|reply
To throw a statement out there without substantiation and then cowardly run away: I'll say that Haskell feels as easy as Python. In Python you have to do a little thinking about what types of objects get passed where, often while debugging; in Haskell you have to figure out the types of your data and functions while compiling. The result, I feel, is about the same amount of work. The difference is that the Haskell result is substantially, /substantially/ easier to use in larger projects.
Again, that's without substantiation, and with me immediately running away in cowardice!
[+] [-] pekk|12 years ago|reply
Let me propose an incredibly controversial idea: Haskell isn't a better tool, it's just yet another tool. Which is aggressively promoted not so much with actual benefits, but by trying to shame people who prefer other tools by calling them dullards and telling them that they don't know what they are doing.
[+] [-] chopin|12 years ago|reply
So, Cabal is in good company in that regard.
[+] [-] dscrd|12 years ago|reply
Also, the primary language implementation itself makes rather large changes sometimes.
[+] [-] codygman|12 years ago|reply
--max-backjumps 10000
Also, the latest HEAD of cabal has the freeze command, which is reminiscent of pip freeze.
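For reference, here's roughly what using those two features looks like; the package names and versions in the sample output are invented:

```shell
# Raise the dependency solver's backtracking limit when resolution fails:
cabal install --max-backjumps=10000

# Pin the exact versions the solver chose (cabal-install >= 1.20):
cabal freeze
# writes a cabal.config containing constraints such as:
#   constraints: text ==1.1.0.0,
#                bytestring ==0.10.4.0
```

Committing the generated cabal.config is what makes builds reproducible across dev, CI, and prod.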
[+] [-] iaskwhy|12 years ago|reply
qsort []     = []
qsort (p:xs) = qsort [x | x <- xs, x < p] ++ [p] ++ qsort [x | x <- xs, x >= p]
Is this a good example of good Haskell code?
[1] http://www.haskell.org/haskellwiki/Introduction#Brevity
[+] [-] probably_wrong|12 years ago|reply
Then you have the base case - if the list is empty, you'll return an empty list.
After that you have the case "item:(list of items)" (note that the list part can be empty), where you return (list of smaller items) ++ [item] ++ (list of larger items). The two lists are defined under the "where"; filter keeps only the elements of a list that satisfy a predicate.
Note, however, that instead of "lesser ++ [item] ++ larger" you have "(quicksort lesser) ++ [item] ++ (quicksort larger)"; this calls the algorithm recursively over the smaller lists, but I bet you already knew that.
Both pieces of code do the same thing, but I think mine is what you'd expect from a typical piece of Haskell code - the one you have is closer to "see how small I can make this function".
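The commenter's own version isn't shown above; a reconstruction matching that description (explicit base case, helpers named under a `where`, using `filter`) might look like:

```haskell
-- Quicksort spelled out with an explicit base case and named helpers,
-- rather than compressed into list comprehensions.
quicksort :: Ord a => [a] -> [a]
quicksort []          = []
quicksort (item:rest) = quicksort lesser ++ [item] ++ quicksort larger
  where
    lesser = filter (< item)  rest   -- everything smaller than the pivot
    larger = filter (>= item) rest   -- everything else
```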
[+] [-] thetwiceler|12 years ago|reply
Second of all, it's bad because it runs through the list xs twice. Instead, they should use the partition function to get both the left-hand and right-hand sides in one pass.
And thirdly, it's inefficient in its use of the (++) operator, which takes time proportional to the length of its left-hand list. A better (out-of-place) implementation of quicksort would use an accumulator so this wouldn't happen (and actually, this means that we should do the partitioning ourselves rather than using the partition function mentioned above).
Here is an example that makes these improvements with quicksort [1].
[1] http://en.literateprograms.org/Quicksort_(Haskell)#Using_an_...
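Since the link is truncated, here's a sketch of the shape that comment describes: a single manual partitioning pass plus an accumulator, so no (++) is needed (my reconstruction, not the linked code):

```haskell
-- Accumulator-based quicksort: `acc` is the already-sorted suffix,
-- and one pass of `part` splits the list around the pivot, so neither
-- (++) nor Data.List.partition is needed.
qsortAcc :: Ord a => [a] -> [a]
qsortAcc xs0 = go xs0 []
  where
    go []     acc = acc
    go (p:ys) acc = part ys [] []
      where
        -- sorted smaller ++ [p] ++ sorted larger ++ acc, built right-to-left
        part []     smaller larger = go smaller (p : go larger acc)
        part (z:zs) smaller larger
          | z < p     = part zs (z:smaller) larger
          | otherwise = part zs smaller (z:larger)
```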
[+] [-] emillon|12 years ago|reply
I don't think that list comprehensions are very idiomatic in Haskell, one would probably use Data.List.partition instead:
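The code the comment was introducing appears to be missing; a version using Data.List.partition would presumably look something like:

```haskell
import Data.List (partition)

-- One pass over xs: partition splits it into (elements < p, the rest).
quicksortP :: Ord a => [a] -> [a]
quicksortP []     = []
quicksortP (p:xs) = quicksortP smaller ++ [p] ++ quicksortP larger
  where (smaller, larger) = partition (< p) xs
```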
[+] [-] tel|12 years ago|reply
That said, it's not representative of good, practical Haskell code. We can look at the Data.List module from the base package to see some optimized list operations behind a well-designed module API [0], though some of the optimization methods will be non-obvious and kludgy-looking. Data.Complex shows some fairly natural data structure work [1], if you factor out the CPP headers, which are there so it'll compile on both GHC and Hugs while still using all of the newest GHC features.
[0] http://hackage.haskell.org/package/base-4.6.0.1/docs/src/Dat...
[1] http://hackage.haskell.org/package/base-4.6.0.1/docs/src/Dat...
[+] [-] nbouscal|12 years ago|reply
This post may also be useful: http://blog.ezyang.com/2011/11/how-to-read-haskell/
[+] [-] danpalmer|12 years ago|reply
Once you understand what bits like "p:xs", "++" and "[x|y,z]" mean though, it will make much more sense.
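For anyone who wants the gloss, those three bits of syntax can be demonstrated in a few lines:

```haskell
-- (p:xs) pattern-matches a list into its head p and tail xs:
headOr0 :: [Int] -> Int
headOr0 (p:_) = p
headOr0 []    = 0

-- (++) appends two lists:
appended :: [Int]
appended = [1,2] ++ [3]                       -- [1,2,3]

-- [x | x <- xs, x < 3] is a list comprehension: every x drawn from xs
-- that satisfies the condition x < 3:
smallerThan3 :: [Int]
smallerThan3 = [x | x <- [5,1,4,2], x < 3]    -- [1,2]
```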
[+] [-] marcosdumay|12 years ago|reply
It does give you a broad idea, but does not in any way show what a complex program looks like.
[+] [-] auvrw|12 years ago|reply
good post, Tikhon!
[+] [-] hibikir|12 years ago|reply
I believe that when push comes to shove, the language your team uses is not the primary determinant of the quality of your software. I've actually seen good programs in PICK, an ancient abomination that nobody should be using today. The major difference is people: good developers with good direction will give you good software, and bad developers with the same toolset will give you bad software. So, if anything, your language selection is important because it gives you a very different talent pool.
Hiring a Scala developer, for instance, isn't all that easy, but it's definitely not impossible. For the extra pain you pay in finding someone that will use Scala instead of Java, you get, in exchange, a major filter on your applicants, which tends to be a good thing.
With an antique like PICK, the filter works against you, because what you get is the very old that have not moved on, and the desperate.
I do not think that getting more and more exclusive gets you a better filter, even when you are looking for relatively hip, uncommon languages. Is your Clojure developer naturally better than your Scala developer? How about Haskell? It's not really a matter of every filter picking the best people, but of picking people who care about their tools.
So if someone told me that they were hiring developers and completely lacked a network to pull from, I'd say it would be easier to get a good core of developers in one of those uncommon languages than by just placing an ad for people with 10 years of Java experience. That situation is uncommon, though: if you are hiring, you typically have a network. Then what you do is target some of the best developers you can get hold of and hire around the technologies they want to use. You'd be surprised how much money a competitor has to pay to poach a developer away from you if they really like your current toolset but aren't all that fond of the competitor's.
We decided on Scala because our seed team wanted to use it, and we just went out and hired locals with similar technology interests. We also brought in a few major independent consultants to make sure our core team really understood the language, instead of just starting with good developers who had little experience in their toolset. Overall, our experience was very positive, and we have a happy Scala crew.
Of course, YMMV.
[+] [-] copergi|12 years ago|reply
On the other hand, someone hiring remote Haskell programmers for a company whose culture values quality had 100+ applications for half a dozen openings and really had their pick of a lot of top-notch people. I think if you can offer a good job, finding Haskell people isn't a problem. But if you can't, then I would stick to popular_language_x.
[+] [-] hyperpape|12 years ago|reply
Is this a feature of the way your code is structured that simplifies what people have to understand about the language before they can contribute?
[+] [-] AnimalMuppet|12 years ago|reply
What's the definition of "systems language"? If PHP could even remotely be considered an option, the definition is clearly not what I think it is...
[+] [-] csense|12 years ago|reply
I know well over a dozen programming languages. I have a stronger theoretical background in mathematics, computer science, and compiler design than most web developers. But Haskell has always been utterly mystifying to me.
I've attempted to learn Haskell several times, and I've had monads explained to me several times right here on HN -- but I simply don't seem to be able to wrap my mind around the concept.
To be sure, if I had a year or so to devote to studying the theory behind Haskell's type system, I might indeed become more productive writing web applications in Haskell. And I don't doubt that there are certain problems in logic programming or language theory for which Haskell provides useful concepts and powerful tools, to the point where the solutions to important problems may have trivial implementations.
I've always thought that a major problem with Haskell for web development is: (A) You'll have difficulty finding people who both know Haskell and are interested in web applications, and (B) you'll have difficulty training web developers in Haskell. The author of the article apparently has a training process that gets around (B), and I'd kind of like to know what it is -- a way to learn Haskell without a ton of study.