It never gained widespread adoption because it was off by default. Rails 4 turns it on by default on multithreaded web servers like Puma.
And even in Rails 4, concurrent request handling is multiplexed on a single CPU core, leaving Rails unable to take advantage of the parallelism of modern architectures.
This isn't true on JRuby, where Ruby threads are 1:1 with JVM threads (which are in turn 1:1 with native threads) that execute in parallel on multiple cores.
At my employer (Square) we run many of our Rails apps in thread safe mode on top of JRuby, providing parallel request processing across multiple cores on a single JRuby/JVM instance.
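A rough sketch (mine, not from the comment) of why the 1:1 threading model matters. The code splits CPU-bound work across Ruby threads; on JRuby each thread maps to a native thread and can occupy its own core, while on MRI/YARV the GVL serializes them, so the same code gains little wall-clock time from the extra threads:

```ruby
# CPU-bound work split across Ruby threads. Correct on any Ruby
# implementation; actually parallel only where threads are 1:1 with
# native threads (JRuby), since the GVL serializes them on MRI/YARV.
def count_primes(range)
  range.count do |n|
    n > 1 && (2..Math.sqrt(n)).none? { |d| (n % d).zero? }
  end
end

# Split 1..100_000 into four chunks, one thread each.
threads = (0...4).map do |i|
  Thread.new { count_primes((i * 25_000 + 1)..((i + 1) * 25_000)) }
end
total = threads.sum(&:value)
puts total  # 9592 primes below 100,000
```

The same pattern applies to request handling: a threaded server like Puma runs one request per thread, and whether those threads actually use multiple cores is a property of the Ruby implementation, not of Rails.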
JRuby is of course incompatible with gems that rely on C extensions; the place where I used Rails had such dependencies, so JRuby was not an option. I agree that JRuby is otherwise preferable to CRuby, though.
I'd be curious to hear how Square has found Ruby (independent of Rails) from a maintenance standpoint.
> And even in Rails 4, concurrent request handling is multiplexed on a single CPU
Actually, even on YARV you will use more than one CPU, however the GVL won't allow them to be used efficiently.
My point is: there is nothing in Rails that forces requests to be multiplexed onto a single CPU; it is all about which language implementation and web server you choose.
The post is also inconsistent in its own arguments. If multi-core support and the ability to share memory between cores are the criteria, Node is also not a good option compared to what you have on the JVM, the Erlang VM, Haskell, Go, etc.
What is the state of thread safety among gems? I have thought a lot about it, but the danger of thread-safety issues in the gems you pull in is a bit daunting, to be honest.
Having spent a lot of time in both Rails and Node, the author's unequivocal endorsement of Node as better than Rails is just silly.
Building a non-trivial app on Node requires that you do an awful lot of plumbing. Yes, the performance is very nice. And it's going to be great when the community grows up a bit and things start to Just Work with each other. But that day is not here yet.
A lot of the components that people depend on are still really buggy. In the last three days I've had to debug and fix four bugs in other people's npm modules. The community's attention is spread all over the place, because there isn't one dominant way of doing things.
People can't even agree on sane interoperable packaging for code that needs to work on both browsers and in node. It's a mess.
I don't see an unequivocal endorsement of Node in this rant. He only said that Node is better by his criteria if you really want to use a dynamic language.
I don't like Rails, and I don't like Ruby, but this wasn't a particularly enlightening piece. He doesn't explain why using something like Django in place of Rails makes sense; he handwaves a bit over Node. There's some statically typed language dogma, and that's it.
I'd love to read a good hit piece on Rails, written by someone who spends the time to lay out a strong technical justification for their opinion. This was disappointingly Twitterish.
Couldn't agree more with this sentiment (though I do like Ruby as a language quite a bit). I have used other frameworks (Sinatra in Ruby, Ring/Compojure in Clojure, Flask in Python), and there are myriad reasons to choose one of these over something like Rails, but the author discusses none of this. I think much of the problem with Rails is that it's so monolithic and has become rather bloated in certain respects.
I would very much appreciate a more thorough, thoughtful discussion of the pros and cons of various design decisions in rails. It does a lot of things right which is why you see people modeling various tools after rails (database migrations for example... I recently wrote a clojure plugin that closely emulates the simplicity of making schema changes ( https://github.com/ckuttruff/clj-sql-up )).
I work with Rails every day and plenty about it frustrates me, but it's also quite effective for a lot of things (which may be why "everyone and their dog" seems to use it). If you really want to have any influence over that, it would help tremendously to make a more compelling argument; I'm sure that would inspire much more interesting dialogue.
I've worked with Rails a lot over the years, including high-traffic sites, and the general comment I'd make is that Rails scales more or less like PHP, but consumes far more RAM per process doing so. Beyond that, whether it's good for your project probably depends a lot more on culture and the people involved than on any technical distinction. Deploying Rails was historically rather clunky, but that's been resolved over the last couple of years thanks to folks like Heroku.
This piece highlights the inherent drawback of all dynamic languages: performance. We've been through the performance argument too many times to count with both Rails and PHP.
I don't know what kind of taste I would have if I were to follow this piece as advice. Am I expected to do CGI in straight C? Ridiculous.
For a midsized-to-big project, static typing is really nice. E.g., a few years ago I had a Perl project that only let you know a function was missing at runtime (i.e., when a user actually exercised that code path).
Why do I care if my simple CRUD app can only use a single CPU core? Because someday I might have millions of users and then I'll have Twitter's old problems? I'd love to have to Twitter's problems.
Use the right tool for the job. And claiming that Rails is never the right tool is just silly.
I submitted this because I thought it might make for some good discussion, but:
> This post is my attempt to be fair, objective, and, by consequence, unrelentingly negative about Rails :)
To me it seems arrogant to assume that you are being fair and objective when you only point out the negative attributes of something. I'd be more inclined to assume that there are positives I'm overlooking, that might even outweigh the negatives. Especially for something as beloved by developers as Rails.
What evidence is there that running a process per core has a significant negative effect on throughput and/or latency in practice? Benchmarks? Data from developers who tried both approaches while holding all else constant? (The latter is probably quite difficult in practice.) Also, JRuby supports real threads without a global interpreter lock.
I think I agree with this post about the benefits of static typing, though.
I agree that the post would be more compelling if I wrote a benchmark to demonstrate how much faster an in-memory cache is than an off-process or off-machine cache.
From first principles, though, I believe it should be obvious (yes?) that the ~300 nanoseconds it takes to grab a read lock and read from main memory is going to beat the ~1,000,000 nanoseconds it takes to get a response back from a remote cache over the network. Inasmuch as an application blocks on such cache reads, these sorts of things add up to troublesome latency numbers (and Rails – or at least dalli_store – does indeed block on reads like these).
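The back-of-envelope arithmetic can be made concrete. Using the (assumed) numbers above, ~300 ns for an in-process read versus ~1 ms for a remote round trip, and a hypothetical page that performs 20 blocking cache reads:

```ruby
# Per-request latency contributed by blocking cache reads, comparing an
# in-process read (~300 ns) with a remote cache round trip (~1 ms).
IN_PROCESS_NS = 300
REMOTE_NS     = 1_000_000
reads_per_request = 20  # hypothetical figure for one rendered page

in_process_ms = reads_per_request * IN_PROCESS_NS / 1_000_000.0
remote_ms     = reads_per_request * REMOTE_NS / 1_000_000.0

puts "in-process: #{in_process_ms} ms per request"  # 0.006 ms
puts "remote:     #{remote_ms} ms per request"      # 20.0 ms, all blocking
```

The roughly 3000x gap per read is why a process that can share an in-memory cache across request-handling threads has a structural latency advantage over one that must go over the wire for every fragment.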
JRuby was off the table at the place where I used Rails, due to reliance on some C extensions.
And I'm sorry if the argument seemed arrogant: I was being tongue-in-cheek about the "unrelenting negativity" part. My point about objectivity was that I tried not to rely on my opinions as much as demonstrable statements. That said, I didn't take the time to actually demonstrate most of those, and for that, shame on me.
Objectivity is a separate thing from the topic of the post. He came into it wanting to write an explanation of why he thinks you should never use Rails, and attempted to be objective about it by including data and describing some caveats to his argument (i.e. scenarios where you might be able to get away with it).
A post that says 'why you should or shouldn't use rails' is a different post and no more or less implicitly objective. Objectivity does not mean 'covering two sides of an argument'; sometimes one side is less or more supported than the other and sometimes you simply don't have the knowledge to cover both sides fairly.
> And it’s no wonder. Until Rails 4, the core framework didn’t even support concurrent request handling. And even in Rails 4, concurrent request handling is multiplexed on a single CPU core, leaving Rails unable to take advantage of the parallelism of modern architectures.
Every single Rails installation I have seen runs multi-process and takes advantage of multiple cores. And then he goes on to say something similar, but claims that's a disadvantage because the processes aren't using shared memory to communicate and have to use memcached.
The thing is, the Rails model means rolling restarts are a lot easier, and you are a lot more flexible with deployment strategies.
> However, modern dynamic languages (and their incapacity to do the sort of meaningful pre-runtime verification of basic semantic well-being one expects from a compiler) place a burden on test coverage. Of course it is essential in any language to provide test coverage for core algorithms and other subtle aspects of a software module. However, when trying to “move fast" and get to market quickly, one shouldn’t have to write tests for every souped-up accessor method or trivial transformation.
I'm still undecided about this angle. I simply don't find types to be a problem. While I agree fixed types are more performant, not dealing with types makes the code more fun: you're never held back from running it just because you forgot a typecast.
Tests are essential, but the flipside is that tests are quicker to write in a duck-typed language, too.
The post is obviously trolling for clicks: even the title says nobody should use Rails, and then at the bottom it lists the times when you should use it.
I don't think anyone is arguing that companies fail to grow because of these languages. It's merely that they would grow more quickly once at scale if they didn't have to spend several years rearchitecting while adding few innovations in the meantime (this is what happened at Twitter, for example).
Rails is a tradeoff: quick dev time, slow execution.
It's not designed for low-profit-per-request applications. Recent benchmarks suggest about 400 req/sec on a $10 DigitalOcean server, which means that even maxing your machine 24/7 you'd need to make at least 1 cent per million requests; if your profit margin is below that, Rails would not be a good solution.
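Checking the arithmetic in that claim, 400 req/sec sustained around the clock on a $10/month server works out like this:

```ruby
# Cost per million requests for a $10/month server saturated at 400 req/sec.
req_per_sec   = 400
monthly_cost  = 10.0  # dollars
req_per_month = req_per_sec * 60 * 60 * 24 * 30

cost_per_million_reqs = monthly_cost / req_per_month * 1_000_000
puts req_per_month                   # 1036800000, i.e. ~1 billion requests
puts cost_per_million_reqs.round(4)  # 0.0096 dollars: roughly 1 cent
```

So the "1 cent per million requests" figure holds, with the obvious caveat that it assumes full utilization of the box 24/7, which real traffic patterns never deliver.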
There's no way you could serve 400 req/s with Rails on a $10 server for any sort of complex web app that requires data lookups and html rendering, partials, etc.
Maybe possible if you cache everything, but that can be quite complex.
I hate rails and everything it stands for, but it's often the right choice for a business. The choice to develop faster in exchange for a heavier maintenance burden later on (which is what a dynamic language gives you) is often a good one. I don't see the concurrent request handling of node or django outweighing greater library availability and ease of hiring (often the hardest part of any business). True, any great engineer can learn a new language - but it takes time, time you may not have. (To say nothing of the inconvenient fact that many businesses can get by perfectly adequately with mediocre programmers).
I have my fair share of issues with Rails, but I think this is a bit extreme.
When I look into other ecosystems, the amount of library support for building modern web applications just doesn't compare. I think it's this aspect of Rails that really helps get from zero to viable product so quickly.
That said, the performance issues do start to take a toll after a while. And forget it if you want to do something counter to the way Rails want you to do it.
For reasons like this, and I would add, scary security history, I decided against using Rails.
I've experimented with Django and lightweight Python frameworks, Node.js and Java 6 EE, among others.
What has worked best for me was C# + ASP.NET MVC. The C# language has several features that I find appealing and lead to clean, efficient code (such as dynamic, lambda, LINQ and async, among others). The modern incarnation of ASP.NET running under IIS is quite efficient and productive both in development, profiling & diagnostics, and in production.
I can't imagine why a startup would lock themselves into Microsoft's ecosystem knowing that the more you grow the more those licenses are going to bite.
I've done some web dev with node, python (GAE) and php. I ended up using C# + mono's web server for a recent project and was pleasantly surprised.
ASP.net is showing its age but there are some solid libraries, reasonably comprehensive documentation, and some nice new frameworks like MVC out there to use (I didn't even use a framework, just wrote 500 lines of wrapper logic so I could expose some REST services that handled JSON input/output and then built a connection pool for my Redis connections). If you know C# (or another .NET language) well enough, you can lean heavily on the type system and write automated tests for everything else and get all your errors/red tests displayed to you in your IDE. I hit very few runtime errors while building the services.
It's also pleasantly surprising how easy ASP.net deploys seem to be: rsync your app folder to the web server; the app.config and bin/ folders inside ensure that all your dependencies and configuration move over to the target machine. Mono's ASP.net server seems to support almost everything Microsoft's does, so I think I only ran into one behavioral difference (and it was documented, albeit a little hard to find).
ASP.net also has some of those same code reuse benefits you get with node.js, just in a different direction. Now if you have any native C# applications or libraries for doing interesting things, you can expose them as a web service trivially. JSIL's online sandbox (http://jsil.org/try) is literally just the compiler libraries deployed to a linux box with a 100-line shim over them that does compilation and caching and error reporting.
I've actually been quite impressed with C# and .net as well, I can't always afford to use it, but it wasn't at all unpleasant to develop an app in.
The thing is, by the criteria and benchmarks the author of this post is using, asp.net MVC would be relegated to the dust bin for the same reason Rails was; it's at the bottom of his performance graph.
This is nothing but link baiting. At the end of the post he adds the qualifier.
>If you already know how to get things done in Rails, you’re in a hurry, you don’t need to maintain what you’re building, and performance is not a concern, it might be a good choice.
This is exactly why Rails will remain as the goto framework for many developers.
Time to market and developer productivity matter so much more than capital expenses that it is ridiculous. I could buy another server in the time it takes to pay a developer for a day's work. Not to mention, not all the features of a web application need to serve more than 2500 requests per second. The ones that do can be refactored into a web service with higher throughput, or designed to be scaled separately from the rest of the features. It doesn't make sense to daisy-tank (to pick daisies with a tank) every single feature to support throughput it doesn't need, at the expense of developer time. Moreover, until you have analytics showing what your users actually use, deciding what to optimize is pure speculation. Bad science. The best way to get those analytics is to be live, and the best way to be live is to have a built product.
Benchmarks of Linux on a physical machine rather than a virtual machine are not representative of performance for (gawd help me) cloud-centric deployments. These benchmarks assume the maximum benefit from compiler optimizations that would not be available on a virtualized machine.
I'm not sure what compilers check beyond syntactic errors. Those aren't any slower to fix in a dynamic language. I'd recommend checking out Sandi Metz on testing.
The rest is just language preference. And my preference is Ruby. Python is fine with me too. JavaScript makes me a sad panda, but it runs in all the browsers and it's a lot better with the magic of CoffeeScript.
Recently I spent around an hour writing a small image-resizing server in Ruby/Rails and in Go (a fixed amount of time in each). I have several years of Rails experience and a year of Go experience. The Go server code was significantly more verbose, had far fewer features, and had worse error handling given the amount of time available. I realize it is not the best comparison, given that I am more experienced with Rails, but I believe it would not be that much different even given a few more years of Go experience. Yes, the speed of Ruby and Rails can be frustrating, but I find it is often not needed, and when it is you can rethink that piece or do it in a language that is more suited to the task. For development productivity I have yet to find a combination of language/framework that lets me write maintainable code in as short a time (having worked over the years with frameworks in C/PHP/.NET/Java etc.).
It's worse than that. He links to a set of benchmarks to demonstrate that rails is slow without even considering the caching utilities that rails provides.
Who bases their sole criterion for a web framework on speed, and then ignores the baked-in caching functionality that a framework provides?
Rails is pretty awesome for building monolithic, CMS systems and other CRUD type apps.
It's not awesome for a lot of other things.
I don't really get the point of this article and the author seems to be contradicting himself all over the place. Like, how can you possibly talk about the pitfalls of dynamic typing and then a paragraph later tell people to use JavaScript or Python? WTF?
Most problems at scale have to do with I/O, which Go isn't going to help you with. Your data stores are gonna be your pain points.
And oh boy, talk about a mentality of premature optimization... seriously folks, worry more about making something that someone is gonna want to use than how many requests per second the thing can serve. Right now your product has zero requests per second so you could just fulfill them by hand if you had to.
FWIW, my biggest issue with Rails (and much of what's produced by the Ruby community) is magic: I find the amount of stuff that's implicit, either depending on naming conventions, or automatic inclusion, or overriding default language behavior, or whatever, to be absolutely maddening. And as magic, it's mostly ungreppable and ungoogleable. Trying to figure out what exactly a given line of code does can take way too long. If I was writing an app from scratch as the only developer, I suppose this would be ok, but trying to maintain an existing app, or working as a team, this is a major problem.
The (lack of) speed in my local development environment has been pretty annoying as well.
"If you already know how to get things done in Rails, you’re in a hurry, you don’t need to maintain what you’re building, and performance is not a concern, it might be a good choice. Otherwise, never."
Take the first line and the last line and it sounds like Rails works perfectly in an intra-nettish application as a front end to lots of data in a database. I have a lot of experience with that. It does work really well... for a while.
Needless to say only having a couple hundred theoretical possible users means that worrying about handling 400 reqs/second doesn't come up as an issue very much.
His line about Rails maintenance is dead on, and the source of much internal push to run (not walk) away from Rails. Push something out to users on Rails 1.1 or whatever from 2007 and it won't run in 2013. A Rails app needs constant, continuous rewriting just so an apt-get upgrade won't kill it. You can't just deploy and walk away like with a Perl CGI script. Even if absolutely everything except Rails stays the same, you can't just walk away and expect it to keep working.
Building something on Rails isn't a capital investment where you lean back and productivity/money pours out of it. On that continuum, it's on the far edge of continuous labor required: more like a million dudes building the pyramids by hand than one dude building a crane.
I'm not a Rails fan. I've overseen the development of about a dozen medium sized Ruby web apps in my career and Sinatra gets us up and going much faster. Also, the relative lack of magic compared to Rails makes uptake for new team members 10x quicker.
With that stated, this article is pretty weak on substance for such a provocative title. I am tempted to flag it as sensational link-bait but Rails discussion is about as on-topic as it gets for HN so I'll resist.
Anyway, the article fails to demonstrate "Why Nobody Should Use Rails". The concurrent requests issue is a valid criticism but that's about the only substantive claim in the article. People should use Rails when it's the right tool for the job. In my experience, Rails shines in situations when there isn't a strong architectural lead managing the project. Rails is very opinionated so it forces disparate developers into a more cohesive application structure where you'd normally end up with a spaghetti-code special.
*edit: I was unaware of this but comments suggest Rails has supported concurrency for a long while, it was just off by default.
This assertion is, categorically and empirically, ridiculous. Yes, the runtime and framework are slow compared to something like go and Revel, but that is not the principal benefit of Rails. Rails' principal benefit is taking care of Maslow's Hierarchy of Needs, or whatever its analogue is for developers. The most recent Rails app I wrote has average render times of 7ms (excluding network latency) on dedicated hardware largely because most views can be rendered with just 3 memcached hits. What's more, implementing the nested caching strategy necessary to achieve this was pretty simple with the out-of-the-box capabilities of Rails 4. If I'm busy worrying about lower-level concerns it doesn't matter if my runtime and framework are blazing fast - I don't have the confidence or time to implement strategies like this, or at least not with nearly as little effort. The proof is in the pudding. Anecdotes to the contrary are welcome.
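A minimal stand-in (not Rails itself) can illustrate the nested "Russian doll" caching strategy described above. In a real Rails 4 app this would be the `cache` view helper backed by memcached; the toy cache below just shows why editing one record only re-renders the fragments whose keys changed:

```ruby
# Toy fetch-or-compute cache illustrating nested fragment caching:
# a version bump on one record invalidates only the outer list fragment
# and that record's own fragment, while siblings stay cached.
class ToyCache
  attr_reader :misses

  def initialize
    @store  = {}
    @misses = 0
  end

  def fetch(key)
    @store.fetch(key) do
      @misses += 1
      @store[key] = yield
    end
  end
end

cache = ToyCache.new
posts = [{ id: 1, v: 1, body: "hello" }, { id: 2, v: 1, body: "world" }]

# Outer key covers the whole list; inner keys cover individual posts.
render_list = lambda do
  outer_key = ["posts", posts.map { |p| [p[:id], p[:v]] }]
  cache.fetch(outer_key) do
    posts.map { |p| cache.fetch(["post", p[:id], p[:v]]) { "<li>#{p[:body]}</li>" } }.join
  end
end

render_list.call               # cold render: 3 misses (list + both posts)
posts[0][:v]    = 2            # "touch" post 1: its cache key changes
posts[0][:body] = "hi"
second = render_list.call      # only the list and post 1 re-render
puts cache.misses              # 5, not 6: post 2 came straight from cache
puts second                    # <li>hi</li><li>world</li>
```

This is the mechanism behind render times like the 7ms figure quoted: most of a view comes straight out of the cache, and the framework's key conventions do the invalidation bookkeeping for you.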
VLM: I'm sure when Rails 50 ships, people will still be claiming it can't multithread.
mbesto:
Facebook stopped growing because it chose to use PHP.
Pinterest stopped growing because it chose to use Django.
Tumblr stopped growing because it chose to use PHP.
Instagram stopped growing because it chose to use Django.
StackOverflow stopped growing because it chose to use .NET.
37Signals stopped growing because it chose to use Rails.
Groupon stopped growing because it chose to use Rails.
</sarcasm>
Can we end this debate already?
nivertech:
And I think ASP.NET has good performance compared to Rails (not sure if they used C# or VB; C# is obviously not a dynamic language).
The other one, 37Signals, are obviously experts in Rails, so they knew how to scale it. And even they took external funding later.
I think that dynamic languages like Erlang/Elixir, Clojure (and maybe Go) offer a good performance/productivity ratio.
Anyway, I think that Rails is OK for paid B2B web applications, not so much for free consumer stuff.
renownedmedia:
</troll>
[+] [-] fleitz|12 years ago|reply
It's not designed for low-profit per request applications. Recent benchmarks suggest about 400 req/sec on a $10 digiocean server suggesting if you max your machine 24/7 that you'd need to make at least 1 cent per million requests, if your profit margin is below this then rails would not be a good solution.
[+] [-] joevandyk|12 years ago|reply
Maybe possible if you cache everything, but that can be quite complex.
[+] [-] lmm|12 years ago|reply
[+] [-] gphil|12 years ago|reply
When I look into other ecosystems, the amount of library support for building modern web applications just doesn't compare. I think it's this aspect of Rails that really helps get from zero to viable product so quickly.
That said, the performance issues do start to take a toll after a while. And forget it if you want to do something counter to the way Rails wants you to do it.
Everything is a trade-off.
facorreia|12 years ago|reply
For reasons like this, and I would add, scary security history, I decided against using Rails.
I've experimented with Django and lightweight Python frameworks, Node.js and Java 6 EE, among others.
What has worked best for me was C# + ASP.NET MVC. The C# language has several features that I find appealing and lead to clean, efficient code (such as dynamic, lambda, LINQ and async, among others). The modern incarnation of ASP.NET running under IIS is quite efficient and productive both in development, profiling & diagnostics, and in production.
guelo|12 years ago|reply
kevingadd|12 years ago|reply
ASP.net is showing its age but there are some solid libraries, reasonably comprehensive documentation, and some nice new frameworks like MVC out there to use (I didn't even use a framework, just wrote 500 lines of wrapper logic so I could expose some REST services that handled JSON input/output and then built a connection pool for my Redis connections). If you know C# (or another .NET language) well enough, you can lean heavily on the type system and write automated tests for everything else and get all your errors/red tests displayed to you in your IDE. I hit very few runtime errors while building the services.
It's also pleasantly surprising how easy ASP.net deploys seem to be: rsync your app folder to the web server; the app.config and bin/ folders inside ensure that all your dependencies and configuration move over to the target machine. Mono's ASP.net server seems to support almost everything Microsoft's does, so I think I only ran into one behavioral difference (and it was documented, albeit a little hard to find).
ASP.net also has some of those same code reuse benefits you get with node.js, just in a different direction. Now if you have any native C# applications or libraries for doing interesting things, you can expose them as a web service trivially. JSIL's online sandbox (http://jsil.org/try) is literally just the compiler libraries deployed to a linux box with a 100-line shim over them that does compilation and caching and error reporting.
lmm|12 years ago|reply
huntedsnark|12 years ago|reply
The thing is, by the criteria and benchmarks the author of this post is using, asp.net MVC would be relegated to the dust bin for the same reason Rails was; it's at the bottom of his performance graph.
jsnk|12 years ago|reply
>If you already know how to get things done in Rails, you’re in a hurry, you don’t need to maintain what you’re building, and performance is not a concern, it might be a good choice.
This is exactly why Rails will remain as the goto framework for many developers.
patrickwiseman|12 years ago|reply
The benchmarks of Linux on a physical machine rather than a virtual machine are not representative of performance for (gawd help me) cloud-centric deployments. These benchmarks assume the maximum benefit from compiler optimizations that would not be available on a virtualized machine.
I'm not sure what compilers check beyond syntactic errors. Those aren't any slower to fix in a dynamic language. I'd recommend checking out Sandi Metz on testing.
The rest is just language preference. And my preference is ruby. Python is fine with me too. Javascript makes me a sad panda, but it runs in all the browsers and it's a lot better with the magic of coffeescript.
conorh|12 years ago|reply
degobah|12 years ago|reply
1. Rails is slow
2. Dynamic languages are bad
unknown|12 years ago|reply
[deleted]
baddox|12 years ago|reply
everettForth|12 years ago|reply
Who bases their sole criterion for a web framework on speed, and then ignores the baked-in caching functionality that a framework provides?
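The caching the parent is referring to is Rails' fetch-style cache API (`Rails.cache.fetch`). As a rough illustration of the pattern - this is a plain-Ruby stand-in for the real thing, not Rails code - a fetch either returns the stored value or runs the block once and stores the result:

```ruby
# Minimal sketch of fetch-style caching in the spirit of
# Rails.cache.fetch; a toy stand-in, not the Rails implementation.
class TinyCache
  def initialize
    @store = {}
  end

  # Return the cached value for key, or compute it via the block,
  # store it, and return it.
  def fetch(key)
    @store.fetch(key) { @store[key] = yield }
  end
end

cache = TinyCache.new
slow_calls = 0
expensive = -> { slow_calls += 1; "rendered fragment" }

cache.fetch("sidebar") { expensive.call }
cache.fetch("sidebar") { expensive.call }  # served from the cache
puts slow_calls  # the expensive render ran only once
```

With a cache in front of the slow path, the framework's raw req/sec number stops being the whole story.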
cellis|12 years ago|reply
http://www.techempower.com/benchmarks/
I'm building a site in Rails (it will never need to be high throughput, and if it does, I'll be rich as fuck), so this is interesting info to have.
williamcotton|12 years ago|reply
It's not awesome for a lot of other things.
I don't really get the point of this article and the author seems to be contradicting himself all over the place. Like, how can you possibly talk about the pitfalls of dynamic typing and then a paragraph later tell people to use JavaScript or Python? WTF?
Most problems at scale have to do with I/O, which Go isn't going to help you with. Your data stores are gonna be your pain points.
And oh boy, talk about a mentality of premature optimization... seriously folks, worry more about making something that someone is gonna want to use than how many requests per second the thing can serve. Right now your product has zero requests per second so you could just fulfill them by hand if you had to.
dnr|12 years ago|reply
FWIW, my biggest issue with Rails (and much of what's produced by the Ruby community) is magic: I find the amount of stuff that's implicit, either depending on naming conventions, or automatic inclusion, or overriding default language behavior, or whatever, to be absolutely maddening. And as magic, it's mostly ungreppable and ungoogleable. Trying to figure out what exactly a given line of code does can take way too long. If I was writing an app from scratch as the only developer, I suppose this would be ok, but trying to maintain an existing app, or working as a team, this is a major problem.
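The kind of convention-driven metaprogramming being described looks something like this toy sketch, written in the style of ActiveRecord's old dynamic finders (`find_by_name`, `find_by_email`, ...) - the class and data here are made up for illustration:

```ruby
# Toy version of convention-based "magic": no method named
# find_by_name appears anywhere in the source, which is exactly
# what makes it ungreppable and ungoogleable.
class Record
  RECORDS = [{ name: "alice", email: "a@example.com" },
             { name: "bob",   email: "b@example.com" }]

  # Intercept calls like find_by_name("alice") and turn the
  # method name itself into a query.
  def self.method_missing(name, *args)
    if name.to_s.start_with?("find_by_")
      attr = name.to_s.sub("find_by_", "").to_sym
      RECORDS.find { |r| r[attr] == args.first }
    else
      super
    end
  end

  def self.respond_to_missing?(name, include_private = false)
    name.to_s.start_with?("find_by_") || super
  end
end

p Record.find_by_name("alice")  # works, yet is defined nowhere
```

Grepping the codebase for `find_by_name` turns up only the call site, never a definition - which is the maintenance complaint in a nutshell.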
The (lack of) speed in my local development environment has been pretty annoying as well.
VLM|12 years ago|reply
"If you already know how to get things done in Rails, you’re in a hurry, you don’t need to maintain what you’re building, and performance is not a concern, it might be a good choice. Otherwise, never."
Take the first line and the last line and it sounds like Rails works perfectly in an intra-nettish application as a front end to lots of data in a database. I have a lot of experience with that. It does work really well... for a while.
Needless to say only having a couple hundred theoretical possible users means that worrying about handling 400 reqs/second doesn't come up as an issue very much.
His line about Rails maintenance is dead on and the source of much internal push to run (not walk) away from Rails. Push something out to users on Rails 1.1 or whatever from 2007 and it won't run in 2013. A Rails app needs constant continuous rewriting just so an apt-get upgrade won't kill it. You can't just deploy and walk away like a Perl CGI script. Even if absolutely everything except Rails stays the same, you can't just walk away and expect it to keep working.
Building something on Rails isn't a capital investment where you lean back and productivity/money pours out of it. On that continuum, it's on the far edge of continuous labor required. More like a million dudes building the pyramids by hand than like one dude building a crane.
vectorpush|12 years ago|reply
With that stated, this article is pretty weak on substance for such a provocative title. I am tempted to flag it as sensational link-bait but Rails discussion is about as on-topic as it gets for HN so I'll resist.
Anyway, the article fails to demonstrate "Why Nobody Should Use Rails". The concurrent requests issue is a valid criticism but that's about the only substantive claim in the article. People should use Rails when it's the right tool for the job. In my experience, Rails shines in situations when there isn't a strong architectural lead managing the project. Rails is very opinionated so it forces disparate developers into a more cohesive application structure where you'd normally end up with a spaghetti-code special.
*edit: I was unaware of this but comments suggest Rails has supported concurrency for a long while, it was just off by default.
SiliconAlley|12 years ago|reply