While it's not surprising to me that the pure Java implementation was far and away better, since: 1.) Java is much faster than JavaScript, and 2.) Netty (which Vert.x is based on) had been doing high-performance async IO in extremely demanding environments for years before Node even existed.
What is surprising, however, is that JavaScript on the JVM with Vert.x is faster than V8 with Node. In both cases I would assume not much JS is executed: in Vert.x most of the executing code is still Java, with a tiny piece of JS scripting on top, and in the Node.js case I was under the impression that most of the executing code would be the C code that powers its HTTP server.
Does anyone with knowledge of the Node internals know what's going on here? Is Node's HTTP parser just slower? Is its reactor not as efficient?
I would also be interested in the internals. I'd expect JS + a small C event loop to outperform compiled JVM code + lots of async Java I/O libs (for a micro benchmark, at least).
But there's no question Netty is fast even on a non-benchmark. I'm using it for a long-polling server with upwards of 100K connections, and my CPU doesn't go much over 5%. Besides memory usage, the main limitation has been proper kernel configuration.
V8 does heavy code optimization, and the Node.js HTTP server code should be well optimized as well. And maybe the load balancer for the multicore Node test isn't optimized enough. Thus, these results feel a little shady. Anyway, Vert.x was the trigger for me to finally download the JVM (JDK) to try Vert.x (and maybe later Clojure).
But there's still one major drawback—the non-existent ecosystem. I know the advice is to look for libs from the Java world, but I need a concrete and precise guide on how to do this. Let's say I want to plug in some Java lib for image manipulation. How? And who will guarantee that these libs are concurrency-safe and/or non-blocking? The lib developer or Vert.x? At the moment there is—except for a few out-of-the-box modules—nothing. No 3rd-party libs, no module manager à la npm, and no guide or documentation on how to glue Java libs to this Vert.x thing. Nothing. Correct me if I'm wrong.
The polling loop and I/O interfaces in Vert.x are integrated into the VM, whereas with Node.js you're calling out to libuv/libev/etc. That gives the JVM an advantage. Really, how much actual JavaScript code is V8 JIT'ing at all?
Really, what you are comparing here is the efficiency of the paths between the two JavaScript engines and their polling I/O libraries, more than any JIT work (and let's face it, even if we were comparing JIT work, the Sun JVM has had a lot more time to tune and tweak than V8 has). In Node's case, it's talking through libuv, which means it has to transition in and out of the V8 engine's runtime and into a C runtime. Even if the C code is super zippy, that's costly. For Vert.x, though, all the I/O and polling is integrated into the runtime/VM/JIT. That's kind of nice.
An interesting way to test what I'm suggesting would be to do the same test with unix domain sockets/named pipes as the transport instead of TCP. The JVM doesn't have a native implementation, so it'd have to call out through JNI. I'd wager even odds it ends up slower than TCP on localhost.
A rigorous benchmark must seek to explain the difference in performance, not just hand-wave saying "such-and-such is faster". Otherwise, you have no way of validating (for yourself, let alone demonstrating to others) that it's not a misconfiguration or a flaw in your benchmark.
Node.js is not a fad. It represents the first workable JavaScript-based server with mass appeal. The real story is that JavaScript is here to stay. There have been other server-side JavaScript frameworks in the past, but none of them have taken off like Node.js. If Vert.x wins over the JavaScript crowd, then that's great, because coders will be able to write JavaScript.
I wouldn't dismiss the node.js stack quite so easily. Those "proven" technologies you mention had to go through their own lifecycle of continued improvement.
I remember a time when my colleagues who were steeped in C++ had a good laugh at my expense because I was building server-side web applications with a new framework and a hot, new language. It woefully under-performed similar C++ applications in benchmark tests.
It was 1998, and the language was Java. I could write my applications much faster and in a more maintainable way than they could, but they didn't care. Their technology was proven, and Java was simply a fad.
Not really. It shows that this benchmark is crap (likely benchmarking disk I/O versus disk I/O plus some caching). Read Isaac's comment for more detail; he sums it up pretty well. No profiling info, a custom test, no analysis besides some pretty graphs.
I have a hard time believing the JVM is really 10x faster than v8 for such a simple server.
This is consistent with a benchmark I did a while back comparing Netty vs. Node.js. Not surprising since Netty powers Vert.x. Netty is pretty amazing. 500K concurrent connections and not batting an eye.
According to the code, it actually does a file read on every request -- this is certainly suspicious because some implicit caching may significantly change the results.
With regards to Vert.x, it seems like really cool stuff.
On the blog post about version 1 being released it mentions being able to mix and match several programming languages. Does this mean you can use different libraries written in different languages in the same Vert.x server?
Yes indeed. You can mix and match Java, JavaScript, Ruby and Groovy in the same app.
We hope to support more languages going forward (e.g. Scala, ...)
Yet another meaningless micro benchmark. There is no point in measuring a hello-world HTTP request in one framework vs. another.
There should be a larger application that even remotely resembles some kind of real world usage. Maybe some day we'll have some kind of a "standard" for a web framework benchmark, an application that actually does something so it's worth benchmarking.
Hardly meaningless. The problem is that the larger application benchmarks fall prey to accusations of "I wouldn't write it that way!". Their results are just as hotly disputed.
This micro-benchmark makes sense. It's benchmarking an HTTP server essentially. Any benchmark further than this would really be benchmarking the JVM and V8. While that would be interesting, I think in this case, this micro benchmark is OK as long as you know its limitations.
The reality is that most people's app code and databases are going to bottleneck long before either Vert.x or Node.js does. The main thing this benchmark clears up is that both of them are really, really fast, and that if you serve lots of simple-to-process responses you may want to go with Vert.x.
I agree the "micro" benchmark isn't something people should look to as a definitive answer, but I don't think they should be outright dismissed either. If nothing else, they should be a jumping off point for real testing.
I'd upvote this more than once if I could. The beauty and efficiency of a web framework must be measured as a trade-off between raw speed and the instructions spent checking for corner cases and the like. Anyone could write a framework that executes hello world faster than a popular one if they just assumed, for instance, that no one will ever make a POST request or use a query string. That wouldn't be a very useful framework, but it could definitely serve GET '/' fast.
These micro-benchmarks aren't even a good kicking off point for comparisons, as the things that yield a trivial benchmark win are often the things that yield significant performance troubles at scale.
Very exciting initial results. The JVM is simply the most optimized runtime available right now, and of the languages tested, Java performs best. Can't fail to notice Ruby is the slowest, even on the JVM. If you continue on this path and refine your APIs to be more user-friendly, this could be the next big asynchronous server out there!
I had the impression that "Node.js (readfile)", and possibly "Node.js" meant the blocking call but that "Node.js (streams)" meant he was using something like fs.createReadStream(). But you're right, I don't see that anywhere in the posted source.
I'm interested to know how Vert.x compares to industrial-strength "traditional" servlet containers. My guess is that Vert.x would outperform them under certain conditions, but that, all in all, the servlet containers would scale better.
I believe servlets are still the most scalable web stack out there.
Came here to say this. While microbenchmarks are fun for all, in the real world you would have a completely different reason for choosing node.js that has nothing to do with this kind of performance. So just use the best tool for what is being benchmarked here: nginx.
I'm going to ask a dumb question as someone who is just learning node. What's wrong with serving assets out of public/ in an Express app? Why would someone not want to do this?
So, a quick verification: the I/O is the difference between these two. The JVM is doing some caching somewhere, whereas the V8 engine is not. Making a small change to both (in order to ensure that both are using the exact same logic):
Why are they measuring requests/sec? Any server can accept connections at a high rate but what matters is responding in a timely manner.
I doubt the requests number too. Writing a dummy socket server (evented, threaded, ...) that just returns "HTTP/1.1 200 OK" will not get you anywhere close to 120k requests/sec. The system call becomes the bottleneck.
People in the B community come out in outrage saying that the testing is flawed, that microbenchmarking is useless, that this and that. Rinse and repeat.
andrewvc | 14 years ago
sehugg | 14 years ago
tferris | 14 years ago
cbsmith | 14 years ago
courtneycouch | 14 years ago
Node.js beats Vert.x, for example. People shouldn't be so quick to accept these half-thought-out microbenchmarks.
wh-uws | 14 years ago
I don't think these are speed tests, just the number of connections that can be handled.
jlouis | 14 years ago
I've seen my share of web servers which are very different in their compliance.
jroseattle | 14 years ago
deelowe | 14 years ago
ww520 | 14 years ago
Also, file I/O is heavily cached by the OS. You can bet that one file is read from memory most of the time. Disk I/O is pretty much out of the equation.
jwingy | 14 years ago
exDM69 | 14 years ago
andrewvc | 14 years ago
huggyface | 14 years ago
http://blog.yafla.com/Pet_Store_2011__Metrics_Instead_of_Eva...
alz | 14 years ago
Even with OS caching there is still quite a bit of overhead there. It would be interesting to see the benchmarks run on the corrected code:
https://gist.github.com/2650401
DTrejo | 14 years ago
If you're worried about your programs containing rogue & misbehaving code like this, I recommend you use https://github.com/isaacs/nosync
pron | 14 years ago
courtneycouch | 14 years ago
https://gist.github.com/2652991
and then I get the following results:
vert.x:
39890 Rate: count/sec: 3289.4736842105262 Average rate: 2958.1348708949613
42901 Rate: count/sec: 2656.924609764198 Average rate: 2936.994475653248
45952 Rate: count/sec: 3277.613897082924 Average rate: 2959.610027855153
node.js:
38439 Rate: count/sec: 4603.748766853009 Average rate: 4474.62212856734
41469 Rate: count/sec: 4620.4620462046205 Average rate: 4485.278159589091
44469 Rate: count/sec: 4666.666666666667 Average rate: 4497.515122894601
Making that change so they both store the file in memory, Node.js is about 50% faster than Vert.x.
This is using an m1.small instance on EC2, and both vert.x and nodejs only using a single core.
purplefox | 14 years ago
And artificially crippling Vert.x to a single core does not prove anything. Anybody who cares about performance will be using more than one core.
halayli | 14 years ago
purplefox | 14 years ago
I.e. from request to corresponding response, and how many of those it can do per second.
If you doubt the numbers, please feel free to run them yourselves; all the code is on GitHub.
VeejayRampay | 14 years ago