I can't be the only one getting tired of sites like this making the front page. Really? It would be one thing to see someone write a legitimate article on why they think the move to JS frameworks is harmful and/or the benefits of using plain JS, and for that to make it to the front page. I'd be interested in that perspective. But this is just somebody being snide. It's the internet equivalent of the kid on The Simpsons who points and goes "ha-ha!" There's no content. We can't have a discussion around a smartass joke like this.
Reading the example code, I learned enough to become interested in vanilla JavaScript. Although it's satirical in tone, I found it insightful. It led me to question the necessity of using jQuery for every project.
You're not the only one. The fact that the owner has ads on the site suggests someone who wants to make a quick and easy buck by trolling the community.
Fast and lightweight, yes. But vanilla-js is certainly not cross-platform ;) You see, that's actually one of the biggest problems with JS and one of the main reasons why people use things like jQuery (apart from the pretty API...).
If you're only targeting modern browsers (latest Chrome/FF/Opera/Safari, IE10), is this still true? I was under the impression the recent browsers were standards-compliant enough that you could use vanilla JS.
Of course, most developers still have to target older browsers, but for a private Web app where you know your target audience, this shouldn't be a huge issue.
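The usual middle ground between the two positions is feature detection: take the standard path in modern browsers and fall back only where needed. A minimal sketch (the `on` helper name is made up; the fallback shown is old IE's attachEvent, purely illustrative):

```javascript
// Minimal cross-browser event helper; a sketch, not production code.
// Every modern browser takes the addEventListener path.
function on(el, type, handler) {
  if (el.addEventListener) {
    el.addEventListener(type, handler, false);
  } else if (el.attachEvent) {
    // Old IE (<9) fallback; note the "on" prefix and no capture phase.
    el.attachEvent("on" + type, handler);
  }
}
```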
This site could actually be useful if the little checklist of "features" that "generates" the download actually produced a real downloadable file containing cross-platform, native JavaScript examples of those bits of functionality.
If you really think more people should be using plain vanilla JavaScript (and in a lot of places, I think this is actually true and even when a framework is needed, it's good to have the underlying skill) then the way to get them to do that is to educate them on it, not patronize.
Raw JavaScript is simply not an option. The API is awful, the default types are not powerful enough, and cross-platform consistency is a joke. For every decently sized project you start in JS, you have to reinvent a thousand wheels to even get rolling, so you're better off just leveraging a framework.
Recently, I was interviewing a front-end developer candidate and I gave them a simple JavaScript problem. The sad thing was, they didn't recognize `document.getElementById()` and didn't know any of the parameters to `addEventListener()`.
We finished the exercise assuming the example used jQuery instead.
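For reference, the vanilla pattern the exercise assumed looks something like this. A hedged sketch: the id "save" and the handler body are made up, and it's written as a function taking a document so it stands alone.

```javascript
// Look up an element by id and attach a click handler — the two calls
// the candidate didn't recognize.
function wireSaveButton(doc) {
  var button = doc.getElementById("save");
  // addEventListener(type, listener, useCapture) — the three parameters
  // in question.
  button.addEventListener("click", function (event) {
    console.log("save clicked");
  }, false);
}
```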
Maybe I'm mistaken, but I think this is more of a "jQuery programmer" problem. jQuery really abstracts away the nitty-gritty of the DOM API, and for beginner/intermediate front-end developers its source is not very readable. Dojo, by comparison, requires that you know what you're doing and that you have pretty extensive knowledge of JS, the DOM, etc. Also, in my opinion, its source code is very clean by comparison.
Was anyone else surprised by the performance penalty of jQuery? Over a 400x hit for getElementsByTagName! I'm curious whether this is due more to cross-browser checks or to determining what type of selector was used. To the source!
> Was anyone else surprised by the performance penalty of jQuery? Over 400x hit for getElementsByTagName!
Not really. Just replacing document.getElementsByTagName by document.querySelectorAll (the native, browser-implemented version of what jQuery does) will generate a 150-200x perf hit depending on the browser.
The reason for that is twofold: first, getElementsByTagName doesn't have to parse the selector, figure out what is asked for, and potentially fall back on an explicit JS implementation (jQuery supports non-native selectors; in fact, I believe you can implement your own). But the parsing overhead should be minor in this precise case.
Second, and the real reason for the difference: getElementsByTagName cheats something fierce. It doesn't return an Array like everybody else, it returns a NodeList. And the NodeList is lazy, it won't do anything before it needs to. Essentially, just calling document.getElementsByTagName is the same as calling an empty function. If you serialize the NodeList to an array (using Array.prototype.slice.call), bam, 150-200x perf hit.
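The serialization step described above is just Array.prototype.slice.call on an array-like object. A small illustration, with a plain object standing in for a NodeList so it runs anywhere:

```javascript
// A NodeList is "array-like": indexed slots and a length, but no Array methods.
// A plain object of the same shape stands in for one here.
var nodeListLike = { 0: "div", 1: "span", 2: "p", length: 3 };

// Serializing with slice.call walks every slot — this is the step that
// forces a lazy NodeList to actually do its work.
var asArray = Array.prototype.slice.call(nodeListLike);

console.log(Array.isArray(asArray)); // true
console.log(asArray.join(","));      // "div,span,p"
```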
I suspect the benchmark is flawed, charging jQuery one-time penalties on every call. I.e., if you make a page that has nothing but a getElementById call, vs. loading jQuery and executing the query on the DOM, it's obviously going to be a lot slower. Even if that's not the case, "fast enough" is fast enough, and jQuery has proven itself fast enough for a lot of stuff.

Also, most developers working in JavaScript and building their own helper libraries are extremely likely to end up with something worse than jQuery. So it's probably smarter to use a library that gets a lot of vetting from really smart people. Whether it's jQuery or something else, I'm not going back to hand-written JavaScript. I'm in the midst of converting an app from handwritten functions to jQuery, and the new version is as fast or faster, while maintenance of the frontend gets vastly simpler with every element converted to jQuery+Bootstrap. Hell, sometimes I'm able to just use markup with no JavaScript on the page at all, and that's like magic.
Question: is the 'Speed Comparison' for real? I find it really, really hard to believe. Surely jQuery & co. revert to native implementations (if they exist, as they do in all modern browsers) for things like `document.getElementById` and don't iterate over the whole DOM.
Depends what you mean by "for real". The document.getElementsByTagName comparison is bullshit: gEBTN is lazy, just calling it basically doesn't do any work, it just allocates a NodeList object and returns.
If you serialize the NodeList to an array, or use document.querySelectorAll instead (which returns a static NodeList, so the work is done up front), you get ~3x between the native version and the slowest libraries, not 400x.
Yeah, Sizzle will grab an element using `document.getElementById()`, but only after checking the nodeType of the context, ensuring the selector is a string, and running the selector through a regular expression.
I just looked at the latest jQuery, and its each statement does not do a native fallback, which really surprised me. Also, keep in mind that a simple $('a.active').each(someFunc) has to run through a lot of jQuery luggage, plus the initial HTTP fetch of the library (which has come under criticism for its size).
I love pure JavaScript. jQuery is overrated and most libraries are only good for one thing or the other, nothing can replace the joy and performance of Vanilla JS!
jQuery is kind of a big download for mobile, so I often skip it for small tasks.
What gets me is when people include jQuery and then further bog things down by loading a lot of plug-ins to do things that could easily be accomplished by adding a few lines of code of their own. Even if you do need and include jQuery, it doesn't mean you have to use it for every piece of JavaScript in your app.
Many times, a plug-in will do a lot more than you need it to do. If your primary goal is just to get rid of the 300ms delay translating tap events to click events, you don't need a library for full gesture support. You need half a dozen lines to listen for touch events.
If you just need to add some client-side persistence for a few basic things in LocalStorage, you probably don't need a plug-in with a complex query syntax.
Cannibalize a library if you need to and pull out the bits you need. You don't have to include the whole kitchen sink.
I'm surprised by the number of comments like this.
jQuery has to parse the selector and figure out that it's of the form "#id". This requires running a regular expression. A lot more is happening.
The whole jQuery call, start to finish, takes 2.85 microseconds in what is presumably a real benchmark, though microbenchmarks like this are hard to interpret and basically meaningless. But yes, if your app needs to do a burst of 350,000 jQuery calls in a tight loop and you're bummed that the whole thing takes a full second, then you should optimize using document.getElementById.
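For a sense of what "parse the selector" means: a selector engine typically runs something like a quick-id regular expression before it can take the getElementById fast path. A simplified sketch (the pattern and function name are illustrative, not Sizzle's actual ones):

```javascript
// Does the selector look like a bare "#id"? If so, return the id so the
// engine can use document.getElementById; otherwise signal the slow path.
var quickId = /^#([\w-]+)$/;
function idFastPath(selector) {
  var match = quickId.exec(selector);
  return match ? match[1] : null;
}

console.log(idFastPath("#main"));      // "main"
console.log(idFastPath("div.active")); // null — needs the full engine
```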
That cross-platform bit is the weakest link. Sigh.
Aside from not being cross-platform, VanillaJS is just plain ugly.
Did you expect your interviewees to have memorized APIs? That's what Google is for; you can look those up when needed.
There's always either Underscore or Backbone or Require or Dojo or Prototype or YUI or jQuery...
Don't really know if this is good or bad!
See http://jsperf.com/vanillajs-by-tag/2 for these two alterations added to the original "vanilla JS" perf cases.
There is a significant overhead to jQuery, but it's ~3x compared to the equivalent DOM behavior, not 400x.
can be written as

jQuery.post('path/to/api', { banana: 'yellow' }, function (data) { alert("Success: " + data); });

much simpler and easier than

var r = new XMLHttpRequest();
r.open("POST", "path/to/api", true);
r.onreadystatechange = function () {
  if (r.readyState != 4 || r.status != 200) return;
  alert("Success: " + r.responseText);
};
r.send("banana=yellow");

Never mind, got the joke. But I think jQuery helps write faster code sometimes.
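For what it's worth, the verbosity gap can be closed with a few lines of your own. A hedged sketch of a jQuery.post-style wrapper (the `post` helper name and the success-only callback are assumptions for illustration):

```javascript
// Tiny POST helper over XMLHttpRequest with a jQuery.post-like shape.
// Illustrative only: no error callback, no timeout handling.
function post(url, data, onSuccess) {
  // Form-encode the data object into "key=value&key=value".
  var pairs = [];
  for (var key in data) {
    pairs.push(encodeURIComponent(key) + "=" + encodeURIComponent(data[key]));
  }
  var r = new XMLHttpRequest();
  r.open("POST", url, true);
  r.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
  r.onreadystatechange = function () {
    if (r.readyState != 4 || r.status != 200) return;
    onSuccess(r.responseText);
  };
  r.send(pairs.join("&"));
}
```

With that in place, the call site is as short as the jQuery version: post('path/to/api', { banana: 'yellow' }, function (data) { alert("Success: " + data); });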
Heh.