
Deprecating our AJAX crawling scheme

144 points | antichaos | 10 years ago | googlewebmastercentral.blogspot.com

77 comments


m0th87|10 years ago

Don't believe the hype. Google has been saying that they can execute javascript for years. Meanwhile, as far as I can see, most non-trivial applications still aren't being crawled successfully, including my company's.

We recently got rid of prerender because of the promise from the last article from google saying the same thing [1]. It didn't work.

1: http://googlewebmastercentral.blogspot.com/2014/05/understan...

thoop|10 years ago

Todd from Prerender.io here. We've seen the same thing with people switching to AngularJS assuming it will work and then coming to us after they had the same issue.

[1] This image is from 2014, when Google previously announced they were crawling JavaScript websites, showing our customer's switch to an AngularJS app in September. Google basically stopped crawling their website when Google was required to execute the JavaScript. Once that customer implemented Prerender.io in October, everything went back to normal.

Another customer recently (June 2015) did a test for their housing website. They tested the use of Prerender.io on a portion of their site against Google rendering the JS of another portion of their site. Here are the results they sent to me:

Suburb A was prerendered, and Google asked for 4,827 page impressions over 9 days.
Suburb B was not prerendered, and Google asked for 188 page impressions over 9 days.

We've actually talked to Google about this issue to see if they could improve their crawl speed for JavaScript websites, since we believe it's a good thing for Google to be able to crawl JavaScript websites correctly. But it looks like any website with a large number of pages still needs to be sceptical about getting all of its pages into Google's index correctly.

1: https://s3.amazonaws.com/prerender-static/gwt_crawl_stats.pn...

mixonic|10 years ago

I noticed our company's Ember.js-based SPA site not being indexed well until I added a sitemap. Then it quickly appeared in the rankings.

Historically Google has been using some fork of Chrome 10 when indexing. I'm unsure what impact that is having on the reliability of app rendering, but I also trust the Google search team has done reasonable checks ensuring common sites and frameworks render correctly.

I strongly suggest using a sitemap for JS rendered sites, based on my own experience.
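The sitemap suggestion above is straightforward to act on for a JS-rendered site: since crawlers may not discover links that only exist after JavaScript runs, you enumerate your routes server-side. A minimal sketch (function and parameter names are illustrative, not from any particular library):

```javascript
// Build a sitemap.xml for a JS-rendered site from a list of known
// routes, so crawlers can discover every page without executing
// any JavaScript.
function buildSitemap(baseUrl, routes) {
  const urls = routes
    .map(route => `  <url><loc>${baseUrl}${route}</loc></url>`)
    .join('\n');
  return '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    urls + '\n</urlset>';
}
```

Serve the result at `/sitemap.xml` and submit it in Webmaster Tools.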

cotillion|10 years ago

So they're actually evaluating all js and css Googlebot is consuming. That's insane.

Can we forget about any new competitors in search engine land now? Not only do you have to match Google in relevance, you'll also have to implement your own BrowserBot just to download the pages.

thephyber|10 years ago

The hints were littered everywhere that they did this.

Google does malware detection. Not on every crawl, but on a certain percentage of crawls. At my old social network site, they detected malware that must have come from ad/tracking networks, because those pages had no UGC. This suggests they were using Windows virtual machines (among others) and very likely real browsers, not just a heavily modified curl/wget or a headless Chrome.

They started crawling the JavaScript-rendered version of the web and AJAX schemes that use URL hashbangs (#!). This was explicit acknowledgement that they were running JavaScript and doing advanced DOM parsing.
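For context, the hashbang AJAX crawling scheme (the one this announcement deprecates) worked via a simple URL rewrite: the crawler turns a `#!` URL into an `_escaped_fragment_` query, and the server answers with a static HTML snapshot. A sketch of that mapping:

```javascript
// Sketch of the (now-deprecated) AJAX crawling scheme's URL rewrite:
// the crawler converts a hashbang URL into an _escaped_fragment_
// query so the server can return a pre-rendered HTML snapshot.
function toEscapedFragment(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url;            // no hashbang: crawl as-is
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2);   // everything after "#!"
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```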

They have always told people that cloaking content (whether keyed on Google crawler IP blocks, user-agent, or other means) is a violation, and they actively punish it. This suggests they do content detection and likely execute JavaScript to detect whether extra scripts change the content of the page for clients that don't appear to be Googlebot.

They have long had measures in place to detect invisible text (eg. white text on white background) or hidden text (where HTML elements are styled over other HTML elements). This suggests both CSS rendering and JS rendering.

sheraz|10 years ago

I don't know that I would go head-to-head with Google in crawling the entire web. However, I do see a lot of opportunities for "vertical search." That is -- search engines focused on specific, niche verticals (travel, healthcare, etc)

I'm working on a couple of projects in vertical search, and it is quite exciting. Sure, I'm building tech that Google had in 2005, but we are surprised with the results. We achieve search relevance simply by curating the sites we crawl (still in the thousands in some cases).

mey|10 years ago

That was my first reaction as well. "We've engineered a competitive advantage, so why don't you throw out that hard work that helps our competitors."

I'm not sure where I sit on this. Developers who want to be noticed by other engines will continue to focus on SEO, but how many engineers care about SEO for anything that isn't Google?

elorant|10 years ago

You can use one of the many headless browsers available. Selenium, phantomjs, phantomjs+casper, webkit, chromium, awesomium, name your poison. All are quite competent in rendering modern web pages. You don’t need to reinvent the wheel.

suneilp|10 years ago

PhantomJS allows you to render a page and fully manipulate or search it. It's a headless WebKit browser you can use from the command line, and it works pretty well. Google is obviously doing the same thing. They even used to show images of what a URL looks like in the search results. They stopped doing that, I suspect, because rendering so many sites uses up a lot of resources.

gdulli|10 years ago

So they're using headless browsers. Why can't anyone do that?

rdoherty|10 years ago

Wow, I built a project that rendered JS built webpages for search engines via NodeJS and PhantomJS. Rendering webpages is extremely CPU intensive, I'm amazed at the amount of processing power Google must have to do this at Internet scale.

I really hope this works, lots of JS libraries expect things like viewport and window size information, I wonder how Google is achieving that.

MichaelApproved|10 years ago

I wonder if they're cutting out a lot of the rendering that PhantomJS is doing. Not to say that any type of rendering is cheap, but I'm guessing they have a limited version of a JS rendering engine that does just enough to index the page.

I bet they'd also skip on all the FB like buttons and other common social media elements that don't impact the content.

Jake232|10 years ago

Can confirm. Launched a project recently with over 500 concurrent PhantomJS workers. Let's just say my hosting bill is significantly more expensive than it was.
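Running hundreds of headless workers is exactly where the cost comes from, and the usual mitigation is a fixed-size pool so only N renders run at once. A hypothetical sketch of that pattern (`renderFn` stands in for an expensive PhantomJS/headless render; names are invented for illustration):

```javascript
// Hypothetical fixed-concurrency render queue: cap how many
// headless-browser renders are in flight at any one time.
async function renderAll(urls, renderFn, concurrency = 4) {
  const results = new Array(urls.length);
  let next = 0;                        // shared index into the queue
  async function worker() {
    while (next < urls.length) {
      const i = next++;                // claim the next URL
      results[i] = await renderFn(urls[i]);
    }
  }
  // start `concurrency` workers draining one shared queue
  await Promise.all(
    Array.from({ length: Math.min(concurrency, urls.length) }, worker));
  return results;
}
```

With 500 URLs and `concurrency = 8`, hosting cost scales with the pool size rather than the crawl size.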

Figs|10 years ago

> lots of JS libraries expect things like viewport and window size information, I wonder how Google is achieving that.

Just plug in common screen parameters (e.g. 1920x1080, 1366x768, ...) and analyze it as if it were the result you'd get by default with Chrome on such a screen, I would imagine.

elorant|10 years ago

Chrome is much lighter than PhantomJS. I use Awesomium, which is a .NET port of Chromium, and it loads pages in half the time Phantom does with much less CPU load. My guess is that Google can refine it even further.

gukov|10 years ago

I'm wondering if Google is somehow, in some way, using the rendering data generated by the Chrome clients and/or Android to offset the processing power it takes to index everything.

juliend2|10 years ago

I think they might mitigate the need to crawl _every_ page of every web site in that fashion. They must be doing some sort of analysis to "old-school-crawl" pages that don't need javascript interpretation.
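One plausible triage (my assumption, not Google's actual method) is to check how much visible text the raw HTML already contains outside of script tags: if there's plenty, an old-school crawl suffices; if the body is an empty app shell, the page likely needs JavaScript execution.

```javascript
// Illustrative triage heuristic (not Google's actual method): pages
// whose raw HTML already carries substantial text outside <script>
// tags can be old-school-crawled; near-empty app shells probably
// need JavaScript execution to produce their content.
function needsJsRendering(html, minTextChars = 200) {
  const withoutScripts = html.replace(/<script[\s\S]*?<\/script>/gi, '');
  const text = withoutScripts.replace(/<[^>]*>/g, ' ')
                             .replace(/\s+/g, ' ').trim();
  return text.length < minTextChars;
}
```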

username223|10 years ago

The viewport and window size is probably just for browser fingerprinting. Google probably just grabs the text, and has a pretty efficient fingerprinting system.

anon1mous|10 years ago

The XHTML+XSLT+XSL-FO stack produced pages that took 3x-10x less CPU to render. But that's dead, of course.

a2tech|10 years ago

This is good. One of my current projects for a customer is entirely AJAX/JS rendered, and we were worried that Googlebot would have a fit with it.

greglindahl|10 years ago

You should still be worried. Just because googlebot expensively evaluates JS for some websites doesn't mean it will evaluate JS for your brand-new website. You might get crawled a lot less deeply than if you had good content in your static pages.

devNoise|10 years ago

About a year ago I wrote a post[1] about how I couldn't get google to index my AngularJS app. My main problem was the interaction between googlebot and the S3 server. I'll have to go back and test if the crawler's behavior will render the correct content.

1 - https://medium.com/@devNoise/seo-fail-figuring-out-why-i-can...

iwilliams|10 years ago

We recently built a site for a customer in Ember and their SEO guys were concerned about indexing. I wasn't sure how it was going to work out, but in the end Google has been able to index every page no problem.

espeed|10 years ago

This was the missing piece for Polymer elements / custom web components. Now that Google has confirmed it's indexing JavaScript, web-component adoption should take off.

tracker1|10 years ago

I want to like polymer/web-components... I just find that it kind of flips around the application controls that redux+react offers. I'm not sure that I like it better in practice.

eurokc98|10 years ago

Gary Illyes @goog said this was happening Q1 this year, and like others mentioned lots of other direct/indirect signals have pointed this way.

http://searchengineland.com/google-may-discontinue-ajax-craw... March 5th: Gary said you may see a blog post at the Google Webmaster Blog as soon as next week announcing the decommissioning of these guidelines.

Pure speculation but interesting... The timing may have something to do with Wix, a Google Domains partner, who is having difficulty with their customer sites being indexed. The support thread shows a lot of talk around "we are following Google's Ajax guidelines so this must be a problem with Google". John Mueller is active in that thread so it's not out of the realm of possibility someone was asked to make a stronger public statement. http://searchengineland.com/google-working-on-fixing-problem...

nostrademons|10 years ago

I'm betting that they finally solved the scalability problems with headless WebKit. Google's been able to index JS since about 2010, but when I left in 2014, you couldn't rely on this for anything but the extreme head of the site distribution because they could only run WebKit/V8 on a limited subset of sites with the resources they had available. Either they got a whole bunch more machines devoted to indexing or they figured out how to speed it up significantly.

nailer|10 years ago

Currently I use prerender.io and this meta tag:

    <meta name="fragment" content="!">

I don't actually use #! URLs (or pushstate, though I might use pushstate in the future), but without both of these Google can't see anything JS-generated, judging by checks in Google Webmaster Tools.

Does this announcement mean I can remove the <meta> tag and stop using prerender.io now?

thoop|10 years ago

If Google Webmaster Tools is unable to render your website correctly, then that's a good indicator that Googlebot won't be able to render the pages correctly either. If you remove the fragment meta tag, then Google will need to render your javascript to see the page. Let us know how that goes if you try it! todd@prerender.io
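For anyone weighing the same removal: the decision a prerender-style middleware typically makes is just "is this request from a crawler?" A sketch of that check (the user-agent list and logic are illustrative, not Prerender.io's exact implementation):

```javascript
// Sketch of a prerender middleware's routing decision: serve a
// static snapshot to crawlers, the normal JS app to everyone else.
// Bot list is illustrative only.
const BOT_AGENTS = ['googlebot', 'bingbot', 'yandex', 'baiduspider'];

function shouldPrerender(userAgent, query) {
  // the old AJAX crawling scheme signals itself via a query param
  if ('_escaped_fragment_' in query) return true;
  const ua = (userAgent || '').toLowerCase();
  return BOT_AGENTS.some(bot => ua.includes(bot));
}
```

Removing the fragment meta tag stops the `_escaped_fragment_` branch from ever firing, so only the user-agent check would route crawlers to snapshots.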

rgbrgb|10 years ago

We have a similar setup and were wondering the same thing (though we use push state). Today we were actually trying to figure out a workaround for 502s and 504s that google crawler was seeing from prerender. We just took the plunge and removed the meta tag because over 99% of our organic search traffic is from google. Fingers crossed!

rcconf|10 years ago

This might be obvious to anyone who has done SEO, but can Googlebot index React/Angular websites accurately? I was always under the impression that the isomorphic aspect of React helped with SEO (not just load times).

vbezhenar|10 years ago

If a modern browser can render your site accurately, then Google can index it.

jwr|10 years ago

Finally. It was obvious we would have to get to that point eventually, it just wasn't clear when.