workhere-io | 11 years ago

> Don't think you are gaining any SEO benefit from one-page, JS-only applications just because Google made it possible for you to start ranking.

No one is expecting to get any SEO benefits that "normal" pages don't have. We are expecting to get the same chance of ranking as normal pages.

You mentioned that single page apps might rank differently or worse than normal pages. Do you have any source for that? (A source that is current, since Googlebot's improvements are quite new).


blauwbilgorgel | 11 years ago

>We are expecting to get the same chance of ranking as normal pages.

Then you should probably adjust this expectation. You say in your article:

>While having this sort of HTML fallback was technically possible, it added a lot of extra work to public-facing single page apps, to the point where many developers dropped the idea...

A JS-driven site with an HTML fallback is a normal page. Then you don't need any tricks, and you don't need to force Google to run your application and hope it makes pages out of it. Start with the fallback and enhance.

This is a serious mistake with consequences. The Tor Browser Bundle and Firefox ship with JavaScript enabled because disabling JS broke too much of the current web. Relying on JS alone causes accessibility issues (remember when Twitter switched to hash-bang URLs?), if not for Googlebot, then for regular users. From the Webmaster Guidelines:

>Following these guidelines will help Google find, index, and rank your site.

>Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.

>Make pages primarily for users, not for search engines.

I am still going on the assumption that you created a one-page application without a time-consuming fallback, and that you rely on Google to make rankable pages from it. That leaves some users out in the cold, so why should it deserve to rank equally with a user-friendly, accessible web page?
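For context on the hash-bang URLs mentioned above: Google's old AJAX-crawling scheme had crawlers rewrite a `#!` URL into an `_escaped_fragment_` query parameter, and the server was expected to answer that rewritten URL with an HTML snapshot. A rough sketch of the mapping (the exact percent-encoding rules in the spec are more detailed than this):

```javascript
// Approximate the crawler-side rewrite from the AJAX-crawling scheme:
// http://example.com/page#!/state  ->  http://example.com/page?_escaped_fragment_=...
function toEscapedFragment(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url;                            // not a hash-bang URL
  const base = url.slice(0, i);
  const state = encodeURIComponent(url.slice(i + 2));  // percent-encode the fragment
  const sep = base.includes('?') ? '&' : '?';          // respect an existing query string
  return base + sep + '_escaped_fragment_=' + state;
}

console.log(toEscapedFragment('http://example.com/page#!/users/1'));
// → http://example.com/page?_escaped_fragment_=%2Fusers%2F1
```

Note that this scheme never removed the need for a server-side HTML snapshot; it only standardized which URL the snapshot had to live at, which is why it was extra work rather than a free pass.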

> ... single page apps might rank differently or worse than normal pages ...

From the original article, the most current source on this:

> Sometimes things don't go perfectly during rendering, which may negatively impact search results for your site.

> It's always a good idea to have your site degrade gracefully. This will help users enjoy your content even if their browser doesn't have compatible JavaScript implementations. It will also help visitors with JavaScript disabled or off, as well as search engines that can't execute JavaScript yet.

> Sometimes the JavaScript may be too complex or arcane for us to execute, in which case we can’t render the page fully and accurately.

> Some JavaScript removes content from the page rather than adding, which prevents us from indexing the content.

In the SEO community, Googlebot's improvements have been noted for a while now. See for example: http://ipullrank.com/googlebot-is-chrome/

Single-page websites, or applications-as-content-websites, are not popular among SEOs. One reason is that they don't allow fine-grained control over keyword targeting or over keeping the site canonical, and they can waste domain authority when you have fewer targeted pages in the index than you could rank for. Experiment and find out for yourself.