vanessafox's comments

vanessafox | 14 years ago | on: How My Popular Site was Banned by Google
vanessafox | 15 years ago | on: Search Still Sucks
(Lowercase "or" just gets used as one of the search terms, which I think is actually treated as a stop word and is then ignored.)
And I'm always a supporter of the more coffee recommendation.
vanessafox | 15 years ago | on: Search Still Sucks
If I recall correctly, Yahoo tested using this data a few years ago and found the signals not to be as useful as others they used at the time.
Of course, Bing and Google have been working to include more social signals in rankings: http://searchengineland.com/what-social-signals-do-google-bi...
vanessafox | 15 years ago | on: Search Still Sucks
site:news.ycombinator.com OR site:stackoverflow.com <query>
I don't know if there's a max # of sites you can add. I tried it with three and it seemed to work great.
http://www.google.com/search?q=site:searchengineland.com+OR+...
(Yes, I did an ego search. Isn't that the best way to know if the results are likely accurate? :)
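If you find yourself building these queries a lot, here's a minimal sketch in Python of assembling that kind of multi-site search URL (the function name and site list are just illustrative, not anything Google provides):

    from urllib.parse import quote_plus

    def multi_site_search_url(query, sites):
        # Scope each site with the site: operator and join them with
        # Google's OR operator (which must be uppercase to work).
        scoped = " OR ".join("site:" + s for s in sites)
        return "https://www.google.com/search?q=" + quote_plus(scoped + " " + query)

    print(multi_site_search_url("vanessa fox", ["news.ycombinator.com", "stackoverflow.com"]))
    # -> https://www.google.com/search?q=site%3Anews.ycombinator.com+OR+site%3Astackoverflow.com+vanessa+fox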
If there's a set of sites you want to search through often, you can set up a custom search engine for what you describe. For instance, see this page:
vanessafox | 15 years ago | on: NYT Exposes J.C. Penney Link Scheme That Causes Plummeting Rankings in Google
My interpretation of how the combined algorithmic and manual efforts played out is this:
-One of Google's paid-link algorithms (possibly a new one, or possibly an existing one that was recently tweaked) flagged some of the links or one of the link networks. This caused those links to no longer count towards PageRank credit (and possibly caused some of the initial ranking drops, such as the ones from position 1 to position 7).
-When Google was alerted to the issue, they took a closer look and on manual inspection found not only additional problematic links but also other spammy issues (if you follow the link in my story to the blog post by the guy who helped NYT with the investigation, you'll see that the SEO firm set up doorway pages and that the jcp pages themselves have keyword stuffing and hidden links on them). Based on that manual review, Google added a manual penalty to the site.
That's why my conclusion is that once they fix the issue, the manual penalty will be removed and they'll rise a bit in ranking position. But since the algorithmic penalty simply (I'm speculating) caused some of the paid links to be devalued, there would be no "lifting" of this penalty.
It is very disheartening that something so vital to business success (understanding how to operate online, building a site with good architecture, engaging with searchers, and solving their problems) is so often equated with these types of tactics.
vanessafox | 15 years ago | on: Broken Links
http://searchengineland.com/google-proposes-to-make-ajax-cra...
http://searchengineland.com/googles-proposal-for-crawling-aj...
http://searchengineland.com/its-official-googles-proposal-fo...
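The gist of the proposal: a URL like example.com/#!page=2 gets fetched by the crawler as example.com/?_escaped_fragment_=page=2, and the server answers that variant with a static HTML snapshot of the AJAX state. A rough sketch of the server side in Python (the render helpers are hypothetical stand-ins for an app's real rendering):

    from urllib.parse import unquote

    def render_html_snapshot(state):
        # Hypothetical: a real app would pre-render the AJAX state server-side.
        return "<html><body>Snapshot for state: " + state + "</body></html>"

    def render_ajax_page():
        # Hypothetical: the normal JavaScript-driven page shell.
        return "<html><body><script src='/app.js'></script></body></html>"

    def handle_request(query_params):
        # Crawler requests arrive with the _escaped_fragment_ parameter;
        # everyone else gets the regular AJAX page.
        fragment = query_params.get("_escaped_fragment_")
        if fragment is not None:
            return render_html_snapshot(unquote(fragment))
        return render_ajax_page()

    print(handle_request({"_escaped_fragment_": "page=2"}))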
Of course, a better solution is some type of progressive enhancement that ensures both that search engines can crawl the URLs and that anyone using a device without JavaScript support can view all of the content and navigate the site.
vanessafox | 15 years ago | on: Google: Bing Is Cheating, Copying Our Search Results
vanessafoxnude.com has been redirecting to my current site for several years now, but back when the original site was active, much of the incoming anchor text was related to Google and search.
In that case, Google Webmaster Tools is not actually reporting an error. That report shows you which URLs Google tried to crawl but couldn't (because they were blocked), so you can review it and make sure you aren't accidentally blocking URLs that you want indexed.
I agree that it's confusing in that the report is in the "crawl errors" section.
(I built Google Webmaster Tools so this confusion is entirely my fault; but I don't work at Google anymore so sadly I can't fix this.)
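For anyone who wants to sanity-check robots.txt blocking outside the report, here's a minimal sketch using Python's standard urllib.robotparser (the domain and path are placeholders):

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the site's robots.txt, then ask whether Googlebot
    # is allowed to crawl a specific URL.
    parser = RobotFileParser("https://example.com/robots.txt")
    parser.read()

    url = "https://example.com/private/page.html"
    if parser.can_fetch("Googlebot", url):
        print(url + " is crawlable")
    else:
        print(url + " is blocked by robots.txt")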