Google's "Big Daddy" Purportedly Causing Havoc With Page Rankings
Bigdaddy, Google's new data center, isn't news to most webmasters; both Search Engine Watch and Webmaster World's forums have discussed the technology since late 2005, and even Google's own Chief Search Engineer Matt Cutts has blogged about the topic extensively. Even the reason behind the name (one of the Google staff's kids calls him Big Daddy at home) is out in the open.
How Bigdaddy will affect page rankings within Google, however, still remains to be seen, although there is a lot of speculation floating around the Net. Mr. Cutts' blog states that the new foundation will improve canonicalization, the process by which a search engine recognizes that
www.domain.com
domain.com
domain.com/index.html
www.domain.com/index.html
are all the same web site. It is also reported that the data center will improve the handling of 302 redirects, which have been a known issue for some time.
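To make the canonicalization idea concrete, here is a minimal sketch in Python (not Google's code; the normalization rules are assumptions for illustration only) showing how the four URL variants above can be collapsed to a single canonical form:

```python
# Minimal sketch of URL canonicalization: collapse the variants listed above
# to one canonical form. The rules (www. prefix and /index.html treated as
# equivalent to the bare domain root) are illustrative assumptions.
from urllib.parse import urlparse

def canonicalize(url: str) -> str:
    """Reduce common URL variants to one canonical form."""
    if "://" not in url:
        url = "http://" + url                # handle bare "domain.com" entries
    parsed = urlparse(url)
    host = parsed.netloc.lower()
    if host.startswith("www."):
        host = host[4:]                      # treat www and non-www alike
    path = parsed.path
    if path in ("", "/", "/index.html"):
        path = "/"                           # default documents map to the root
    return f"http://{host}{path}"

variants = [
    "www.domain.com",
    "domain.com",
    "domain.com/index.html",
    "www.domain.com/index.html",
]
# All four variants collapse to "http://domain.com/"
assert len({canonicalize(v) for v in variants}) == 1
```

In practice, a webmaster can help this along by serving permanent 301 redirects from the non-preferred variants to the preferred one, rather than temporary 302s, so the engine consolidates the duplicates itself.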
What wasn't anticipated with the update, though, was the chaotic and oftentimes strange behavior the search engine has displayed, most notably over the past several months. Although some previously indexed sites dropping out of the engine may be explained by Google's reported lack of server space, and others by Google's "different datacenters get different data at different times" statement, many of the problems seem surreal and without explanation.
For instance, Webmaster World's forums have reported large SERP drops, changes in supplemental result handling, "home page only" results, and pages dropping right out of the Bigdaddy index, while the Digital Point forums are asking whether Google has cleaned out its index and why the supplemental problem is recurring.
What little information is available on the subject comes from webmasters sharing it with one another; little explanation is coming from Google itself right now. Some experts have shown that the supplemental deletions across the data centers have been gradual over the past several months, suggesting that supplemental results may be being deleted to free up server space. Others have noticed the inclusion of longer URLs with multiple variables, such as database-driven pages (which were not previously indexed), and product-based sites like BizRate and Amazon ranking higher in search results than before.
Right now, the only "fixes" seem to be contacting a member of the Google team (most notably by posting in Mr. Cutts' blog), ensuring your website doesn't run afoul of Google's "too similar" guideline (sites with slightly different page text appear to be doing better than most), or submitting a re-inclusion request on Google's site. Without more guidance from Google, though, there isn't much webmasters can actively do at the moment, other than sit back and watch Bigdaddy work out its search-retrieval bugs, talk the situation over with other webmasters, and stay as informed on the subject as possible.
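As a rough illustration of the "too similar" concern, here is a hypothetical Python sketch that compares two pages' text for near-duplication; the word-shingle approach and the idea of flagging scores close to 1.0 are assumptions for illustration, not a published Google rule:

```python
# Hypothetical near-duplicate check: compare two pages' text using
# word-shingle Jaccard similarity. Shingle size of 3 is an arbitrary choice.
def shingles(text, size=3):
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(max(1, len(words) - size + 1))}

def similarity(page_a, page_b):
    a, b = shingles(page_a), shingles(page_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

page_a = "Buy blue widgets online at the best prices today"
page_b = "Buy red widgets online at the best prices today"
# Roughly 0.56 here; pages scoring close to 1.0 share nearly all of their
# text and may be worth rewriting so each page reads distinctly.
print(similarity(page_a, page_b))
```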