A few days ago, a member of Hacker News posted that the site was no longer ranking well in Google for its own name (it was missing from the first positions). This was pretty surprising because Hacker News is tremendously popular and reputable. Good thing the site attracts many of the top geeks online…
If you aren’t familiar with Hacker News, it is basically an online community where some of the most serious web professionals recommend and discuss all kinds of issues, links and topics from around the web. It has a very basic structure and a minimal design, which keeps it very simple.
Luckily for Hacker News, this community also includes a few of Google’s top search engineers, like Matt Cutts (head of the webspam team) and Pierre Far (webmaster trends analyst), who tried to figure out what happened to the site’s rankings during the weekend.
So what did happen to the site’s rankings? It hadn’t been penalized, and it hadn’t been Pandalized either… This time, it was the webmaster’s own fault: he mistakenly blocked almost all Googlebot crawlers while trying to optimize the website’s performance. That caused Google to treat the site “as a dead page” (in Matt Cutts’ words).
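The reports don’t spell out exactly how the blocking happened, but the most common way to lock Googlebot out by accident is an overly broad robots.txt rule. Here is a hypothetical example of that kind of mistake (not Hacker News’ actual file), where a rule meant to ease server load during performance work ends up shutting every crawler out of the whole site:

```
# Hypothetical robots.txt illustrating the mistake, not HN's actual file.
# The intent was to reduce crawler load on the server, but "Disallow: /"
# blocks the entire site for all crawlers, Googlebot included.
User-agent: *
Disallow: /
```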
I think there is a big lesson here for all of us: there are times we rush to blame Google for ranking and traffic drops when the responsibility is actually ours. Matt gave an example of misconfiguring the crawl rate in Webmaster Tools, causing Googlebot to crawl pages from the site “every five years” (which is obviously a bad thing).
So the next time (hopefully never) you suffer from a drop in rankings, don’t immediately blame Google. It may be your own fault, so first check and rethink your recent actions: changing settings in Webmaster Tools, defining robots.txt or the robots meta tag differently, or even switching your links to nofollow. All of these can cause problems with the crawling and indexing of your site.
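If you want a quick sanity check after touching robots.txt, a short script can tell you whether Googlebot is still allowed to fetch your key pages. Here is a minimal sketch using Python’s standard urllib.robotparser (the domain and URLs are placeholders, swap in your own site):

```python
# Minimal sketch: verify Googlebot may still crawl key URLs after a
# robots.txt change. Uses only the Python standard library; the domain
# below is a placeholder.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # fetches and parses the live robots.txt

for url in ("https://www.example.com/", "https://www.example.com/item?id=1"):
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url}: {'crawlable' if allowed else 'BLOCKED for Googlebot'}")
```

The same check works for any user agent string, so you can test other crawlers the same way before the rankings ever have a chance to drop.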