So, how does this Googlebot actually work?
Rolf Broer, who works at Onetomarket, just released a very interesting article about Googlebot and its crawling capabilities. Rolf set up several tests for Google's so-called superior bot to see how it reacts to particular setups of links and pages. The biggest takeaway is that you can use a Google sitemap to boost the crawl rate. But there is more...
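For readers who haven't built one before, a sitemap is just an XML file listing your URLs. A minimal sketch of generating one in Python (the article itself shows no code; the `example.com` URLs here are hypothetical):

```python
# Minimal sitemap.xml builder; URL list is purely illustrative.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal sitemap.xml string for the given URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

sitemap = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/page-1",
])
print(sitemap)
```

You would save this output as `sitemap.xml` at the site root and submit it through Google's webmaster tools so Googlebot knows where to find it.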
Google's advice that you shouldn't use more than 100 links per page doesn't hold up. The test showed that Google will crawl way over 100 links if it has to. Also, Matt Cutts's statement that "the amount of pages being crawled is roughly proportional to your PageRank" seems to be a tad off course. In 31 days, Googlebot visited about 375,000 pages on a PageRank 0 website. Rolf calculates that, had the website had a PageRank of 1, a strictly proportional Googlebot would have to crawl over 140,000,000,000 pages in 31 days. That simply means that PageRank doesn't matter at all when it comes to getting your pages crawled.
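To put the quoted figure in perspective, a quick back-of-the-envelope calculation on the numbers above:

```python
# Crawl rate implied by the test: 375,000 pages in 31 days.
pages_crawled = 375_000   # pages Googlebot fetched on the PR0 site
days = 31                 # duration of the test

per_day = pages_crawled / days
per_minute = per_day / (24 * 60)
print(f"{per_day:.0f} pages/day, {per_minute:.1f} pages/minute")
# → 12097 pages/day, 8.4 pages/minute
```

Roughly 12,000 pages a day on a PageRank 0 site is a substantial crawl rate, which is exactly why the proportionality claim looks shaky.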