So, how does this Googlebot actually work?
Rolf Broer, who works at Onetomarket, just released a very interesting article about Googlebot and its crawling capabilities. Rolf ran several tests on Google's so-called superior bot to see how it reacts to particular setups of links and pages. The biggest takeaway is that you can use a Google sitemap to boost the crawl rate. But there is more...
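For context, a Google sitemap is just an XML file listing the URLs you want crawled, optionally with hints like change frequency and priority. A minimal sketch, using a hypothetical example.com site (the URLs and values here are illustrative, not from Rolf's tests):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Googlebot to discover -->
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/articles/</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

You can point Google at the file by submitting it through Google's webmaster tools, or by adding a `Sitemap:` line to your robots.txt.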
Google's guideline that you shouldn't use more than 100 links per page doesn't hold up: the tests showed that Googlebot will crawl well over 100 links on a page if it has to. Matt Cutts' statement that "the amount of pages being crawled is roughly proportional to your PageRank" also seems to be a tad off course. In 31 days, Googlebot visited about 375,000 pages on a PageRank 0 website. Rolf extrapolates that if the website had had a PageRank of 1, Googlebot would have crawled over 140,000,000,000 pages in 31 days. That simply suggests PageRank doesn't matter at all for getting your pages crawled.