Limit Your Site Crawling With Search Console

Google uses complex algorithms to decide the ideal crawl speed for a site. Google's objective is to crawl as many pages as possible on each visit without overwhelming the server's bandwidth.

If Google is making too many requests per second to your site and slowing down your server, you can limit how fast Google crawls it. The crawl rate you set is the maximum rate Googlebot should use; it does not guarantee that Googlebot will actually reach that rate.

You can limit the crawl rate for root-level sites, for instance www.abcd.com and http://subdomain.abcd.com.

You can't change the crawl rate for sites that are not at the root level, for instance www.abcd.com/folder.

Google advises against restricting the crawl rate unless you are seeing server load problems that are definitely caused by Googlebot hitting your server too hard.
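Before lowering the crawl rate, it helps to confirm that Googlebot is really the source of the load. A minimal sketch of that check, assuming a common-log-format access log with a trailing user-agent field (the sample lines and log format are illustrative, not from any specific server):

```python
from collections import Counter

# Hypothetical access-log lines (common log format with a user-agent field).
sample_log = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /a HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /b HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '5.6.7.8 - - [10/Oct/2023:13:55:36 +0000] "GET /c HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '1.2.3.4 - - [10/Oct/2023:13:55:37 +0000] "GET /d HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
]

def googlebot_hits_per_second(lines):
    """Count requests per timestamp-second whose user agent mentions Googlebot."""
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue
        # The timestamp sits between the first '[' and the first ']'.
        stamp = line[line.index("[") + 1 : line.index("]")]
        hits[stamp] += 1
    return hits

print(googlebot_hits_per_second(sample_log))
```

If the counts per second are low, the load is probably coming from somewhere else, and limiting Googlebot won't help.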

How To Limit The Crawl Rate 

- Open the page https://www.google.com/webmasters/tools/settings

- There you will find two options:

  • Let Google optimize for my site (recommended)
  • Limit Google's maximum crawl rate

- Choose the second option (Limit Google's maximum crawl rate).

- Set the maximum crawl rate as needed. The new crawl rate stays in effect for 90 days.
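The Search Console setting is the way to slow Googlebot specifically, since Googlebot ignores the robots.txt Crawl-delay directive; many other crawlers do honor it, though, and you can inspect what a robots.txt declares with Python's standard library. A short sketch (the robots.txt contents below are an illustrative assumption):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents for illustration.
robots_txt = """
User-agent: *
Crawl-delay: 10
Disallow: /private/
""".strip().splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

# crawl_delay() returns the declared delay in seconds, or None if absent.
print(parser.crawl_delay("*"))              # 10
print(parser.can_fetch("*", "/private/x"))  # False
print(parser.can_fetch("*", "/public"))     # True
```

A well-behaved crawler that respects Crawl-delay would wait 10 seconds between requests here; for Googlebot, the Search Console limit described above is what applies.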
