Crawl Budget

Updated on: 2022-02-17

Name: Crawl Budget

Referred to as: Crawl Budget

Category: On-Page SEO, Technical SEO

Correct Use: There is not really a correct use of Crawl Budget – don’t waste it though!


Crawl budget refers to the number of pages Googlebot or other search engine robots crawl on a website in a given period of time.

Once that budget is exhausted, the crawler stops fetching your site’s content and moves on to other sites. For most smaller sites, crawl budget is not an issue, but larger sites can be throttled by it. An increasing crawl rate is generally a good sign, and in most cases you want Google to visit your site as frequently as possible.

Crawl Budget

Our take:

Crawl Budget is the number of pages Googlebot crawls on a site inside a given time period.

Crawl Rate Limit

The crawl rate limit is determined by page speed, crawl errors, and the crawl limit set in Google Search Console (site owners can optionally cap how fast Googlebot crawls their site).

The crawl rate limit helps Google set the crawl budget for a website. Bing Webmaster Tools gives you direct control over the crawl rate (see below).


  • Googlebot crawls a website.
  • The bot observes these settings and acts accordingly.
  • Finally, Googlebot adjusts its crawl rate.
Bing Webmaster Tools Crawl Settings
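Besides the Webmaster Tools UI, Bing also honors the non-standard Crawl-delay directive in robots.txt (Google ignores it). A minimal sketch, with an assumed delay value you would tune to your own server capacity:

```
# robots.txt — Bing respects Crawl-delay (seconds between requests); Google does not
User-agent: bingbot
Crawl-delay: 5
```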

Crawl Demand

The main considerations for crawl demand are the popularity of the pages on a website and how fresh or stale they are. There is a concept called QDF (Query Deserves Freshness) that can affect search results, and something similar is true for indexing: Google naturally wants to crawl and discover new URLs it does not yet know about. The IndexNow protocol has helped here, allowing near-instant crawling in some cases.
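As a sketch of how an IndexNow submission looks: the protocol accepts a JSON POST with your host, a verification key (also hosted as a text file on your site), and a list of URLs. The host, key, and URLs below are placeholders, not real values:

```python
import json
from urllib import request

def build_indexnow_payload(host, key, urls):
    """Build the JSON body for an IndexNow batch submission.

    The protocol expects the site host, the verification key (also
    hosted as a .txt file on the site), and the list of changed URLs.
    """
    return {"host": host, "key": key, "urlList": list(urls)}

def submit(payload, endpoint="https://api.indexnow.org/indexnow"):
    # POST the payload; participating search engines share submissions.
    req = request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    return request.urlopen(req)

payload = build_indexnow_payload(
    "www.example.com",
    "your-indexnow-key",  # hypothetical key; generate and host your own
    ["https://www.example.com/new-page"],
)
```

Calling `submit(payload)` would then notify the endpoint; a 200/202 response indicates the submission was accepted.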

It is important to monitor crawl status in Google Search Console under Settings -> Crawl Report, where you can see whether Google has had any issues crawling your site and what the crawl stats look like.

Crawl Report Errors
Crawl Report
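Crawl errors can also be cross-checked directly from server access logs. A minimal sketch, assuming common-format log lines (the sample entries are made up): it tallies HTTP status codes for Googlebot hits, since a spike in 4xx/5xx here usually surfaces as crawl errors in Search Console.

```python
from collections import Counter

def googlebot_status_counts(log_lines):
    """Tally HTTP status codes for Googlebot hits in common-format access logs."""
    counts = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        parts = line.split('"')
        # In Common Log Format, the status code is the first field
        # after the quoted request line.
        if len(parts) >= 3 and parts[2].split():
            counts[parts[2].split()[0]] += 1
    return counts

sample = [
    '66.249.66.1 - - [17/Feb/2022:10:00:00 +0000] "GET /page HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [17/Feb/2022:10:00:05 +0000] "GET /old HTTP/1.1" 404 128 "-" "Googlebot/2.1"',
]
status_counts = googlebot_status_counts(sample)
```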

Two factors play a significant role in determining crawl demand:

  • URL value: popular or more valuable pages get crawled more frequently than less valuable ones.
  • Staleness: Google’s algorithms try to avoid stale URLs and favor up-to-date content.

In short, Google uses both the crawl rate limit and crawl demand to determine the number of URLs Googlebot wants to and can crawl.
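As a purely conceptual illustration (not Google’s actual algorithm), the interaction can be thought of as a minimum over the two quantities: the budget cannot exceed what the server tolerates, nor what Google actually wants to fetch.

```python
def effective_crawl_budget(rate_limit_urls, demanded_urls):
    """Conceptual only: crawl budget is bounded both by the crawl rate
    limit (what the server can handle) and by crawl demand (what the
    search engine wants to fetch)."""
    return min(rate_limit_urls, demanded_urls)

# A fast server with little demand still gets few URLs crawled: min(1000, 250) = 250
budget = effective_crawl_budget(1000, 250)
```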

Extra reading


Always be aware of crawl errors. Identify their cause and try to resolve any issues.


Avoid having a lot of low-value URLs on a site, as they eat up crawl budget.
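One common way to keep crawlers away from low-value URLs (such as internal search results or parameter-heavy faceted navigation) is robots.txt. The paths below are hypothetical examples, not a recommendation to copy verbatim:

```
# robots.txt — keep crawlers away from low-value, parameter-heavy URLs
# (example paths; adapt to your own site structure)
User-agent: *
Disallow: /search
Disallow: /*?sort=
Disallow: /*?sessionid=
```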


Google also considers the stability of every website, so it is important to choose a stable server and a solid hosting solution. Keep in mind that Googlebot will not continually crawl a site that crashes constantly.




What Google Says: