
Search engines use "crawler robots" to index your web pages and their content. In some cases, the requests from these robots can overwhelm your website. Depending on how your website is built and optimised, and on the resources of the hosting plan you've purchased, a flood of requests could cause the website to temporarily exceed its available CPU and memory, or slow it down significantly for normal visitors.

Many search engines, such as Bing and Yahoo!, support the Robots Exclusion Protocol (REP). As part of this protocol, crawler robots look for a file named robots.txt and follow any instructions within it. Any website owner or web developer can create this file and place it in the main website directory (usually public_html). More detailed information about the robots.txt file can be found at The Web Robots Pages (robotstxt.org).

In the robots.txt file, you can add one simple line to instruct all crawler robots to slow down:

Crawl-delay: 1
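
A Crawl-delay line is normally read as part of a User-agent group, so a complete minimal robots.txt that asks every compliant robot to wait one second between requests could look like this (the one-second value is just an example; increase it if your site needs a longer pause):

User-agent: *
Crawl-delay: 1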

Google Bot

The Googlebot ignores the "Crawl-delay" directive. Instead, it's recommended to register your website with Google Search Console, where you can adjust the crawl rate and other settings.

For more information, please read the official documentation: Change Googlebot crawl rate - Search Console Help

Bing Bot

You can instruct the Bing bot to crawl your website more slowly as follows:

User-agent: bingbot
Crawl-delay: 1

The Bing crawler interprets Crawl-delay values of 1 (slow), 5 (very slow), and 10 (extremely slow).

Alternatively, a website owner can register for Bing Webmaster Tools and manage their website's crawl rate there: Bing Webmaster Tools

Yandex

This search engine can crawl websites quite aggressively and is often responsible for website downtime. Thankfully, you can set a delay value in seconds (for example 2, 4, 6, or 8) so that the robot pauses for that long between each request:

User-agent: Yandex
Crawl-delay: 4

Most Yandex users are in Russia, so if your website does not have an audience in Russia, you could consider blocking the robot altogether, like this:

User-agent: Yandex
Disallow: /
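
For reference, here is one way the examples above could be combined into a single robots.txt. The delay values are illustrative, and you should only include the Yandex block if you have decided to block that robot as described above. Compliant robots follow the most specific matching User-agent group, so bingbot and Yandex apply their own rules rather than the general one:

User-agent: *
Crawl-delay: 1

User-agent: bingbot
Crawl-delay: 5

User-agent: Yandex
Disallow: /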

Updated by SP on 23/11/2022
