Search engine crawlers are increasing my system load

Since a search engine like Google needs to parse your website to determine what to index, a site with a lot of data can place a high load on your system if the crawl is done in a short amount of time.

By creating a robots.txt file in your public_html folder, you can ask well-behaved crawlers to slow down.

A sample robots.txt might look like this:

User-agent: *
Crawl-delay: 300

This tells all crawlers to wait 300 seconds between successive requests.

Without it, a crawler might make multiple requests per second, increasing your system load.
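
Robots.txt directives are grouped by User-agent, so if your access logs show that one particular crawler is causing the load, you can throttle it by name and also keep all crawlers out of resource-heavy areas. A minimal sketch; the bot name and paths below are examples only, so substitute the User-agent strings and directories you actually see in your logs:

User-agent: bingbot
Crawl-delay: 10

User-agent: *
Crawl-delay: 300
Disallow: /cgi-bin/
Disallow: /search/

Note that Crawl-delay is a non-standard directive: many crawlers honour it, but Googlebot ignores it, so it will not slow Google's crawler down.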
