Hi, I have come across a robots.txt file where:
The company is disallowing the Nutch bot.
They are implementing a crawl delay of 10 seconds for AhrefsSiteAudit.
There is a crawl delay of 10 seconds for MJ12bot.
A crawl delay of 1 second is set for the Pinterest bot.
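Based on that description, the relevant part of the file presumably looks something like the following sketch (the exact user-agent tokens, e.g. Pinterestbot, are my assumption; only the rules listed above come from the original post):

User-agent: Nutch
Disallow: /

User-agent: AhrefsSiteAudit
Crawl-delay: 10

User-agent: MJ12bot
Crawl-delay: 10

User-agent: Pinterestbot
Crawl-delay: 1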
My understanding is that this suggests the company knows these specific bots may be causing server load issues. Following that logic, it raises the question of why they are not applying similar crawl delays to other bots, such as Semrush, considering they have chosen to delay the Ahrefs bot.
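If they wanted to extend the same logic to Semrush, it would presumably just be a matter of adding a matching group (SemrushBot is my assumption for the user-agent token):

User-agent: SemrushBot
Crawl-delay: 10

Worth keeping in mind that Crawl-delay is advisory and only honored by crawlers that choose to respect it; Googlebot, for instance, ignores the directive.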
And generally, what do you think about this kind of robots.txt file? Everything else in it is done correctly; I have not copied the whole file.