
To reduce server load, could follow Crawl-delay directives in robots.txt #5

Open
GoogleCodeExporter opened this issue Jun 30, 2015 · 1 comment

Comments

@GoogleCodeExporter

It would be great if the bot could follow the Crawl-delay extension to the
robots.txt protocol to avoid overloading a server.

Original issue reported on code.google.com by [email protected] on 16 Apr 2010 at 1:44
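For reference, a minimal sketch of what honoring Crawl-delay could look like. It uses Python's standard urllib.robotparser purely for illustration; the project's actual language and fetch loop are not stated in this issue, and the user-agent string and URLs below are placeholders.

```python
import time
import urllib.robotparser

USER_AGENT = "ExampleBot"  # hypothetical user-agent string

# Fetch and parse the site's robots.txt.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# crawl_delay() returns the Crawl-delay value (in seconds) for this agent,
# or None if the directive is absent; fall back to a default pause.
delay = rp.crawl_delay(USER_AGENT) or 1.0

for url in ["https://example.com/a", "https://example.com/b"]:
    if rp.can_fetch(USER_AGENT, url):
        pass  # the actual fetch(url) call would go here
    time.sleep(delay)  # pause between requests so the server isn't overloaded
```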
