Combination of Crawl-delay and badbot Disallow results in blocking of Googlebot #51
Mojmir opened this issue:

For example, Googlebot gets blocked by the following robots.txt (check it in the Google Search Console robots.txt testing tool):
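(The exact file from the report isn't preserved here; going by the issue title, it paired a wildcard group carrying a Crawl-delay with a badbot group carrying a blanket Disallow, roughly as follows, with an illustrative delay value:)

```
User-agent: *
Crawl-delay: 5

User-agent: badbot
Disallow: /
```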
If you remove the Crawl-delay directive, Googlebot will pass. This works:
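(Presumably the same file with the Crawl-delay line removed:)

```
User-agent: *

User-agent: badbot
Disallow: /
```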
And this too:
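(A plausible reconstruction: the same groups reordered so that the unofficial line comes last, which matches the advice given later in the thread:)

```
User-agent: badbot
Disallow: /

User-agent: *
Crawl-delay: 5
```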
If you would like to use the Crawl-delay directive and not block Googlebot, you must add an Allow directive:
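(Again reconstructed: the wildcard group gets an explicit Allow, which, for Google's matcher, outranks an equally specific Disallow even if a buggy parser merges the groups:)

```
User-agent: *
Crawl-delay: 5
Allow: /

User-agent: badbot
Disallow: /
```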
Both Crawl-delay and Allow are unofficial directives. Crawl-delay is widely supported (except by Googlebot). Allow is supported only by Googlebot and Bingbot (AFAIK). Normally, Googlebot should be allowed by every robots.txt shown above. E.g. if you choose AdsBot-Google in the testing tool, it passes for all of them, while all other Google bots fail in the same way. We first noticed this unexpected behaviour at the end of 2021.

Is this a mistake in Googlebot's parsing of robots.txt, or am I just missing something?
Comments

Gary replied:

Hi Mojmir and thanks for opening this issue. Custom lines such as crawl-delay split groups for the parser. Compare this:
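(The examples in this comment are reconstructed from context; the delay value is illustrative:)

```
User-agent: *
Crawl-delay: 5

User-agent: badbot
Disallow: /
```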
VS
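(the same rules without the Crawl-delay line:)

```
User-agent: *

User-agent: badbot
Disallow: /
```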
This means that in the first file the crawl-delay line closes the wildcard group, so the Disallow applies only to badbot, while in the second file the two User-agent lines form a single group (blank lines do not separate groups) and the Disallow applies to both. Not ignoring custom lines such as crawl-delay when working out group boundaries is how Googlebot itself behaves. Unfortunately the testing tool in Google Search Console, unlike Googlebot, is not using this library, so we haven't gotten to fixing this obscure bug there, too.
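(For anyone who wants to check what this library, and therefore Googlebot, does with such a file, here is a minimal sketch against the matcher API shown in this repository's README; the URL is arbitrary:)

```cpp
#include <iostream>
#include <string>

#include "robots.h"  // google/robotstxt parser and matcher

int main() {
  // The reporter's file: a wildcard group with a Crawl-delay, followed by
  // a badbot group with a blanket Disallow.
  const std::string robots_txt =
      "User-agent: *\n"
      "Crawl-delay: 5\n"
      "\n"
      "User-agent: badbot\n"
      "Disallow: /\n";

  googlebot::RobotsMatcher matcher;
  // Per the explanation above, the Crawl-delay line closes the wildcard
  // group, so the Disallow applies only to badbot and Googlebot is allowed.
  bool allowed = matcher.OneAgentAllowedByRobots(
      robots_txt, "Googlebot", "https://example.com/page");
  std::cout << (allowed ? "allowed" : "disallowed") << std::endl;
  return 0;
}
```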
Mojmir replied:

Hi Gary, thank you for your answer. I didn't know that; the syntax of robots.txt is really a bit tricky. Unofficial rules (e.g. crawl-delay) close the group they appear in. With this in mind, it is better to put unofficial rules at the end of the file, especially if other groups follow them. So Googlebot is blocked by a robots.txt like this:
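(Reconstructed: with no rule between them, the two User-agent lines merge into a single group, so the Disallow covers both:)

```
User-agent: *

User-agent: badbot
Disallow: /
```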
...but not by a robots.txt like this:
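(Reconstructed: the same file plus a Crawl-delay line, which splits the two groups apart:)

```
User-agent: *
Crawl-delay: 5

User-agent: badbot
Disallow: /
```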
...even though both seem to do the same thing.
Gary replied:

You're correct: lines that are not supported by Googlebot but otherwise sit in a group, like crawl-delay, split the group.