Google has sent out similar notifications to other webmasters via the Search Console, reminding them once again that not allowing Googlebot to access those files may result in “suboptimal rankings”.
The full warning reads:
This sounds bad, but fortunately the email includes instructions for diagnosing the issue and for checking whether you have fixed it, using the Fetch and Render tool within Search Console.
An easy fix would be to look through your site’s robots.txt file and remove any of the following lines of code:
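The exact lines differ from site to site, but rules that keep Googlebot away from CSS and JavaScript typically look something like the following (the paths shown here are only illustrative examples, not a definitive list):

    # Pattern rules that block scripts and stylesheets by file extension
    Disallow: /*.js$
    Disallow: /*.css$
    # Blanket blocks on asset or include directories have the same effect
    Disallow: /js/
    Disallow: /css/
    Disallow: /wp-includes/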
If this doesn’t do the trick and Googlebot is still being blocked, the Fetch and Render tool will show you which resources remain blocked so you can make further edits to your robots.txt file. The robots.txt testing tool is also useful for catching any other crawling issues.
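Rather than hunting down individual Disallow rules, another option is to add an explicit Allow group for Googlebot. The rules below are only a sketch; verify them against your own setup with the robots.txt testing tool before publishing:

    # Let Googlebot fetch CSS and JavaScript; a Googlebot-specific group
    # takes precedence over the generic "User-agent: *" rules in this file.
    User-agent: Googlebot
    Allow: /*.css
    Allow: /*.js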
Google has made its message loud and clear, so it’s worth finding out whether your site is blocking Googlebot from any content; dealing with the issue could well give your search performance a boost in the process.