I’m on 3.34.1 (this is my url) and having issues overriding the default robots.txt by adding my own file to the theme. I’m trying to update my robots.txt to this:
Not really. After a while (maybe 30 minutes to an hour) the change finally registered and I could see my updated robots.txt. I wondered if the cache needed to be refreshed or something, although I expected the change to take effect instantaneously. After removing the `/p/` disallow and forcing a new crawl, my errors in Search Console were likewise resolved.
Removing the `/p/` disallow will allow search engines to crawl and index your draft posts at their draft URLs before they are published. That is rarely desirable.
I’d suggest a deeper look into the errors, rather than removing `/p/` from the disallow list.
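For reference, here’s a minimal sketch of a custom robots.txt that loosens rules without exposing drafts. Since a file added to the theme overrides the default (as described above), you can start from something like this; the domain and the `/ghost/` admin disallow are placeholders here, and the exact defaults vary by version, so adjust to your own site:

```
# Minimal custom robots.txt that keeps draft previews blocked.
# Replace example.com with your own domain.
User-agent: *
Sitemap: https://example.com/sitemap.xml
Disallow: /ghost/
Disallow: /p/
```

Keeping the `Disallow: /p/` line means crawlers stay out of draft preview URLs while everything else you publish remains crawlable.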
Depending on how you are hosted, you may need to wait for the cached copy of your robots.txt to expire - it’s a rarely updated but frequently requested file, so caching it is a good thing. If you’re on Pro, you can contact support to confirm that your robots.txt has been updated.
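If you want to see whether a cached copy is still being served, you can inspect the response headers directly; a quick check, with example.com standing in for your own domain:

```
# Fetch only the response headers for robots.txt;
# look at Cache-Control, Age, or any CDN cache-status header.
curl -sI https://example.com/robots.txt
```

What those headers look like depends on your host or CDN, but a non-zero `Age` or a cache “HIT” status suggests you’re still looking at the cached copy, and the `Cache-Control` max-age gives a rough idea of how long to wait.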