Robots.txt Override Doesn't Work

I’m on 3.34.1 (this is my url) and having issues overriding the default robots.txt by adding my own file to the theme. I’m trying to update my robots.txt to this:

```
# All robots allowed
User-agent: *
Disallow:

# Sitemap
Sitemap: https://www.womaninrevolt.com/sitemap.xml
```

But it still shows as this:

```
User-agent: *
Sitemap: https://www.womaninrevolt.com/sitemap.xml
Disallow: /ghost/
Disallow: /p/
```

A bunch of my pages got hit with a “blocked by robots.txt” error in Search Console and I think it’s because of the /p/ disallow.

Any help would be greatly appreciated!

I’m having the same issue (with a different theme). Did you solve this?

Not really. After a while (maybe 30 minutes to an hour) the change finally registered and I could see my updated robots.txt. I wondered if maybe the cache needed to be refreshed, although I expected the change to take place instantaneously. After removing the /p/ disallow and forcing a new crawl, my errors in Search Console were likewise resolved.
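In case it helps anyone else, a quick check along these lines (a rough Python sketch, nothing Ghost-specific; the URL is just a placeholder) shows what the server is actually returning and whether the cache headers suggest an old copy is still being served:

```python
# Rough sketch: fetch the live robots.txt and print cache-related headers
# alongside the body. Swap in your own site's URL.
import urllib.request

url = "https://www.example.com/robots.txt"  # placeholder

with urllib.request.urlopen(url) as resp:
    # Headers like Cache-Control, Age, or X-Cache hint at whether a CDN or
    # proxy is still holding the previous version of the file.
    for name in ("Cache-Control", "Age", "ETag", "Last-Modified", "X-Cache"):
        if resp.headers.get(name):
            print(f"{name}: {resp.headers[name]}")
    print()
    print(resp.read().decode("utf-8"))
```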

Oof, thanks for answering! I’m leaving my robots.txt as this:

```
User-agent: *
Sitemap: https://www.syntherest.com/sitemap.xml
Disallow: /ghost/
```

Did just getting rid of the /p/ stop the errors, or did you change anything else?

Thanks a lot for your time!

Have you fixed this issue?

> I think it’s because of the /p/ disallow.

Removing the `/p/` disallow will allow search engines to crawl and index your draft posts at their draft URLs before they are published. That is rarely desirable.

I’d suggest a deeper look into the errors, rather than removing /p/ from the disallow list.
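If it helps with that digging, a quick sketch like this (plain Python standard library; the URLs are placeholders, so swap in the ones Search Console flagged) will show which of those URLs are actually blocked under your current rules:

```python
# Quick sanity check: load the live robots.txt and test the flagged URLs
# against it, the same way a crawler would.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # placeholder
rp.read()

test_urls = [
    "https://www.example.com/some-published-post/",
    "https://www.example.com/p/abc123/",   # a /p/ draft preview URL
    "https://www.example.com/ghost/",
]

for u in test_urls:
    print("allowed" if rp.can_fetch("*", u) else "BLOCKED", u)
```

If published pages show up as allowed here, the Search Console errors are likely stale or caused by something other than the `/p/` rule.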

Depending on how you are hosted, you may need to wait for the cache to clear on your robots.txt - it’s a rarely updated but highly requested file, so caching it is a good thing. If you’re on Pro, you can contact support to confirm that your robots.txt has been updated.