Robots.txt Override Doesn't Work

I’m on 3.34.1 (this is my url) and having issues overriding the default robots.txt by adding my own file to the theme. I’m trying to update my robots.txt to this:

# All robots allowed
User-agent: *
Disallow:

# Sitemap
Sitemap: https://www.womaninrevolt.com/sitemap.xml
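In case it helps, this is roughly how I set the file up. The theme folder name here is just a placeholder for my actual theme directory; as I understand it, the file has to sit at the theme root, next to package.json:

```shell
# Placeholder theme path -- substitute your real theme folder name.
THEME_DIR="content/themes/my-theme"
mkdir -p "$THEME_DIR"

# Write the override robots.txt at the theme root (not in assets/).
cat > "$THEME_DIR/robots.txt" <<'EOF'
# All robots allowed
User-agent: *
Disallow:

# Sitemap
Sitemap: https://www.womaninrevolt.com/sitemap.xml
EOF

# After uploading the theme, Ghost may need a restart to pick up the file.
```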

But it still shows as this:

User-agent: *
Sitemap: https://www.womaninrevolt.com/sitemap.xml
Disallow: /ghost/
Disallow: /p/

A bunch of my pages got hit with a “blocked by robots.txt” error in Search Console, and I think it’s because of the /p/ disallow.

Any help would be greatly appreciated!