What’s your URL? https://ghost.optezo.com
What version of Ghost are you using? Latest, 3.37.1
How was Ghost installed and configured? Via CLI, Ubuntu 18.04
What Node version, database, OS & browser are you using? Node 10.23.0, local MySQL
What errors or information do you see in the console? None
I’m able to put a custom robots.txt file in the root of my theme and have that file served, but I’m trying to run Ghost headless, using the API from another site. When I make the site “private”, the robots.txt automatically turns into
User-agent: *
Disallow: /
and my custom robots.txt file isn’t used. I need robots to be able to access the /content/images folder, or things such as Twitter cards will not work: these agents honor the robots.txt file and will not fetch the images.
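For reference, a custom robots.txt along these lines would keep the site itself private while still letting card crawlers reach images. This is only a sketch; the bot names and paths are assumptions and would need to match whichever crawlers you care about:

```
# Allow Twitter's crawler to fetch images only (assumed user-agent name)
User-agent: Twitterbot
Allow: /content/images/
Disallow: /

# Everyone else: keep the private site fully blocked
User-agent: *
Disallow: /
```

The problem is that private mode ignores the theme's robots.txt entirely, so there is nowhere to put rules like this.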
How can I add a custom robots.txt file for when the site is in private mode? I can’t find anything in the docs or forums related to this. Thanks!
btw, if you look now at my robots.txt on that site, you’ll see that it’s different, but that’s just because I edited the file at
I know this isn’t a long term solution, as it’ll stop working on the next software upgrade. Hopefully someone has a good solution for this. Thanks!
Hi, is there any way to do this currently in Ghost?
Were you able to find a longer term solution to this?
I was not able to find another solution. This is just part of my upgrade process to manually update this file.
If your setup has Ghost running behind nginx or another reverse proxy, you can configure the proxy to serve /robots.txt as a static file from a custom location instead of letting Ghost serve it.
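For nginx, that could look something like the sketch below, added to the server block that fronts Ghost. The file path and upstream port (2368 is Ghost's default) are assumptions; adjust them to your install:

```nginx
server {
    # ... existing listen / server_name / ssl directives ...

    # Serve robots.txt from disk so the request never reaches Ghost
    location = /robots.txt {
        alias /var/www/custom/robots.txt;
    }

    # Everything else is proxied to Ghost as before
    location / {
        proxy_pass http://127.0.0.1:2368;
        # ... existing proxy_set_header directives ...
    }
}
```

The exact-match `location = /robots.txt` takes priority over the catch-all `location /`, so only that one path is intercepted.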
I have done a similar configuration using Caddy for my staging environment.
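With Caddy (v2 Caddyfile syntax) the equivalent might look like this; the domain and file path are placeholders, not the actual staging config:

```
example.com {
    # Intercept only robots.txt and serve it from disk
    handle /robots.txt {
        root * /var/www/custom
        file_server
    }

    # Everything else goes to Ghost (default port assumed)
    handle {
        reverse_proxy 127.0.0.1:2368
    }
}
```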
Excellent idea, srijan. Why didn’t I think of that? Thanks for replying.
Are there any other workarounds for those of us who don’t have a self-hosted Ghost installation? I’m currently using Ghost(Pro).