I’m using the latest version of Ghost, with Ghost(Pro) as a headless CMS.
When I try to load an image from my Ghost blog, Twitter says it can’t load it because of the robots.txt file. I then tried loading the default Ghost blog itself, and the Twitter Card Validator gave me this response:
ERROR: Fetching the page failed because it’s denied by robots.txt.
I am running into the same issue.
It sounds like you’re using Ghost in “Private Site Mode” - which generates a robots.txt that prevents search engine indexing.
If you’re using Ghost headless, you should generally download images locally to your SSG and serve them from there. If you want to serve images from Ghost rather than from the SSG, then you will need to disable Private Site Mode - and take alternative steps to prevent search engine indexing, such as writing a custom robots.txt, or using a theme with no front-end output.
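As a rough illustration of the custom robots.txt approach: you could block general crawlers while still letting Twitter’s card crawler (Twitterbot) fetch pages and images. Note this only affects crawlers that honour robots.txt:

```
# Block general search engine crawlers
User-agent: *
Disallow: /

# Allow Twitter's card crawler to fetch pages and images
User-agent: Twitterbot
Allow: /
```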
Thanks @John! Can you suggest a tutorial or blog post for downloading the images locally to the SSG?
That would be completely different depending on the SSG - probably a question for the forum of the SSG you’re using.
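The general shape is similar regardless of SSG, though: fetch posts from the Ghost Content API, collect the image URLs, and save them into whatever directory your SSG serves statically. Here’s a rough, hypothetical sketch in Python - the domain, API key, and output directory are placeholders you’d replace with your own:

```python
# Hypothetical sketch: mirror Ghost post images into an SSG's static folder.
# GHOST_URL, CONTENT_KEY, and OUT_DIR are placeholders, not real values.
import json
import os
import re
import urllib.request

GHOST_URL = "https://example.ghost.io"   # your Ghost(Pro) domain (placeholder)
CONTENT_KEY = "YOUR_CONTENT_API_KEY"     # Content API key from Ghost Admin (placeholder)
OUT_DIR = "static/images"                # wherever your SSG serves static files


def image_urls(posts):
    """Collect feature-image URLs and inline <img> srcs from post objects."""
    urls = []
    for post in posts:
        if post.get("feature_image"):
            urls.append(post["feature_image"])
        # Inline images embedded in the rendered post HTML
        urls.extend(re.findall(r'<img[^>]+src="([^"]+)"', post.get("html") or ""))
    return urls


def download_all(urls, out_dir=OUT_DIR):
    """Save each image into out_dir, keeping its original filename."""
    os.makedirs(out_dir, exist_ok=True)
    for url in urls:
        name = url.rstrip("/").rsplit("/", 1)[-1]
        urllib.request.urlretrieve(url, os.path.join(out_dir, name))


def sync_images():
    """Fetch posts from the Ghost Content API and download their images."""
    api = f"{GHOST_URL}/ghost/api/content/posts/?key={CONTENT_KEY}&fields=feature_image,html"
    with urllib.request.urlopen(api) as resp:
        posts = json.load(resp)["posts"]
    download_all(image_urls(posts))
```

You’d call `sync_images()` as a build step before the SSG runs, then point your templates at the locally served copies instead of the Ghost URLs.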