How to extract a list of Page titles & Slugs From a Ghost Site

Anyone have a trick for this? I have 100 posts and will eventually have a lot more. I want to build an external list of my content for other purposes, so I need to pull down the page title and slug for all my content.

What I tried:

  • Export content to JSON (what I need is in the JSON file, but it's already quite large).
  • Online JSON parser/explorer tools (I wasn't able to generate a clean list of the two things I need). One tool had JSON query functionality, but I couldn't get it to pull what I needed.
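For the JSON-export route, a short script can pull the two fields directly instead of an online parser. A minimal sketch, assuming the usual Ghost export layout (`db[0].data.posts`, each entry carrying `title` and `slug`):

```python
import json

def extract_titles_and_slugs(export_path):
    """Read a Ghost JSON export and return parallel lists of titles and slugs."""
    with open(export_path, encoding="utf-8") as f:
        export = json.load(f)
    # Ghost exports nest everything under db[0]["data"]; the "posts" table
    # holds the content entries, each with a "title" and a "slug" field.
    posts = export["db"][0]["data"]["posts"]
    titles = [p["title"] for p in posts]
    slugs = [p["slug"] for p in posts]
    return titles, slugs
```

This is repeatable: re-export the site and re-run the script whenever the list needs refreshing.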

Other ideas:

  • A crawler tool?
  • An easy point-and-click JSON explorer?
  • Some clever macros in Notepad++?
  • Copy/paste into Excel?

Ultimately, what I want are two lists: one of my page titles and a corresponding list of matching slugs for each post/page. The process should be something I can repeat myself in the future. Any suggestions?

I think I did it once before but I can't remember how; maybe it was the RSS feed… but the RSS feed seems to be limited to a certain number of recent posts.

Unless I’m misunderstanding, the API can do all the heavy lifting for you :cowboy_hat_face:

Here's an example (that works): query the Content API's posts endpoint with `?fields=title,slug` so the response contains only those two fields.

I used posts instead of pages in the example since it looks like the Gatsby demo site doesn’t have any pages
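To make the request concrete, here is a sketch of how that URL is assembled; the site URL and key passed in are placeholders, not real credentials, and the Content API key would come from a custom integration in Ghost Admin:

```python
from urllib.parse import urlencode

def build_posts_url(site_url, content_api_key, limit=15):
    """Build a Ghost Content API URL that returns only title and slug."""
    params = urlencode({
        "key": content_api_key,   # Content API key from Ghost Admin -> Integrations
        "fields": "title,slug",   # restrict each post in the response to these fields
        "limit": limit,           # Ghost's default page size is 15 posts
    })
    return f"{site_url}/ghost/api/content/posts/?{params}"
```

Paste the resulting URL into a browser to see the JSON response directly.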


That is exactly what I needed. Thanks for your help again Vikas!


One more thing: that command pulled only 15 posts. Can I add an argument to pull 100, or all of them?

Never mind, I found it in the documentation: `&limit=all`
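Putting the pieces together, a hedged sketch (the site URL and key are placeholders you'd swap for your own) that requests everything in one call with `limit=all` and splits the response into the two lists:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def split_posts(payload):
    """Split a Content API response into parallel title and slug lists."""
    posts = payload["posts"]
    return [p["title"] for p in posts], [p["slug"] for p in posts]

def fetch_all_titles_and_slugs(site_url, content_api_key):
    """Fetch every post's title and slug from the Content API in one request."""
    params = urlencode({
        "key": content_api_key,
        "fields": "title,slug",
        "limit": "all",          # without this, Ghost returns 15 posts per page
    })
    with urlopen(f"{site_url}/ghost/api/content/posts/?{params}") as resp:
        return split_posts(json.load(resp))
```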

For this, we created a Python script utility so that our less tech-savvy team members could work freely. It depends on an export of the blog for now: GitHub - mvihousekeep/ghost-export-utility: An Excel Export Utility to Convert JSON export from Ghost CMS to a neat Excel
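For a lighter-weight version of that workflow, the export can also be written straight to CSV, which opens cleanly in Excel. A sketch assuming the same Ghost export layout as above (the linked utility itself writes a proper Excel file; this is just the minimal equivalent):

```python
import csv
import json

def export_to_csv(export_path, csv_path):
    """Write title/slug pairs from a Ghost JSON export into a CSV file."""
    with open(export_path, encoding="utf-8") as f:
        posts = json.load(f)["db"][0]["data"]["posts"]
    with open(csv_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["title", "slug"])  # header row for the spreadsheet
        for p in posts:
            writer.writerow([p["title"], p["slug"]])
```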