Automatic backup

After reading the documentation, I understood that the proper way (today) to back up a Ghost instance on a VPS is:

  1. Export the JSON file
  2. Copy the content directory

For the first one, there are two ways: the administration interface and ghost-cli (the command line).

As I would like to automate backups, I would like to use the second one. But it’s interactive, so I cannot script this command:

iero@xxx:/opt/ghost$ ghost export ~/ghost_backup.json
+ sudo systemctl is-active ghost_ghost-iero-org
? Enter your Ghost administrator email address xxx@xxx.xx
? Enter your Ghost administrator password [hidden]
Exporting content

Is it possible to pass the user/password on the command line?
Or maybe bypass the authorisation for this simple ‘read’ action?

If it’s not possible, I will dump the database as usual. But it could be a good improvement for a future update.

The Ghost CLI is just a wrapper around the Ghost API which is why it needs your username and password:

See the task source and the export logic in the Ghost CLI repository.


@iero I’m interested in what you found in the docs that indicated an export is a proper backup strategy. Can you provide a link?

An export is not a substitute for a backup. It’s designed to move content (posts, pages, members, etc.) from one instance of Ghost to another, but it will not include certain settings or config for security reasons, and it also won’t include anything where the data can grow to be quite large, such as your historic member and email stats.

For a proper backup you should follow standard practice for backing up files and database content; there shouldn’t be anything Ghost-specific required.


Sorry @Kevin for the confusion, I thought an export/import was a convenient way to do a backup without dumping the database.

But ok, I’ll do that and an rsync of the content directory!

Thanks for your reply!

By the way, this was the post (from Ghost staff) where I found that import/export was the way to go:

Is this the right way to back up the DB?

#!/bin/bash

GHOST_DIR=/opt/ghost
GHOST_CONFIG=config.production.json

GHOST_BACKUP_DIR=~/

# Read the database credentials from the Ghost config
ghost_db=$(jq -r '.database.connection.database' "${GHOST_DIR}/${GHOST_CONFIG}")
ghost_user=$(jq -r '.database.connection.user' "${GHOST_DIR}/${GHOST_CONFIG}")
ghost_pwd=$(jq -r '.database.connection.password' "${GHOST_DIR}/${GHOST_CONFIG}")

# Dump the database in a single transaction and compress it
mysqldump --user="${ghost_user}" --password="${ghost_pwd}" --databases "${ghost_db}" --single-transaction --no-tablespaces | gzip > "${GHOST_BACKUP_DIR}/ghost_db_$(date +%Y_%m_%d_%H%M).sql.gz"

Hi Kevin, I came here with the exact question as @iero … how to (safely) pass credentials to the ghost-cli export command.

I recognize this doesn’t include the [site]/content directory, so it is technically incomplete … but with this and a one-line tar command, it is super easy to create a backup that can be re-imported into another instance of Ghost, which is what I think I want.
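That one-line tar command could look like this (the install path and destination are assumptions):

```shell
# Archive the content directory with a timestamped filename.
# -C changes into /opt/ghost first so the archive contains a clean "content/" root.
tar -czf ~/ghost_content_$(date +%Y_%m_%d_%H%M).tar.gz -C /opt/ghost content
```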

An export is not a backup in the traditional sense. Its intention is to be used for migrating content; it’s not a complete copy of your database, so it won’t function as a restore point the way a backup would. Importing will also modify ids and re-render content, so again, not what you would expect from a backup/restore.

If you’re looking for a backup I suggest you look at mysqldump or similar.