I host many Ghost sites on a Linux (Ubuntu) server. All are working fine.
When it’s time to update, I go to each site (i.e. “cd /var/html/foo”, run “ghost update”, then “cd /var/html/bar”, run “ghost update”, and so on). The Ghost CLI dutifully does its job each time. No issues there.
I am wondering if there’s a more efficient way to update all sites at once… or does it have to be done this way because the new version has to be linked for each site?
If you don’t want to go the Docker route, a bash “for loop” saved in a file called /bin/updated-ghost-blogs will do. Something like this (untested):
#!/bin/bash
GHOST_DIRS="/sites/blog1 /sites/blog2"
for gdir in $GHOST_DIRS; do
  cd "$gdir" || continue
  ghost update
done
Because this is still interactive, if it hits a permission issue with one of the sites, you can see that in the output and deal with it.
A fancier version of this script would download the new tar file once at the beginning, and then symlink into each directory in the expected location so you don’t have to download the new release N times.
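A rough sketch of that fancier version (untested; the version number, paths, and the versions/&lt;version&gt; layout are all assumptions to verify against a real ghost-cli install before relying on this):

```shell
#!/bin/bash
# Untested sketch: keep one extracted copy of a release and symlink it into
# each install's versions/ directory. Everything here is a placeholder --
# in particular, the versions/<version> layout is an assumption about
# ghost-cli that should be verified first.
set -e
VERSION="5.0.0"                         # placeholder version
BASE="$(mktemp -d)"                     # demo area; use real paths in practice
SHARED="$BASE/releases/ghost-$VERSION"  # the single shared copy

mkdir -p "$SHARED"
# (download and extract the release tarball into "$SHARED" once, here)

for gdir in "$BASE/sites/blog1" "$BASE/sites/blog2"; do  # stand-in installs
  mkdir -p "$gdir/versions"
  ln -sfn "$SHARED" "$gdir/versions/$VERSION"   # one shared copy per site
done
```

The -n flag on ln keeps a re-run from creating a nested link inside an existing symlinked directory.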
If all your directories under “/sites” are Ghost blogs, you could also dynamically look them up:
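A minimal sketch of that lookup (untested; it assumes every directory directly under “$ROOT” is a Ghost install, and the demo directories here are placeholders — in practice point ROOT at /sites):

```shell
#!/bin/bash
# Untested sketch: discover the Ghost installs instead of hard-coding them.
# ROOT and the demo blog directories are placeholders; in practice,
# set ROOT to the real parent directory (e.g. /sites).
ROOT="${ROOT:-$(mktemp -d)}"
mkdir -p "$ROOT/blog1" "$ROOT/blog2"      # stand-ins for real installs

for gdir in "$ROOT"/*/; do
  echo "updating: $gdir"
  # (cd "$gdir" && ghost update)          # the real command, commented out
done
```

The trailing slash in the glob ("$ROOT"/*/) matches only directories, so stray files under the parent directory are skipped.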
Thanks for the quick response. I thought about the Bash option. Ideally I’d use a Docker container… but I ran into the issue that Ghost wanted access to systemd, and that was not possible in Docker. That was about a year ago; not sure if that’s still the case, as I have not tried Docker lately.
Other CMSs have the option of keeping the code in one directory and creating symlinks to the other sites. I was considering that option, but I’m not sure Ghost would be happy with it.
If Ghost is run in a Docker container, there’s no need to set up the systemd service, since Docker itself can make sure the service starts at boot and is restarted if it fails.
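For example, something like this (the container name, port, and content path are examples, not a verified setup — the restart policy is what does the systemd-like supervision):

```shell
# Example invocation of the official `ghost` image; names and paths
# are placeholders. --restart unless-stopped makes Docker restart the
# container on failure and on boot, replacing the systemd unit.
docker run -d \
  --name ghost-blog1 \
  --restart unless-stopped \
  -p 2368:2368 \
  -v /srv/ghost/blog1:/var/lib/ghost/content \
  ghost:5
```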
Personally, I prefer systemd for service management, reach for Podman when I need to manage containers, and use Docker only when necessary.
Here’s a version of the script which allows customizing GHOST_DIRS through the environment:
#!/bin/bash
# Mass ghost upgrade…
# Run from directory above all your Ghost installs
# or set "GHOST_DIRS" to space-separated list of custom dirs
: "${GHOST_DIRS:=$(ls -d */)}"
for i in $GHOST_DIRS; do
  (cd "$i" && ghost update --no-restart)
done
Within the for-loop, the parens create a subshell. The effect is that the “cd” only affects the subshell, so when the subshell exits after each directory is visited, the “current directory” reverts to that of the parent process, which never changed. That’s why it’s safe to omit the steps to store and later restore the current $PWD.
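You can see the behavior with a quick experiment:

```shell
cd /tmp
(cd /; pwd)   # the cd happens only inside the subshell; prints /
pwd           # still /tmp -- the parent's directory never changed
```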
Or set GHOST_DIRS in your .bash_profile to make the change permanent.
The script will still be somewhat slow and inefficient, because Ghost will download a new copy of the release each time. A speed-up could be gained by downloading the release once and copying it into place for each install.
I haven’t looked closely at how feasible or easy that is, but a lot of the time spent in the upgrade cycle is in the “download and unpack” stages, which would be exactly the same for each install.