Ghost Memory Consumption

Has anyone ever run into memory consumption issues with Ghost/Node.js?

I’m running 6 instances (only 1 gets any relevant traffic, the rest are very small personal blogs) on a DO droplet (1 vCPU, 2 GB RAM), and it often hits the 85% memory monitoring threshold and emails me about it, only to drop back below it a few minutes later.

The droplet was initially a 1 vCPU, 1 GB RAM one, which I upgraded after installing a few more Ghost instances. I thought this might be some leftover issue from the droplet resize (DO does warn that disk space sometimes doesn’t upgrade properly), but I haven’t gotten around to migrating to a freshly created droplet of this size to rule that out.

Services running on the droplet: node, php, nginx, redis, mysql, fail2ban, postfix, etc. I can’t really identify a culprit: looking at htop, no single process consumes a lot of memory, and they all jump up and down within the top 10 most resource-consuming processes.
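In case it helps narrow it down, here is a rough way to list the biggest memory consumers outside of htop (assuming a standard Linux `ps`/`free`, nothing Ghost-specific):

```bash
# Top ten processes by resident memory (RSS), highest first
ps aux --sort=-%mem | head -n 11

# Overall view; the buff/cache column is reclaimable memory that is not
# actually held by your applications
free -h
```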

For reference, I run another DO droplet of the same size (1 vCPU, 2 GB RAM) with some small WordPress sites on the same tech stack (minus node), and yet its memory consumption is about one third of the droplet with the Ghost sites.

Any other ideas I could explore?

Thanks!

Maybe it’s just some kind of glitch?

@dsecareanu are you able to correlate the memory usage with sites that receive a lot of image uploads?

We’re aware that in certain situations the default memory allocator in Ubuntu can have issues where memory used by the image processor is not freed back to the OS. This can be resolved by using jemalloc as the memory allocator.
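If you want to try that, here is a rough sketch of switching a Ghost-CLI systemd service over to jemalloc on Ubuntu. The unit name `ghost_example-com` and the library path are placeholders; check your own with `systemctl list-units 'ghost_*'` and `dpkg -L libjemalloc2`:

```bash
# Install jemalloc (Ubuntu package name; may differ on other distros)
sudo apt-get install libjemalloc2

# Add a drop-in override for the Ghost service so node starts with
# jemalloc preloaded (this opens an editor):
sudo systemctl edit ghost_example-com
# In the drop-in, add:
#   [Service]
#   Environment="LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libjemalloc.so.2"

# Apply it
sudo systemctl restart ghost_example-com
```

You would need to repeat the override for each instance on the droplet.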

For reference, we see typical memory usage for reasonably sized sites sitting at ~180–200 MB. For very large sites (tens to hundreds of thousands of posts/tags) this can obviously be higher.
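To compare your instances against that baseline, something like the following (standard `ps`; RSS is reported in KB) shows each node process and which Ghost install it belongs to:

```bash
# One node process per Ghost instance; the args column shows the install dir
ps -C node -o pid,rss,args --sort=-rss
```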

This could be the case (though it’s more a huge library of images than lots of ongoing image uploads). The blog with the most traffic also had some issues with the theme freezing due to the sheer volume of images (almost 1 GB of images indexed by the theme’s search engine). That was optimized a while ago, bringing the images down to 300–400 MB, but I haven’t checked lately whether they’ve grown out of control again. Will look into this avenue though, thanks.
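I’ll probably keep an eye on it with something like this (the path is the default Ghost-CLI layout, so it would need adjusting per install):

```bash
# Total size of uploaded images for one install
du -sh /var/www/ghost/content/images

# Twenty largest files, in case a few huge originals snuck back in
find /var/www/ghost/content/images -type f -printf '%s %p\n' | sort -rn | head -n 20
```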