Massively increase the performance of your self-hosted Ghost blog with proper caching in Cloudflare

Yes, I have the Worker running now. I just deactivated Rocket Loader and Auto Minify, and I also purged the cache. So far, everything seems to be fine.
On your site, it had also happened to me on the home page from mobile.
It’s funny because it only happens sometimes, and in different places. You visit at another time and it works.

Yes, for that, I even changed the theme from Casper to Source, and I modified the theme a bit, which was a headache, but it still kept occurring.
I am not an engineer or professional web developer, but if I am not wrong, these features require jQuery to load properly (again, I am not sure), and Rocket Loader was probably loading jQuery asynchronously, which might be the reason. But again, I am not 100% sure.
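If that guess is right, the failure mode might look something like the sketch below. This is a purely hypothetical illustration, not taken from any real theme; initGallery is a made-up function:

<script src="/assets/built/jquery.min.js"></script>
<script>
  // Rocket Loader defers script execution until after the page has
  // rendered. By the time this script runs, DOMContentLoaded may
  // already have fired, so a listener registered this way never
  // executes and the feature silently fails to initialise.
  document.addEventListener('DOMContentLoaded', function () {
    initGallery(); // made-up theme function that depends on jQuery
  });
</script>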

Update:

  1. Google AdSense works perfectly with this method.
  2. But Ezoic breaks, since Ezoic also proxies the site through its own Cloudflare network; once a page is served from cache, requests probably never reach the Ezoic proxy.

I have to disagree again. Using Workers is extremely naive considering the limits, and the fact that Ghost doesn’t need any of that and is fine with a normal config. Also, didn’t you consider that someone might decide to make millions of requests to your website? You are gonna be billed, BIGLY! Every pay-as-you-go cloud service has this problem.

If you are using members features like login/sign-ups etc., and on top of that use “Cache Everything” in a page rule, then you will cache member pages, and that will definitely cause a lot of problems. This solution here is for self-hosted blogs. And I said the Workers free tier, which in my experience will easily handle over 5,000 page views a day. If I were getting over 5,000 page views a day, I would not self-host; I would let Ghost(Pro) manage my site in the first place.

And yes, if on a particular day you get DDoSed, there is an option so that when the Workers free quota is exhausted, requests simply bypass the Worker and the site keeps working normally. So I wouldn’t worry too much; I would just try to mitigate the situation with the WAF etc.

This was particularly for people who have members enabled; others don’t need it. If you check the TTFB of any Ghost(Pro) blog that has members enabled, you will see sub-50 ms latency. That is probably because Fastly supports VCL cache configuration at the edge, so the logic is just like this Workers method: they are probably using VCL at the edge, and we are using Workers at the edge. Full-page caching with page rules doesn’t work on sites using members features and definitely breaks the site. For a very high-volume site, another workaround is to set a short front-end cache TTL, say 10 to 60 seconds, in config.production.json (good practice if you are using Bunny or AWS).
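If I remember the config layout correctly, Ghost exposes that knob as caching.frontend.maxAge, so a 60-second front-end cache would be roughly this fragment of config.production.json:

{
  "caching": {
    "frontend": {
      "maxAge": 60
    }
  }
}

With that set, Ghost should send Cache-Control: public, max-age=60 on front-end responses, so a CDN like Bunny or CloudFront only has to revisit the origin about once a minute per URL.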

@Abhishek_Prakash @Joan
A little modification to the Workers script:

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request, event));
});

async function handleRequest(request, event) {
  const url = new URL(request.url);

  // The Cache API only supports GET, so pass every other method
  // (POST form submissions etc.) straight to the origin.
  if (request.method !== 'GET') {
    return fetch(request);
  }

  // Ghost admin, post previews and the sitemap must always hit the origin.
  if (url.pathname.startsWith('/ghost/') ||
      url.pathname.startsWith('/p/') ||
      url.pathname === '/sitemap.xml') {
    return fetch(request);
  }

  // Ghost sets these cookies for logged-in members and admin sessions.
  const cookies = request.headers.get('Cookie') || '';
  const isLoggedIn = cookies.includes('ghost-members-ssr=') ||
    cookies.includes('ghost-admin-api=');

  // Static assets are identical for everyone, so they stay cacheable
  // even when a members/admin cookie is present.
  const isStaticAsset = /\.(css|js|png|jpg|jpeg|gif|svg|webp)$/i.test(url.pathname);

  if (isLoggedIn && !isStaticAsset) {
    // Member pages are personalised; never serve them from cache.
    return fetch(request);
  }

  // Anonymous pages and static assets: serve from the edge cache,
  // falling back to the origin on a miss.
  const cache = caches.default;
  let response = await cache.match(request);

  if (!response) {
    // 1209600 seconds = 14 days of edge TTL. cacheEverything makes
    // Cloudflare cache HTML too, not just assets, for anonymous visitors.
    response = await fetch(request, {
      cf: { cacheTtl: 1209600, cacheEverything: !isLoggedIn },
    });
    // Store a copy without delaying the response to the visitor.
    event.waitUntil(cache.put(request, response.clone()));
  }

  return response;
}

Now you do not need any other steps, i.e. no cache rule or cache bypass rule is required. Just deploy this Worker script and add the Worker route.
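If you deploy with Wrangler instead of the dashboard, the route can live in wrangler.toml. A minimal sketch, assuming the script above is saved as worker.js and your zone is example.com (both placeholders):

name = "ghost-edge-cache"
main = "worker.js"
compatibility_date = "2024-01-01"

# Run the Worker on every request to the blog's hostname.
routes = [
  { pattern = "example.com/*", zone_name = "example.com" }
]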


Perfect!! Thanks for your work!

@smartgoat
Many pages are shown as EXPIRED and others as HIT.
With your previous method, I had configured Edge TTL = 1 month and Browser TTL = 2 hours in Cache Rules. Now I have disabled the cache rules… maybe this is why?

I checked hds+ and got all HIT.


Another update: now you don’t need Workers anymore, so there are no page restrictions. A site of any size can deploy this easily and for free with Cache Rules alone, since Cloudflare has made cache bypass on cookie available on the free tier; it is no longer limited to Enterprise customers. Here is the full guide:
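For a rough idea of what the guide sets up, the two rules boil down to something like this sketch (pseudo-notation; http.cookie and starts_with() are from Cloudflare's rule-expression language, and the cookie names are Ghost's session cookies from the Worker above):

# Rule 1 – bypass the cache whenever a Ghost session cookie or an
# admin/preview path is involved:
(http.cookie contains "ghost-members-ssr") or
(http.cookie contains "ghost-admin-api") or
starts_with(http.request.uri.path, "/ghost") or
starts_with(http.request.uri.path, "/p/")
# → Cache eligibility: Bypass cache

# Rule 2 – a catch-all that caches everything else:
true
# → Cache eligibility: Eligible for cache, with an Edge TTL of your choice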