How I Automated a 350+ Post Dream Blog with OpenClaw + Ghost

I'm running an SEO experiment with meaninginadream.com, a dream interpretation blog on Ghost. Over the last few weeks I've built a fully automated content pipeline using OpenClaw, an AI agent framework that connects to Ghost's Admin API, image generation, search APIs, and social media. The whole agent setup took only about an hour!

Thought I’d share the setup since it might give other Ghost publishers ideas.

What It Does

I have 4 cron jobs running 24/7:

  1. Content Improver (every hour)

• Picks the oldest-updated post on the site
• Fetches real “People Also Ask” data from Google via Serper API
• Generates 2 custom images using Ideogram AI (converted to WebP for performance)
• Rewrites the post with proper SEO structure: answer capsule, FAQ from real search data, internal links, comparison tables
• Pushes the update via Ghost Admin API
• Submits the URL to IndexNow (Bing/Yandex) and Google Indexing API

This means every post on the site gets touched and improved over time. Older thin content gets upgraded automatically.
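The "pushes the update via Ghost Admin API" step hinges on Ghost's token auth: Admin API keys come as `id:secret`, and every request needs a short-lived HS256 JWT with the key id in the `kid` header, a 5-minute lifetime, and an `/admin/` audience. A minimal stdlib sketch (no Ghost SDK assumed):

```python
import base64
import hashlib
import hmac
import json
import time

def ghost_admin_jwt(admin_api_key: str, now=None) -> str:
    """Build a short-lived JWT for Ghost's Admin API.

    The Admin API key has the form "<id>:<hex secret>"; Ghost expects
    an HS256 token signed with the decoded secret.
    """
    key_id, secret_hex = admin_api_key.split(":")
    now = int(time.time()) if now is None else now

    def b64url(data: bytes) -> str:
        # JWT uses unpadded base64url encoding
        return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

    header = b64url(json.dumps(
        {"alg": "HS256", "typ": "JWT", "kid": key_id}).encode())
    payload = b64url(json.dumps(
        {"iat": now, "exp": now + 300, "aud": "/admin/"}).encode())
    signing_input = f"{header}.{payload}".encode()
    signature = b64url(hmac.new(bytes.fromhex(secret_hex),
                                signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{signature}"
```

The token then goes on each request as an `Authorization: Ghost <token>` header.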

  2. New Post Creator (every 6 hours)

• Researches a hyper-specific dream topic that doesn’t exist on the site yet (e.g., “Dream About Peeling Wallpaper Off Walls” — not generic “falling dream”)
• Checks Google competition via Serper to find low-competition niches
• Generates 4 custom images per post (cover, psychology, scenario, coping)
• Writes a full 1500-2000 word post with proper E-E-A-T signals
• Publishes directly to Ghost and submits for indexing
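The competition check can be as simple as counting exact-phrase matches in page-one titles. Here's a toy heuristic over a Serper-style response (Serper returns an `organic` list of results with `title` fields; the threshold here is illustrative, not the pipeline's actual rule):

```python
def low_competition(serper_response: dict, phrase: str, max_exact: int = 2) -> bool:
    """Treat a query as 'low competition' when few page-one result
    titles contain the target phrase verbatim."""
    titles = [r.get("title", "").lower()
              for r in serper_response.get("organic", [])]
    exact = sum(phrase.lower() in t for t in titles)
    return exact <= max_exact
```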

  3. Pinterest Autopinner (every 3 hours)

• Detects new/unpinned posts
• Generates SEO-optimized pin descriptions with relevant hashtags
• Uploads feature images and creates pins on the correct board based on post tags
• Uses Postiz (https://postiz.com/) CLI for the Pinterest API
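Routing pins to the right board from post tags is just a lookup over the tags Ghost returns with each post. A sketch (the tag slugs and board names here are made up, not my actual configuration):

```python
# Illustrative mapping from Ghost tag slug to Pinterest board name
BOARD_BY_TAG = {
    "nightmares": "nightmare-meanings",
    "animals": "animal-dreams",
}

def pick_board(post: dict, default: str = "dream-meanings") -> str:
    """Choose a Pinterest board from the post's first matching tag,
    falling back to a catch-all board."""
    for tag in post.get("tags", []):
        slug = tag.get("slug", "")
        if slug in BOARD_BY_TAG:
            return BOARD_BY_TAG[slug]
    return default
```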

  4. X/Twitter Autoposter (every 3 hours)

• Same detection logic as Pinterest
• Writes engaging tweet-length hooks for each post
• Posts with the feature image attached
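Keeping the hook inside the 280-character limit has one quirk worth handling: X wraps every URL to a fixed-length t.co link (23 characters), so only the hook text needs trimming. A minimal sketch:

```python
TCO_LENGTH = 23  # X wraps every URL to a fixed-length t.co link

def tweet_text(hook: str, url: str, limit: int = 280) -> str:
    """Trim the hook so hook + space + wrapped URL fits the limit."""
    budget = limit - TCO_LENGTH - 1  # minus the separating space
    if len(hook) > budget:
        hook = hook[: budget - 1].rstrip() + "…"
    return f"{hook} {url}"
```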

The Stack

• Ghost — CMS and hosting
• OpenClaw — AI agent orchestration (cron jobs, tool access, Telegram notifications)
• Grok 4 — LLM doing the actual writing/editing
• Ideogram v3 — Image generation (all images converted to WebP, ~60-120KB each)
• Serper API — Google search data for PAA questions and competition checks
• Google Indexing API + IndexNow — Fast indexing after every publish/update
• Postiz — Social media distribution (Pinterest + X)

Content Quality Approach

Every post follows a strict structure:

• Quick answer capsule at the top (great for featured snippets)
• H2s written as actual user questions
• Tables for scenario comparisons
• 7-10 FAQ items sourced from real Google PAA data (never invented)
• Internal links to related posts
• Named author with first-person voice for E-E-A-T
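The FAQ section pairs naturally with FAQPage structured data, which Ghost lets you inject per post. A sketch that turns PAA question/answer pairs into JSON-LD (this is standard schema.org markup, not necessarily what my pipeline emits verbatim):

```python
import json

def faq_jsonld(pairs) -> str:
    """Render (question, answer) pairs as FAQPage structured data."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {"@type": "Question", "name": q,
             "acceptedAnswer": {"@type": "Answer", "text": a}}
            for q, a in pairs
        ],
    }, indent=2)
```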

Results So Far

• Started with ~314 posts, now at 350+, with daily traffic up by about 150 unique visitors
• Every post tagged across 20 categories
• All images optimized (WebP, under 200KB)
• Full social distribution pipeline running hands-off
• Site gets re-indexed within hours of any update

Lessons Learned

  1. Ghost’s Admin API is solid. JWT auth, HTML source mode, image uploads — everything works reliably.
  2. Image optimization matters. Ideogram generates ~1.2MB JPGs. Converting to WebP (quality 80, max 1280px) cuts that to ~60KB with no visible quality loss.
  3. Real PAA data > invented FAQs. Google knows when FAQ content matches actual search intent vs. filler.
  4. IndexNow + Google Indexing API together get pages indexed fast. Don’t rely on just one.
  5. Keep AI agent prompts short. Long prompts with inline code cause the LLM to leak raw API output. Moving logic into helper scripts keeps the prompts clean.
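On lesson 2: the actual encoding is done by an image library, but the "max 1280px" resize arithmetic is just a proportional scale of the longer side:

```python
def fit_within(width: int, height: int, max_side: int = 1280):
    """Scale dimensions down so the longer side is at most max_side,
    preserving aspect ratio (never upscaling)."""
    longest = max(width, height)
    if longest <= max_side:
        return width, height
    scale = max_side / longest
    return round(width * scale), round(height * scale)
```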
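On lesson 4: an IndexNow submission is a single JSON POST to the IndexNow endpoint. The body fields below come from the IndexNow protocol (the key file must be served at the `keyLocation` URL so the engines can verify ownership); the host and key values are placeholders:

```python
import json

def indexnow_payload(host: str, key: str, urls: list) -> str:
    """Build the JSON body for an IndexNow batch submission."""
    return json.dumps({
        "host": host,
        "key": key,
        "keyLocation": f"https://{host}/{key}.txt",
        "urlList": urls,
    })
```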

I understand AI can be a helpful tool. But you basically created an automated AI slop pipeline.
It has no soul.


Ruining the internet and the world to make a small profit is one thing. Bragging about doing it is next-level pathetic.


I'll take some of your insights about using AI, but I definitely won't take the idea of killing human interaction on the Internet.