SEO: Why aren't tag and author pages 'noindex'? And how do I make them so?

Hi,

Ghost is good for SEO, but our SEO software keeps flagging the author and tag pages as low quality due to duplicated content. I understand it's better to noindex these pages, but why isn't Ghost doing this automatically?

And how do I do this?

There are two ways you can mark content as noindex:

  1. You can upload a custom robots.txt file with the appropriate disallow rules (a sample is sketched after this list)

  2. Customize your tag and author templates to set a <meta name="robots" content="noindex,nofollow"> tag using the {{#contentFor}} helper (see the template snippet below)
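
For option 1, here's a minimal sketch of a custom robots.txt. It assumes Ghost's default /tag/ and /author/ URL structure; adjust the paths if you've customized your routes:

```
# Sketch of a custom robots.txt, assuming Ghost's default URL structure.
# Disallow stops compliant crawlers from fetching these archive pages.
User-agent: *
Disallow: /tag/
Disallow: /author/
```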
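
For option 2, here's a sketch of what tag.hbs (and, likewise, author.hbs) could contain. It assumes your theme's default.hbs renders a {{{block "head"}}} placeholder inside <head>, and the archive markup at the bottom is just illustrative:

```handlebars
{{!< default}}

{{!-- Inject the meta tag into the theme's "head" block; this only works
      if default.hbs contains {{{block "head"}}} inside <head>. --}}
{{#contentFor "head"}}
    <meta name="robots" content="noindex,nofollow">
{{/contentFor}}

{{!-- The rest of the tag archive renders as usual. --}}
{{#tag}}
    <h1>{{name}}</h1>
    <p>{{description}}</p>
{{/tag}}
{{#foreach posts}}
    <article><a href="{{url}}">{{title}}</a></article>
{{/foreach}}
```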

As for why this isn’t done automatically, I think it’s likely because while these pages are showing up for you as low quality in terms of SEO, that won’t be the case for everyone. Since the author template is totally customizable, it’s possible that some users might have extremely rich content on these pages.

There is no penalty for duplicate content.


Okay, first of all: robots.txt is not for marking pages as noindex.

Second, there is a penalty for duplicate content, whether you want to believe it or not. And even if there isn't a direct penalty, there are two indirect ones: keyword cannibalization and wasted crawl budget.

If Ghost doesn't have a way to properly set pages to noindex in bulk, then it needs to stop saying "it's good for SEO", because this is basic stuff that it doesn't seem to accommodate.