We have published and draft posts, but it’d be nice if we also had unlisted posts (they have a slug, but don’t show up in RSS/listings).
The Post model already has a visibility field; the only thing needed would be an option to toggle it from the admin, or even a regex to detect #post-slug as an “internal” or “unlisted” post with slug “post-slug”.
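The slug-marker idea could be sketched roughly like this (a hypothetical helper, not Ghost internals — the `#` prefix convention, the `parseSlug` name, and the return shape are all assumptions):

```javascript
// Sketch: detect a leading "#" on a raw slug and treat the post as
// unlisted, stripping the marker to get the public slug.
const UNLISTED_RE = /^#(.+)$/;

function parseSlug(rawSlug) {
  const match = rawSlug.match(UNLISTED_RE);
  if (match) {
    // "#post-slug" -> unlisted post with slug "post-slug"
    return { slug: match[1], visibility: 'unlisted' };
  }
  return { slug: rawSlug, visibility: 'public' };
}

console.log(parseSlug('#post-slug')); // { slug: 'post-slug', visibility: 'unlisted' }
console.log(parseSlug('post-slug'));  // { slug: 'post-slug', visibility: 'public' }
```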
Well, yeah, kinda, but I decided to make a different post because I’m talking about the technical part.
As I’m focusing more on SEO these days, it would be nice to have a way to tell Ghost that search engines should ignore this page/post.
Or maybe it’s already possible?
You just put `<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">` in the post head code injection, no?
Oh great! Is there official documentation regarding this? It would be a great solution :)
Another option, especially if this needs to be done for multiple posts, is to use internal tagging in the theme. Basically, put something like this in the head of the theme:
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
That way, all you need to do is add the tag. This works best for people who don’t understand the concepts and just follow the steps of making a post. It’s clean and simple.
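To make the internal-tag approach concrete, the theme-side check might look something like this (a sketch for a Ghost theme’s default.hbs; the internal tag name `#noindex` is an assumption, as is this particular use of the `{{#has}}` helper):

```handlebars
{{!-- Sketch: inside <head> of default.hbs. Assumes an internal tag
      named "#noindex" has been added to posts that should be hidden
      from search engines. --}}
{{#post}}
  {{#has tag="#noindex"}}
    <meta name="robots" content="noindex, nofollow">
  {{/has}}
{{/post}}
```

With this in place, editors only add or remove the internal tag on a post; no per-post code injection is needed.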
While there are workarounds, as mentioned, I believe a blog CMS claiming full SEO functionality should really have this in the admin panel, with a relevant hook or function on the theme side.
This is true; I noticed that even if you put the meta tag in, you can’t really edit the sitemap to reflect the changes.
So without a proper sitemap, the post will be crawled anyway, right?
Every webpage in SEO has to be indexed and crawled.
Indexing means the SEO crawls it and places it in the search engine results page (SERP), whereas crawling means the SEO keeps track of the webpage to be indexed.
The “NOINDEX” option will make SEO ignore your page in search results.
“Disallow” is used not to crawl, but to index in the search results page.
Hence, choose the above-mentioned option to have SEO ignore your page.
Hmm @cstpl123, I don’t think you are using the terminology correctly here; what you’re saying doesn’t really make sense.
@pascalandy typically the post will be crawled (eventually) if Googlebot can find a way to get there AND the URL is not blocked in robots.txt.
The main ways Googlebot can find its way there are through one of these methods:
- sitemap.xml (post URL is present)
- there are internal links, redirects, or canonicals pointing to the post URL
- there are external backlinks, redirects, or canonicals pointing to the post URL
- the URL has been crawl-requested via GSC (Google Search Console)
Separate from the crawl phase is the concept of indexability: whether something shows up at all in the Google “index” of possible search results, or is specifically excluded from the “index” and can’t be returned as a search result.
If the post URL is crawled by Googlebot, it will then be checked to see if it is indexable. In my experience, indexability is a sliding scale: something can be more or less likely to be indexed. Some things that get checked per URL:
- has an HTTP 200 response
- is canonicalised to another URL
- is a soft 404
- has a “noindex” tag in HTML
So to answer your question directly: if everything else is set up “normally”, then yes, the post should be crawled just fine without being in a sitemap. However, there are other considerations here as well.
Thanks for jumping in.
- has a “noindex” tag in HTML
Do you agree with the solution I flagged?
I agree that the theme should not manage this, since we change themes over time.
Quote (email exchange):
This is good when you have your own theme, but it is wrong for themes that are sold.
The theme is just a graphic overlay for Ghost. The function you propose should be built into Ghost as a checkbox, “hide this post” (like “feature this post”).
People like to change themes; when you add a function in a theme and then switch themes, you lose these settings.
Therefore, it is better to use code injection in the post options. It’s just my opinion.