This page is not indexed by Google, and accessing it via the API gets blocked.
Here is the link: https://www.bizappln.com/blog/ghost/api/content/posts/
Kindly check from your side and resolve this issue ASAP!
Hope to hear back from you.
The 403 means the request is disallowed by robots.txt. This is correct behaviour, since your (and every Ghost site's) robots.txt has a Disallow: /ghost/ rule.
I didn’t add Disallow: /ghost/ to our robots.txt file. What do you recommend I do?
You don’t need to do anything. This is correct behaviour: /ghost/ is Ghost’s admin and API area, not public content, so you don’t want robots crawling it.
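For reference, a stock Ghost install generates a robots.txt along these lines. The exact rules vary by Ghost version, and for a site served under a subdirectory like /blog/ the paths would be prefixed accordingly, so treat this as a sketch rather than your exact file:

```
User-agent: *
Sitemap: https://www.bizappln.com/blog/sitemap.xml
Disallow: /blog/ghost/
```

You can see the live version at your site’s /robots.txt.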
So it won’t affect the indexing of the blog posts, right?
No, it won’t. Your robots.txt is pointing search engines at https://www.bizappln.com/blog/sitemap.xml, which lists the posts you do want crawled and indexed.
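If you want to double-check how a crawler interprets those rules, Python's standard-library robotparser can simulate it. The rules below are an assumption mirroring what a Ghost install under /blog/ would serve, not your actual file:

```python
from urllib import robotparser

# Hypothetical rules assumed for a Ghost blog served under /blog/;
# verify against the live file at https://www.bizappln.com/robots.txt
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Sitemap: https://www.bizappln.com/blog/sitemap.xml",
    "Disallow: /blog/ghost/",
])

# The admin/API path is blocked for crawlers...
print(rp.can_fetch("*", "https://www.bizappln.com/blog/ghost/api/content/posts/"))  # → False
# ...but ordinary posts remain crawlable.
print(rp.can_fetch("*", "https://www.bizappln.com/blog/some-post/"))  # → True
```

Only the /ghost/ path is excluded; everything else on the blog stays fetchable, which is exactly the behaviour you want.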