SEMrush Site Audit not crawling my subdomains

The domain I currently do SEO for is blog.pantherprotocol.io.
Here’s the thing:
The Site Audit was always set up for pantherprotocol.io,
and with that it crawled not only pantherprotocol.io but also the blog (blog.pantherprotocol.io) and the forum.
On the most recent Site Audit, it only crawled pantherprotocol.io.
Why? How can I make it crawl everything again?
I’ve seen somewhere that I may be able to add a robots.txt file to my blog’s theme,
but how can I do this?
The crawler user agent is set to GoogleBot-Desktop.

You already have a robots.txt file provided by Ghost, and it doesn’t prevent bots from crawling the site.

User-agent: *
Sitemap: https://blog.pantherprotocol.io/sitemap.xml
Disallow: /ghost/
Disallow: /p/
Disallow: /email/
Disallow: /r/
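
For reference, if you do want to customise those rules, Ghost lets you override its default robots.txt by placing your own robots.txt file in the root folder of your active theme and re-uploading it. Here is a minimal sketch, assuming you keep the same rules as the default; the theme path in the comment is only an example and depends on your install:

# Save this as robots.txt in the root of your theme folder
# (for example content/themes/your-theme/robots.txt), then re-upload the theme.
# Ghost will then serve this file at https://blog.pantherprotocol.io/robots.txt
# in place of its built-in default.

User-agent: *
Sitemap: https://blog.pantherprotocol.io/sitemap.xml
Disallow: /ghost/
Disallow: /p/
Disallow: /email/
Disallow: /r/

None of this blocks Semrush or Googlebot, so the change in crawl behaviour is more likely a Site Audit configuration or crawler-side issue than anything in the blog itself.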

I suggest you reach out to Semrush support.