Wrong robots.txt

I am suddenly having this problem; I only noticed it because Google Search Console shows that my site has not been crawled since October 23rd.

The robots.txt in the template root is:

User-agent: *
Sitemap: https://domain.it/sitemap.xml
Disallow: /ghost/
Disallow: /p/

It has always worked flawlessly.
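For context, Ghost serves a custom robots.txt from the active theme's root in place of its default one, so a first sanity check is that the file really sits in the theme Ghost is running. A minimal sketch (Python 3; the path is a placeholder for a typical install and must be adjusted):

from pathlib import Path

# Placeholder path: adjust to the actual Ghost content directory and active theme
theme_robots = Path("/var/www/ghost/content/themes/my-theme/robots.txt")
if theme_robots.exists():
    print(theme_robots.read_text())
else:
    print("robots.txt is missing from the active theme's root")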

Now if I try to connect to https://domain.it/robots.txt I see:

User-agent: *
Disallow: /

I tried both in incognito mode and in a browser I had never used before.
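To reproduce the check outside a browser, here is a minimal sketch (Python 3, standard library only; domain.it is the placeholder domain from above) that fetches the file and prints the cache-related response headers, which helps tell whether the copy comes from Cloudflare's cache or straight from Ghost:

import urllib.request

# Placeholder URL: domain.it stands in for the real site
req = urllib.request.Request("https://domain.it/robots.txt",
                             headers={"User-Agent": "robots-check"})
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode())  # the robots.txt body actually being served
    for name in ("cf-cache-status", "cache-control", "age"):
        print(name, "=", resp.headers.get(name))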

The site is not set to private. I initially tried a “Purge Everything” on Cloudflare, and now I have disabled it completely.
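For completeness, the purge can also be triggered through Cloudflare's API rather than the dashboard; a rough sketch (Python 3, standard library; ZONE_ID and API_TOKEN are placeholders for a zone ID and a token with cache-purge permission):

import json
import urllib.request

# Placeholders: fill in your zone ID and an API token with cache purge permission
ZONE_ID = "your_zone_id"
API_TOKEN = "your_api_token"

req = urllib.request.Request(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/purge_cache",
    data=json.dumps({"purge_everything": True}).encode(),
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode())  # Cloudflare's confirmation of the purge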

I don’t understand why Ghost is generating a robots.txt with Disallow: / for me.

Ghost version: 5.20.0

After 9 hours, I have double-checked everything: the robots.txt placed in the root of the template is correct, and Cloudflare is set to Development Mode, so its caching is disabled.

The robots.txt served by Ghost is still:

User-agent: *
Disallow: /

I updated to version 5.21.0 and the problem persists.