A robots.txt file is a standard that websites use to tell web crawlers which areas of the site should not be crawled, helping to prevent the site from being overloaded with requests.
All Ghost sites are configured with a default robots.txt file, which looks like this:
User-agent: *
Sitemap: https://site.ghost.io/sitemap.xml
Disallow: /ghost/
Disallow: /p/
This file prevents Ghost Admin and draft post previews from being indexed, and includes a link to your sitemap, which helps search engines find and crawl your site.
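To illustrate how a well-behaved crawler interprets these rules, here's a small sketch using Python's standard urllib.robotparser (the site URL and the /welcome/ post path are just placeholders from the example above):

```python
from urllib import robotparser

# Ghost's default rules, as shown above (sitemap URL is illustrative)
rules = """\
User-agent: *
Sitemap: https://site.ghost.io/sitemap.xml
Disallow: /ghost/
Disallow: /p/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Crawlers honoring the file skip the admin and draft preview paths...
print(parser.can_fetch("*", "https://site.ghost.io/ghost/"))       # False
print(parser.can_fetch("*", "https://site.ghost.io/p/some-draft/")) # False
# ...but remain free to crawl public posts
print(parser.can_fetch("*", "https://site.ghost.io/welcome/"))      # True
```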
Modifying the robots.txt file in Ghost
It's possible to override the default configuration by placing your own robots.txt file at the root level of your theme. To do this on Ghost(Pro), you'll need to be on the Creator plan or higher.
1. Go to the Settings → Design area in Ghost Admin, and click Change theme in the lower left corner.

2. Click the Advanced button in the top right corner, then click the "..." option next to the theme with the Active label and select Download.

3. Once downloaded, extract your theme and create a new file named robots.txt at the root level of the theme. The root level is the same level as the .hbs files contained in your theme.

4. Create your robots.txt file using a free code editor such as Visual Studio Code, and make sure you're using the correct syntax. For in-depth guidelines on writing sensible robots.txt files, refer to Google's robots.txt documentation.

5. Compress your theme files (Windows: right-click the theme folder, hover over Send to, and select Compressed (zipped) folder / macOS: right-click the theme folder and select Compress).

6. Click Upload theme within the Settings → Design area of Ghost Admin and select the compressed file you just created.
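As a sketch for the file created in the steps above, a custom robots.txt might keep Ghost's defaults and add extra rules. The /tag/ path here is a hypothetical addition, and the sitemap URL should be replaced with your own site's:

```txt
User-agent: *
Sitemap: https://example.com/sitemap.xml
Disallow: /ghost/
Disallow: /p/
Disallow: /tag/
```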
That's it! Once your updated theme is uploaded, it will overwrite the default robots.txt file and serve your custom one instead.
Common questions
Why is the robots.txt file not updating?
This is likely due to a caching issue in your browser. Clear your browser cache to fix this issue. If you've cleared your cache and the issue persists, get in touch with us and we'll help get things sorted.
How do I know if my robots.txt file is correctly formatted?
Validate the formatting of your robots.txt file using Google's robots.txt tester. This highlights any syntax errors or rule issues in your file and tells you how to fix them.
Do I need to resubmit my robots.txt file to search engines?
No. Unlike a sitemap, there is no way to submit a robots.txt file to search engines. Changes typically take anywhere from a few hours to a few days to be picked up.
Can I use robots.txt to keep certain pages from being indexed by Google?
Despite popular belief, this is not recommended by Google. The purpose of the file is to prevent your site from being overloaded with requests; it should not be relied on to keep pages out of search results, since a page blocked by robots.txt can still be indexed if other sites link to it.
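If you do need to keep a specific page out of search results, Google's documented mechanism is a noindex robots meta tag in the page's head (in Ghost, this can be added via the page's code injection); a minimal sketch:

```txt
<meta name="robots" content="noindex">
```

Note that for the tag to be seen, the page must remain crawlable, so it should not also be blocked in robots.txt.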