Robots.txt and llms.txt
Configure robots.txt and llms.txt for your Documentation.AI site, control indexing and following, and verify the published files.
Overview
Use robots.txt and llms.txt to control how crawlers and AI models discover and use your published documentation.
Documentation.AI exposes a robots.txt file at /robots.txt on your live docs site and includes robots directives for indexing and following links. You configure these directives from the SEO section in Site Config (in the Editor top toolbar) or directly in documentation.json.
llms.txt lives at /llms.txt on your live docs site and helps AI models understand how to crawl and use your documentation. You toggle generation of this file from the same SEO section in Site Config or in documentation.json.
robots.txt and llms.txt always reflect the last published version of your documentation. After any SEO configuration change, publish your docs to update the files on your live site.
Access and verify robots.txt and llms.txt
Use your live docs site to confirm that robots.txt and llms.txt are served as expected.
Open your live docs site
- Open the URL where your documentation is hosted (for example, your default Documentation.AI domain or custom domain).
- Confirm that the site loads the latest published version of your docs.
Check robots.txt
- Append `/robots.txt` to your docs base URL, such as `https://acme.com/robots.txt`.
- Load the page in your browser and check that a plain-text file returns with an HTTP 200 status.
- Confirm that search engine testing tools (for example, Google Search Console) can also access `robots.txt` at the same path.
Check llms.txt
- Append `/llms.txt` to your docs base URL, such as `https://acme.com/llms.txt`.
- Load the page in your browser and check that a plain-text file returns with HTTP 200 when `llms.txt` is enabled.
- If you disable `llms.txt`, verify that the file is no longer served or that AI tooling you use reflects the change after a republish.
If you use a custom domain or custom subpath, keep the same paths relative to the docs root. For example, if your docs live at https://acme.com/help, access robots.txt at https://acme.com/help/robots.txt and llms.txt at https://acme.com/help/llms.txt.
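The subpath rule above can be sketched in code. The helper below is a hypothetical illustration (not part of Documentation.AI) that builds both well-known file URLs relative to the docs root, whether the docs live at the domain root or under a subpath:

```python
from urllib.parse import urljoin

def well_known_doc_urls(docs_base_url: str) -> dict:
    """Build robots.txt and llms.txt URLs relative to the docs root.

    Handles both root-hosted docs (https://acme.com) and docs served
    under a subpath (https://acme.com/help).
    """
    # urljoin needs a trailing slash so the subpath is treated as a directory,
    # not as a file to be replaced.
    base = docs_base_url if docs_base_url.endswith("/") else docs_base_url + "/"
    return {
        "robots": urljoin(base, "robots.txt"),
        "llms": urljoin(base, "llms.txt"),
    }
```

For example, `well_known_doc_urls("https://acme.com/help")` yields `https://acme.com/help/robots.txt` and `https://acme.com/help/llms.txt`, matching the paths described above.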
Enable or disable robots indexing and following
Control how search engines treat your docs by enabling or disabling indexing and link following. Documentation.AI stores these settings under seo.robots:index and seo.robots:follow in documentation.json and applies them after you publish.
Using the Editor
Use Site Config in the Editor when you want to change robots behavior without editing configuration files.
Open Site Config
In the Editor, click Site Config in the top toolbar, then select SEO.
Adjust robots directives
- Turn indexing on or off to control whether search engines index your docs pages (maps to `seo.robots:index`).
- Turn following on or off to control whether search engines follow links on your docs pages (maps to `seo.robots:follow`).
Publish changes
- Changes appear in the Changes panel alongside content edits.
- Publish your documentation so that the updated robots configuration is included in the build.
- After the publish completes, reload `/robots.txt` on your live site to confirm it reflects the new settings.
Site Config changes to robots directives only take effect after a new publish. If /robots.txt does not look updated, check that the latest publish finished successfully and that you are viewing the correct environment or domain.
Using documentation.json
Define robots behavior directly in documentation.json when you manage your docs in code or through version control.
{
"seo": {
"robots:index": true,
"robots:follow": true
}
}
In this example:
- `seo.robots:index` controls whether search engines index your documentation pages.
- `seo.robots:follow` controls whether search engines follow links on your documentation pages.
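Conceptually, these two booleans combine into the standard robots directive tokens that search engines understand. The sketch below is a hypothetical helper (not Documentation.AI code) showing that mapping:

```python
def robots_directive(index: bool, follow: bool) -> str:
    """Map the two seo.robots settings to standard robots directive tokens.

    True -> "index" / "follow"; False -> "noindex" / "nofollow".
    """
    return ", ".join([
        "index" if index else "noindex",
        "follow" if follow else "nofollow",
    ])
```

With both settings `true` (as in the example above), the resulting directive is `index, follow`; setting both to `false` yields `noindex, nofollow`, which tells search engines to neither index the pages nor follow their links.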
Enable or disable llms.txt generation
Turn llms.txt support on or off for your documentation so that AI crawlers can discover and interpret your docs consistently.
Using the Editor
Use Site Config in the Editor to toggle llms.txt without editing files.
Open Site Config
In the Editor, click Site Config in the top toolbar, then select SEO.
Toggle llms.txt generation
- Enable `llms.txt` to generate and serve the `llms.txt` file at `/llms.txt`.
- Disable `llms.txt` if you do not want Documentation.AI to expose this file to AI crawlers.
Publish and verify
- Changes appear in the Changes panel alongside content edits.
- Publish your documentation so the `llms.txt` change is applied to your live site.
- Open `/llms.txt` on your docs domain to confirm that the file is served (when enabled) or no longer available (when disabled).
Using documentation.json
Control llms.txt generation alongside your other SEO configuration by setting seo.llms-txt:enabled in documentation.json.
{
"seo": {
"llms-txt:enabled": true
}
}
In this example:
- Setting `seo.llms-txt:enabled` to `true` tells Documentation.AI to generate and serve `/llms.txt` for your published docs.
- Setting `seo.llms-txt:enabled` to `false` disables the `llms.txt` file.
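If you manage documentation.json through version control, you may want to toggle this flag programmatically. The sketch below is a hypothetical helper, assuming only the key names shown in the example above; it preserves any other settings already present in the file:

```python
import json

def set_llms_txt(config_text: str, enabled: bool) -> str:
    """Toggle seo.llms-txt:enabled in a documentation.json payload.

    Other keys in the file (for example seo.robots:index) are preserved.
    """
    config = json.loads(config_text)
    # Create the "seo" object if the file does not have one yet.
    config.setdefault("seo", {})["llms-txt:enabled"] = enabled
    return json.dumps(config, indent=2)
```

Remember that, as with the robots directives, the change only reaches your live site after the next publish.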