llms.txt — Making Websites AI-Readable
llms.txt is a community standard for providing AI models with a clean, structured summary of your website or documentation. Think of it like robots.txt for search engine crawlers, but designed for large language models. The specification is maintained at llmstxt.org.
What problem does it solve?
Websites are designed for humans — they have navigation, footers, sidebars, JavaScript, and complex layouts that make it hard for AI models to extract the actual content. llms.txt gives models a single, clean entry point that lists what your site offers and links to the most important pages in markdown format.
How it works
Place an llms.txt file at your website root (e.g., example.com/llms.txt). It starts with a title and description, then lists key sections with links.
Format rules
- Start with an H1 heading (# Site Name)
- Follow with a blockquote description
- Use H2 sections to organize content areas
- List important pages as markdown links with brief descriptions
- Keep it concise — this is an overview, not a complete sitemap
- Optionally provide llms-full.txt with expanded content
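Putting the rules above together, a minimal llms.txt for a hypothetical documentation site might look like this (the site name, URLs, and descriptions are invented for illustration):

```markdown
# Example Docs

> Developer documentation for the Example API, covering authentication, endpoints, and SDKs.

## Docs

- [Quickstart](https://example.com/docs/quickstart.md): Get a first request working in five minutes
- [API Reference](https://example.com/docs/api.md): Full endpoint and parameter reference

## Optional

- [Changelog](https://example.com/changelog.md): Release history and deprecation notices
```

Note the order mirrors the rules: H1 title, blockquote description, then H2 sections containing links with brief descriptions.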
When to use it
Add llms.txt if you want AI models to accurately understand and reference your website content. It’s especially useful for documentation sites, API references, developer tools, and any site where AI assistants might need to point users to the right resources.
Who supports it
llms.txt is a community-driven standard gaining adoption across the web. AI tools like Claude, ChatGPT, and Perplexity can use llms.txt files when provided as context. Documentation platforms like Mintlify, Docusaurus, and ReadMe are adding support for auto-generating llms.txt files.
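From the consumer side, the appeal of the format is that it is plain markdown, so a tool can extract a site's link inventory in a few lines. The sketch below is illustrative only and not part of the specification; the regex, the helper name `parse_llms_txt`, and the sample text are assumptions.

```python
import re

# Minimal sample in the llms.txt format (invented content, not a real site).
SAMPLE = """# Example Docs
> Developer documentation for the Example API.

## Docs
- [Quickstart](https://example.com/docs/quickstart.md): Get started fast
- [API Reference](https://example.com/docs/api.md): Full endpoint reference
"""

# Matches a list item holding a markdown link: - [title](url), with an
# optional ": description" tail as used in llms.txt examples.
LINK = re.compile(r"-\s*\[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<desc>.*))?")


def parse_llms_txt(text):
    """Return the H1 site title and a list of (title, url, description) links."""
    title = None
    links = []
    for line in text.splitlines():
        stripped = line.strip()
        if stripped.startswith("# ") and title is None:
            title = stripped[2:].strip()
        m = LINK.match(stripped)
        if m:
            links.append((m["title"], m["url"], (m["desc"] or "").strip()))
    return title, links


title, links = parse_llms_txt(SAMPLE)
print(title)       # Example Docs
print(len(links))  # 2
```

A real consumer would fetch `https://example.com/llms.txt` over HTTP first; parsing is shown on a string here so the sketch stays self-contained.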
Official Specification
The llms.txt specification and examples are maintained at llmstxt.org.