What is robots.txt
Robots.txt is the long-standing crawler-exclusion convention, formalized as the Robots Exclusion Protocol. It tells automated crawlers which paths they should and should not fetch, but it is advisory rather than enforced: well-behaved bots honor it, while nothing technically stops a non-compliant one.
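A minimal robots.txt illustrating the convention might look like this (the paths and sitemap URL are placeholders, not from any real site):

```
User-agent: *
Disallow: /private/
Allow: /
Sitemap: https://example.com/sitemap.xml
```

The file lives at the site root (`/robots.txt`), and rules are grouped under a `User-agent` line that names which crawlers they apply to.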
What is llms.txt
LLMs.txt is a proposed complement, not an access-control mechanism. It is a Markdown file served at the site root that gives language models a curated summary of what the site covers, along with links to its most important content, so that AI systems have usable context about the site.
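Following the llms.txt proposal's shape (an H1 title, a blockquote summary, then H2 sections of links), a sketch with hypothetical names and URLs could look like:

```markdown
# Example Docs

> Hypothetical documentation site for the Example API.

## Docs

- [Quickstart](https://example.com/docs/quickstart.md): install and make a first request
- [API reference](https://example.com/docs/api.md): endpoints and parameters
```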
Key differences
- Purpose: robots.txt restricts crawling; llms.txt informs models.
- Format: robots.txt uses plain-text directives (`User-agent`, `Disallow`, `Allow`); llms.txt uses standard Markdown.
- Audience: robots.txt addresses all crawlers; llms.txt is aimed specifically at LLM-based tools and AI search engines.
When to use both
In most cases, yes. Serve robots.txt to tell crawlers which paths to stay away from (keeping in mind it is a request, not a security control), and serve llms.txt to hand AI systems a curated summary of the public content you want them to understand.
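Because robots.txt rules only take effect when a crawler evaluates them, Python's standard library can sketch how a compliant bot would apply the `Disallow`/`Allow` directives above (the file body and URLs are hypothetical):

```python
from urllib import robotparser

# Hypothetical robots.txt body, matching the earlier example.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /
"""

# Parse the rules the way a compliant crawler would.
rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A path under /private/ is disallowed; everything else is allowed.
print(rp.can_fetch("*", "https://example.com/private/report"))  # False
print(rp.can_fetch("*", "https://example.com/docs/intro"))      # True
```

In a real crawler you would call `rp.set_url(".../robots.txt")` and `rp.read()` to fetch the live file instead of parsing an inline string.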