LLMs.txt Explained

Your guide to the web’s new LLM-ready content standard

You might’ve seen various dev tools adding LLMs.txt support to their docs recently. This proposed web standard is quickly gaining adoption, but what is it exactly and why does it matter?

While robots.txt and sitemap.xml are designed for search engines, LLMs.txt is optimized for reasoning engines. It provides information about a website to LLMs in a format they can easily understand.

So, how did LLMs.txt go from proposal to industry trend practically overnight?

LLMs.txt Explained (Photo by Jørgen Larsen on Unsplash)

How Mintlify Popularized LLMs.txt

On November 14th, Mintlify added LLMs.txt support to their docs platform. In one move, they made the docs of thousands of dev tools, including Anthropic and Cursor, LLM-friendly.

Anthropic and others quickly posted on X about their LLMs.txt support. More Mintlify-hosted docs joined in, creating a wave of visibility for the proposed standard.

The momentum sparked new community sites and tools. @ifox created directory.llmstxt.cloud to index LLM-friendly technical docs. @screenfluent followed shortly with llmstxt.directory.

Mot, who made dotenvx, built and shared an open-source generator tool for dotenvx’s docs site. Eric Ciarla of Firecrawl created a tool that scrapes your website and creates the file for you.


Who created LLMs.txt and why?

Jeremy Howard, co-founder of Answer.AI, proposed LLMs.txt to solve a specific technical challenge.

AI systems can only process limited context windows, making it difficult for them to understand large documentation sites. Traditional SEO techniques are optimized for search crawlers rather than reasoning engines, and so they can’t solve this limitation.

When AI systems try to process HTML pages directly, they get bogged down with navigation elements, JavaScript, CSS, and other non-essential info that reduces the space available for actual content.

LLMs.txt solves that by giving the AI the exact information it needs in a format it understands.


What exactly is an LLMs.txt file?

LLMs.txt is a markdown file with a specific structure. The specification defines two distinct files:

- /llms.txt: A streamlined view of your documentation navigation to help AI systems quickly understand your site’s structure
- /llms-full.txt: A comprehensive file containing all your documentation in one place

/llms.txt

The file must start with an H1 project name, followed by a blockquote summary. Subsequent sections use H2 headers to organize documentation links. An “Optional” section specifically marks less critical resources.

# Project Name
> Brief project summary

Additional context and important notes

## Core Documentation
- [Quick Start](url): Description of the resource
- [API Reference](url): API documentation details

## Optional
- [Additional Resources](url): Supplementary information

For a simple example, see llmstxt.org’s own llms.txt. For an in-depth, multi-language example, see Anthropic’s.
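
Because the format is plain markdown with a predictable shape, it is also easy to work with programmatically. Below is a minimal Python sketch, not part of the spec, that pulls out the project name, summary, and per-section links from an llms.txt file; the Anthropic URL at the bottom is just an example of a site that publishes one.

```
import re
from urllib.request import urlopen

def parse_llms_txt(text: str) -> dict:
    """Parse the basic llms.txt structure: an H1 title, an optional
    blockquote summary, and H2 sections containing markdown links."""
    title = re.search(r"^# (.+)$", text, re.MULTILINE)
    summary = re.search(r"^> (.+)$", text, re.MULTILINE)

    sections = {}
    current = None
    for line in text.splitlines():
        if line.startswith("## "):
            current = line[3:].strip()
            sections[current] = []
        elif current:
            # Links look like: - [Name](url): optional description
            m = re.match(r"^[-*] \[(.+?)\]\((.+?)\)(?::\s*(.*))?$", line.strip())
            if m:
                sections[current].append(
                    {"name": m.group(1), "url": m.group(2), "desc": m.group(3) or ""}
                )

    return {
        "title": title.group(1) if title else None,
        "summary": summary.group(1) if summary else None,
        "sections": sections,
    }

# Example usage (URL assumed for illustration; any docs site with the file works):
text = urlopen("https://docs.anthropic.com/llms.txt").read().decode("utf-8")
parsed = parse_llms_txt(text)
print(parsed["title"], "-", len(parsed["sections"]), "sections")
```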

/llms-full.txt

While /llms.txt provides navigation and structure, /llms-full.txt contains the complete documentation content in markdown.

# AI Review (Beta)

AI Review is a feature that allows you to review your recent changes in your codebase to catch any potential bugs.

<Frame>
<img src="https://mintlify.s3-us-west-1.amazonaws.com/cursor/images/advanced/review.png" alt="AI Review" />
</Frame>

You can click into individual review items to see the full context in the editor, and chat with the AI to get more information.

### Custom Review Instructions

In order for AI Review to work in your favor, you can provide custom instructions for the AI to focus on. For example,
if you want the AI to focus on performance-related issues, you could put:

```
focus on the performance of my code
```

This way, AI Review will focus on the performance of your code when scanning through your changes.

### Review Options

Currently, you have several options to choose from to review:

* `Review Working State`
  * This will review your uncommitted changes.
* `Review Diff with Main Branch`
  * This will review the diff between your current working state and the main branch.
* `Review Last Commit`
  * This will review the last commit you made.

The above snippet is from Cursor’s /llms-full.txt file. See the full file on Cursor’s docs.

LLMs.txt vs sitemap.xml vs robots.txt

LLMs.txt serves a fundamentally different purpose from existing web standards like sitemap.xml and robots.txt.

/sitemap.xml lists all indexable pages, but doesn’t help with content processing. AI systems would still need to parse complex HTML and handle extra info, cluttering up the context window.

/robots.txt suggests search engine crawler access, but doesn’t assist with content understanding either.

/llms.txt solves AI-related challenges. It helps overcome context window limitations, removes non-essential markup and scripts, and presents content in a structure optimized for AI processing.
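
To see the difference in practice, you can compare how much raw text an AI system would have to ingest in each case. Here is a rough sketch; the URLs are placeholders for any docs site that publishes both a rendered HTML page and an llms.txt file.

```
import urllib.request

def fetch_len(url: str) -> int:
    """Return the size in characters of the response body at url."""
    with urllib.request.urlopen(url) as resp:
        return len(resp.read().decode("utf-8", errors="replace"))

# Placeholder URLs: the same docs served as HTML vs. as a markdown index.
html_page = fetch_len("https://docs.example.com/getting-started")  # full HTML page
llms_txt  = fetch_len("https://docs.example.com/llms.txt")         # markdown index

print(f"HTML page: {html_page:,} chars, llms.txt: {llms_txt:,} chars")
```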

How to use LLMs.txt with AI systems

Unlike search engines that actively crawl the web, current LLMs don’t automatically discover and index LLMs.txt files.

You must manually provide the file content to your AI system. This can be done by pasting the link, copying the file contents directly into your prompt, or using the AI tool’s file upload feature.
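
If you are working programmatically rather than through a chat UI, the same idea takes only a few lines: fetch the file and prepend it to your prompt. A minimal sketch follows; the docs URL and question are placeholders, and the assembled prompt can be pasted into any chat tool or sent through any LLM API.

```
from urllib.request import urlopen

# Placeholder URL: swap in the /llms-full.txt of the docs you care about.
DOCS_URL = "https://docs.example.com/llms-full.txt"

docs = urlopen(DOCS_URL).read().decode("utf-8")
question = "How do I configure custom review instructions?"

# Prepend the full documentation so the model answers from the docs
# rather than from stale training data.
prompt = (
    "You are answering questions using the documentation below.\n\n"
    f"<documentation>\n{docs}\n</documentation>\n\n"
    f"Question: {question}"
)

# Printing keeps the sketch dependency-free; send `prompt` to your LLM of choice.
print(prompt[:500], "...")
```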

ChatGPT

First, go to the docs’ /llms.txt or /llms-full.txt URL. Copy the contents or the URL into your chat, then ask specific questions about what you’d like to accomplish.

A screenshot of using an llms-full.txt file with ChatGPT (Image by author).

Claude

Claude can’t yet browse the web, so copy the contents of the docs’ /llms-full.txt file to your clipboard. Alternatively, save it as a .txt file and upload it. You can then ask any questions you like, confident that Claude has the full, most up-to-date context.

A screenshot of using an llms-full.txt file with Claude (Image by author).

Cursor

Cursor lets you add and index third-party docs and use them as context in your chats. Type @Docs > Add new doc, and a modal will appear where you can paste a link to the /llms-full.txt file. You can then use it as context like any other doc.

To learn more about this feature, see Cursor’s @Docs documentation.

A screenshot of adding an llms-full.txt file to Cursor to use as context (Image by author).

How to generate LLMs.txt files

There are several different tools you can use to create your own:

- Mintlify: Automatically generates both /llms.txt and /llms-full.txt for hosted documentation.
- llmstxt by dotenv: A tool by dotenvx’s creator Mot that generates llms.txt using your site’s sitemap.xml.
- llmstxt by Firecrawl: A different tool by Firecrawl’s founder, Eric Ciarla, that scrapes your website using Firecrawl to generate the llms.txt file.
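
If you want to roll your own, the core idea behind the sitemap-based generators is straightforward: read the sitemap, turn each URL into a markdown link, and write the result in the structure described earlier. Here is a rough sketch along those lines, not the actual implementation of any of the tools above; the sitemap URL is a placeholder, and real tools add proper titles and descriptions.

```
import xml.etree.ElementTree as ET
from urllib.request import urlopen
from urllib.parse import urlparse

# Placeholder: point this at your own site's sitemap.
SITEMAP_URL = "https://docs.example.com/sitemap.xml"

sitemap = urlopen(SITEMAP_URL).read()
root = ET.fromstring(sitemap)

# Sitemaps use the sitemaps.org namespace; <loc> elements hold the page URLs.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", ns)]

site = urlparse(SITEMAP_URL).netloc
lines = [f"# {site}", "> Auto-generated documentation index", "", "## Docs"]
for url in urls:
    # Use the last path segment as a rough human-readable title.
    title = urlparse(url).path.rstrip("/").rsplit("/", 1)[-1] or site
    lines.append(f"- [{title}]({url})")

with open("llms.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print(f"Wrote llms.txt with {len(urls)} links")
```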

What’s next for LLMs.txt?

LLMs.txt represents a shift toward AI-first documentation.

Just as SEO became essential for search visibility, having AI-readable content will become crucial for dev tools and docs.

As more sites adopt this file, we’ll likely see new tools and best practices emerge for making content accessible to both humans and AI assistants.

For now, LLMs.txt offers a practical solution to help AI systems better understand and utilize web content, particularly for technical documentation and APIs.
