llms.txt, explained for WordPress owners.
llms.txt is a proposed standard for helping AI systems find and understand the valuable content on your website. It is to LLM crawlers what sitemap.xml was to search engines two decades ago, but simpler, and arguably more important.
What it is
llms.txt is a plain-text file you serve at the root of your domain — at /llms.txt — that describes your site for large language models. It was proposed by Jeremy Howard (founder of fast.ai) in late 2024. It is a community convention, not an IETF standard. Publishing-side adoption is broad — companies like Anthropic, Cloudflare, Stripe, and Vercel serve one at their root — but the crawlers that actually consume it have been mostly silent about it in public.
There are two related files in practice:
- /llms.txt — a short manifest listing your key pages with descriptions
- /llms-full.txt — the actual content of your site, converted to clean Markdown
How it differs from robots.txt and sitemap.xml
robots.txt tells crawlers what they can and can't visit. sitemap.xml tells them where to find every page. llms.txt goes a step further: it tells LLMs why each page matters and gives them the content in a format that's cheap to read — no HTML parsing, no navigation clutter, no JavaScript required.
That last part is the quiet advantage. When an AI crawler visits a typical WordPress page, it spends tokens parsing through theme markup to find the actual content. With llms-full.txt, you hand it the content directly — and you do it in the format it's cheapest to consume.
What a minimal llms.txt looks like
# Acme Accounting
> Small-business accounting firm based in Stockholm.
> We specialize in Swedish bookkeeping, tax returns, and VAT.
## Services
- [Bookkeeping](/services/bookkeeping/): Monthly bookkeeping from 2,400 SEK/mo
- [Tax Returns](/services/tax/): Personal and corporate tax returns
- [VAT Registration](/services/vat/): EU VAT setup and compliance
## About
- [About Us](/about/): 12-year-old firm, 40+ clients
- [Contact](/contact/): Stockholm office details
The blockquote at the top is the site description. Each H2 groups related pages. Each list item is a page with a one-line description. That's essentially the entire spec.
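The format is simple enough to generate programmatically. Here's a minimal sketch in Python that assembles a manifest like the one above (the function name and page data are illustrative, not part of any spec or library):

```python
# Build a minimal llms.txt manifest: H1 title, blockquote description,
# H2 sections, and one "- [Title](path): summary" line per page.
# All page data below is illustrative.

def build_llms_txt(site_name, description_lines, sections):
    """sections maps an H2 heading to a list of (title, path, summary)."""
    lines = [f"# {site_name}"]
    lines += [f"> {line}" for line in description_lines]
    for heading, pages in sections.items():
        lines.append(f"## {heading}")
        for title, path, summary in pages:
            lines.append(f"- [{title}]({path}): {summary}")
    return "\n".join(lines) + "\n"

manifest = build_llms_txt(
    "Acme Accounting",
    ["Small-business accounting firm based in Stockholm."],
    {"Services": [("Bookkeeping", "/services/bookkeeping/",
                   "Monthly bookkeeping from 2,400 SEK/mo")]},
)
print(manifest)
```

The output is plain Markdown text, which is the whole point: no templating engine, no HTML, just lines.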
What llms-full.txt looks like
Same structure, but with the full text of each page appended. So instead of a link to /services/bookkeeping/, you get the entire bookkeeping page rendered as Markdown — headings, paragraphs, FAQs, all of it. AI crawlers can index the whole file in one request.
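The assembly logic for the full file is the same idea with page bodies appended. A hedged sketch (the page bodies here are placeholders; in practice you'd convert each page's rendered HTML to Markdown first, e.g. with a library such as html2text):

```python
# Concatenate full Markdown page bodies under per-page headings.
# Page content below is a placeholder, not real site content.

def build_llms_full(site_name, pages):
    """pages is a list of (title, markdown_body) tuples."""
    parts = [f"# {site_name}"]
    for title, body in pages:
        parts.append(f"## {title}\n\n{body.strip()}")
    return "\n\n".join(parts) + "\n"

doc = build_llms_full("Acme Accounting", [
    ("Bookkeeping", "We handle monthly bookkeeping.\n\n### FAQ\nQ: ..."),
])
print(doc)
```

One request, one flat file, no navigation chrome: that's what makes it cheap for a crawler to consume.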
Will AI crawlers actually respect it?
Honestly: nobody knows for certain yet. No major AI vendor has published a formal commitment to consuming llms.txt, and the ones that have spoken publicly have been deliberately vague. Some independent reports suggest retrieval systems do parse the files when they find them; others find the effect indistinguishable from having the same content in well-structured HTML.
The honest case for publishing one anyway: the file costs nothing to serve and takes nothing from your site. If it helps with AI citation, good. If it does nothing, you've added a few kilobytes of plain text to your root. It's cheap insurance, not a guaranteed signal.
How to add llms.txt to WordPress
Three options:
- Write it by hand. Create a text file, upload to your site root via FTP. Update whenever you add or remove pages. Probably fine for a 10-page site, painful for anything larger.
- Use a code snippet. Register a rewrite rule and a template_redirect handler that outputs the file from your sitemap. Works if you're comfortable with functions.php.
- Use Dennis GEO. The plugin generates both /llms.txt and /llms-full.txt automatically from your sitemap and per-page descriptions. Re-generates on publish. No code.
Obviously I'd recommend option three, but options one and two are also legitimate.
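Whichever option you pick, the generation logic underneath is the same: read the sitemap, emit one list line per URL. A sketch in Python to show the shape of it (the sitemap XML is a hardcoded sample, titles are naively derived from slugs, and per-page descriptions are left out; a WordPress implementation would pull real titles and excerpts instead):

```python
import xml.etree.ElementTree as ET

# A hardcoded sample sitemap. A real site would fetch its own,
# e.g. WordPress serves one at /wp-sitemap.xml by default.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/about/</loc></url>
  <url><loc>https://example.com/contact/</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_to_llms_lines(sitemap_xml):
    """Emit one llms.txt list line per sitemap URL (no descriptions)."""
    root = ET.fromstring(sitemap_xml)
    lines = []
    for loc in root.findall(".//sm:loc", NS):
        url = loc.text.strip()
        slug = url.rstrip("/").rsplit("/", 1)[-1]
        title = slug.replace("-", " ").title()  # crude title from slug
        lines.append(f"- [{title}]({url})")
    return lines

for line in sitemap_to_llms_lines(SITEMAP):
    print(line)
```

A plugin does the same thing, plus the parts this sketch skips: real page titles, descriptions, grouping by post type, and regenerating on publish.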
Try it on your site
Visit https://yoursite.com/llms.txt and see if the file exists. If you get a 404, nothing is telling AI crawlers how to read your site.