Tutorials·Apr 26, 2026·8 min read

How to Add llms.txt to Your WordPress Site

Why llms.txt matters for AI crawlers and how to implement it safely.

By Devebo

llms.txt is an emerging convention for communicating AI crawler preferences and content boundaries.

It is not a ranking shortcut, and it does not replace technical SEO fundamentals.
It is a governance file: clear machine-readable guidance for LLM-oriented crawlers.

Why teams are adding it

  • AI answer engines increasingly consume web content outside classic search UX.
  • Legal and policy teams want explicit crawler guidance.
  • Technical SEO teams want consistent implementation standards across client fleets.

What to include

A minimal llms.txt usually contains:

  • Site identity
  • Allowed/disallowed paths for AI crawl contexts
  • Policy references
  • Contact route for crawler issues
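A minimal example covering those four elements might look like the sketch below. This is illustrative only: llms.txt is an emerging convention with no single ratified schema, so the directive names here (Allow, Disallow, Policy, Contact) borrow robots.txt-style syntax as an assumption; adapt the layout to whatever convention your tooling and policy teams agree on.

```text
# llms.txt for example.com
# Site identity
Site: Example Co — product documentation and blog

# Allowed paths for AI crawl contexts
Allow: /blog/
Allow: /docs/

# Disallowed paths
Disallow: /account/
Disallow: /checkout/

# Policy reference and contact route for crawler issues
Policy: https://example.com/ai-policy
Contact: crawlers@example.com
```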

WordPress implementation options

Option 1: Managed through SEO Merlin fix flow

Use the dedicated check + fix path so file generation is tracked and rollback-aware.

Option 2: Manual file deployment

Place llms.txt in the web root and maintain it through your deployment pipeline.

This is suitable for teams with strict GitOps controls.
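The manual option can be sketched as a small deploy step. The web root path, filename, and placeholder file content below are assumptions; point WEBROOT at your real document root (for example /var/www/html on a typical LAMP host) and source the file body from your policy docs.

```shell
#!/bin/sh
# Sketch: write llms.txt into the web root as one step of a deploy.
# WEBROOT is an assumption; /tmp/webroot-demo is only a safe default
# for trying the script locally.
WEBROOT="${WEBROOT:-/tmp/webroot-demo}"
mkdir -p "$WEBROOT"

# Placeholder content; in a real pipeline, copy the reviewed file
# from version control instead of inlining it here.
cat > "$WEBROOT/llms.txt" <<'EOF'
# llms.txt — keep in sync with published policy pages
Contact: crawlers@example.com
EOF

# World-readable so the web server can serve it as a static file.
chmod 644 "$WEBROOT/llms.txt"
echo "deployed to $WEBROOT/llms.txt"
```

In a GitOps setup, the heredoc would be replaced by a checked-in file so every change to llms.txt goes through review.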

Validation workflow

  1. Verify file resolves at https://yourdomain.com/llms.txt.
  2. Confirm the response is HTTP 200 with a text/plain content type.
  3. Re-run technical checks after deployment.
  4. Keep policy text synchronized with legal/privacy pages.
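Steps 1 and 2 above can be scripted. The sketch below defines a small header check; the curl invocation in the trailing comment is the live usage, with yourdomain.com as a placeholder.

```shell
#!/bin/sh
# Sketch: validate an llms.txt response's status and content type.
# check_llms_headers reads raw HTTP response headers on stdin and
# succeeds only if the status is 200 and the type is text/plain.
check_llms_headers() {
  headers=$(cat)
  # Status line must contain a 200 code.
  printf '%s\n' "$headers" | head -n 1 | grep -q ' 200' || return 1
  # Content-Type must be text/plain (charset suffix allowed).
  printf '%s\n' "$headers" | grep -iq '^content-type: *text/plain' || return 1
  return 0
}

# Live usage (domain is a placeholder):
#   curl -sI https://yourdomain.com/llms.txt | check_llms_headers && echo OK
```

Running this after each deployment makes step 3's re-check cheap enough to automate in CI.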

Common mistakes

  • Treating llms.txt as a substitute for robots.txt.
  • Shipping policy text that conflicts with published legal terms.
  • Adding over-broad disallow rules without business alignment.

Agency rollout recommendation

Pilot on two or three lower-risk client properties first, confirm there are no crawl anomalies, then roll out via template policy docs.

Try it

Run a free audit on your site.